jiki
MCP server from teilomillet/jiki
claude mcp add --transport stdio teilomillet-jiki python servers/calculator_server.py
How to use
Jiki is a Python-based framework that wires an LLM to external tools by launching tool servers that speak MCP over stdio. It uses an Orchestrator to manage the conversation flow and a JikiClient to communicate with the MCP tool servers. With the calculator server example, Jiki auto-discovers the available tools from the MCP server and then routes calculation tasks to the appropriate tool, letting the LLM request operations like arithmetic while the server handles the actual computation. You can run an interactive chat, process a single query, or use Jiki programmatically to embed tool-enabled reasoning into your applications.
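Jiki itself handles this wiring, but the stdio pattern it relies on can be sketched with no dependencies. The snippet below is a simplified stand-in, not the actual MCP wire format (real MCP servers speak JSON-RPC 2.0 with an initialize handshake and typed tool schemas); it only illustrates the loop of discovering tools and routing a call over stdin/stdout:

```python
import json
import subprocess
import sys
import textwrap

# Toy stdio tool server. This is a simplified sketch of the pattern Jiki
# relies on -- one JSON message per line over stdin/stdout -- not real MCP.
SERVER = textwrap.dedent("""
    import json, sys
    while True:
        line = sys.stdin.readline()
        if not line:
            break
        req = json.loads(line)
        if req["method"] == "list_tools":
            out = {"tools": ["add"]}
        else:  # treat anything else as a call to the "add" tool
            out = {"result": req["params"]["a"] + req["params"]["b"]}
        print(json.dumps(out), flush=True)
""")

def call(proc, method, params=None):
    # Send one JSON request line, read one JSON response line.
    proc.stdin.write(json.dumps({"method": method, "params": params or {}}) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

proc = subprocess.Popen(
    [sys.executable, "-c", SERVER],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
discovered = call(proc, "list_tools")         # discovery step
answer = call(proc, "add", {"a": 2, "b": 3})  # routed tool call
print(discovered, answer)                     # {'tools': ['add']} {'result': 5}
proc.stdin.close()
proc.wait()
```

In Jiki's setup the server process is the calculator script, the discovery step is what `--auto-discover` performs, and the tool call is what the LLM requests mid-conversation.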
How to install
Prerequisites:
- Python 3.8+ (virtual environments recommended)
- Basic familiarity with Python packaging and CLI usage
Installation steps:
- Create and activate a virtual environment (optional but recommended):
  python3 -m venv venv
  source venv/bin/activate     # on Unix/macOS
  .\venv\Scripts\activate      # on Windows
- Install the Jiki package:
pip install jiki
or, if you prefer the faster uv package manager:
uv add jiki
- (Optional) Set up an API key for your chosen LLM provider (Jiki uses LiteLLM internally):
export ANTHROPIC_API_KEY=your_key_here
or
export OPENAI_API_KEY=your_key_here
- Ensure you have your MCP server script available. For the example, the repository path should include servers/calculator_server.py.
- Run and test an example using the calculator MCP server (see the Quick Start in the README for the exact commands):
python -m jiki.cli run --auto-discover --mcp-script-path servers/calculator_server.py
Note: The example uses a Python script as the MCP server. If you later replace the script with another tool server, ensure the mcp_script_path points to the new script and its discovery protocol remains compatible with MCP.
Additional notes
- The MCP server (Python-based in this setup) communicates with Jiki via stdio. Ensure the script at servers/calculator_server.py is executable and reachable from your working directory.
- If tool discovery fails, verify that --mcp-script-path is correct and the MCP server starts without errors (check logs or console output).
- Set the appropriate environment variables for your chosen LLM provider (e.g., ANTHROPIC_API_KEY or OPENAI_API_KEY) before starting Jiki.
- For debugging, try the interactive chat workflow first to confirm that tool discovery and calls are functioning before attempting batch processing.
- If you switch to a different MCP server script, you may need to adjust the mcp_config to point to the new script path and ensure the new server’s tool definitions are compatible with Jiki’s discovery protocol.
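The "server starts without errors" check from the tips above can be automated. `probe_server` is a hypothetical helper, not part of Jiki: it launches a stdio server script the same way Jiki would and reports whether the process survives startup, since a crash on import is the most common cause of failed tool discovery.

```python
import subprocess
import sys
import time

def probe_server(script_path, wait=0.5):
    """Launch a stdio server script and report whether it survives startup.

    Returns (ok, stderr_text). A server that crashes on import or during
    startup exits immediately, so a short wait is enough to catch it.
    """
    proc = subprocess.Popen(
        [sys.executable, script_path],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        stderr=subprocess.PIPE, text=True,
    )
    time.sleep(wait)              # give the server a moment to fail, if it will
    if proc.poll() is not None:   # already exited: startup failed
        return False, proc.stderr.read()
    proc.terminate()              # started cleanly; shut it back down
    proc.wait()
    return True, ""
```

For example, `probe_server("servers/calculator_server.py")` should return `(True, "")` before you point `--mcp-script-path` at it.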
Related MCP Servers
nerve
The Simple Agent Development Kit.
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Context Protocol (MCP).
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
mcp-aoai-web-browsing
A minimal Model Context Protocol 🖥️ server/client 🧑‍💻 with Azure OpenAI and 🌐 web browser control via Playwright.
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients
mcp-community
Easily run, deploy, and connect to MCP servers