mcp-chatbot
A simple CLI chatbot that demonstrates integration with the Model Context Protocol (MCP).
How to use
This MCP chatbot shows how to compose an interactive CLI assistant that treats multiple MCP servers as dynamic tool providers. Tools exposed by the configured servers are discovered at runtime and described to the language model through the system prompt, so the LLM can decide when to call a tool and which one to use. The Python client handles tool discovery, formats tool descriptions for the LLM, and executes tool calls, while the LLM interprets responses and decides whether to request a tool call or return a direct answer. You interact with the chatbot by starting the client and asking questions that the available tools can augment, such as querying a database or running a headless browser task via the Puppeteer server.
To use it, first ensure your environment is configured, then run the client (python main.py). The assistant will automatically detect the tools exposed by the configured servers and incorporate them into its available capabilities. When you ask a question that requires a tool, the LLM will indicate a tool call, the client will execute the tool, and the result will be fed back to the LLM for final presentation to you.
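The flow described above can be sketched roughly as follows. Note that the class names, the tool-call JSON format, and the `complete` method are illustrative assumptions, not the project's actual API:

```python
import json

def build_system_prompt(tools):
    """Embed discovered tool descriptions so the LLM knows what it can call."""
    lines = ["You have access to these tools:"]
    for tool in tools:
        lines.append(f"- {tool['name']}: {tool['description']}")
    lines.append('To use a tool, reply with JSON: {"tool": "<name>", "arguments": {...}}')
    return "\n".join(lines)

def chat_turn(llm, servers, tools, user_message):
    """One turn: ask the LLM, execute a tool call if one is requested."""
    reply = llm.complete(build_system_prompt(tools), user_message)
    try:
        call = json.loads(reply)
    except json.JSONDecodeError:
        return reply                      # plain answer, no tool needed
    if not isinstance(call, dict) or "tool" not in call:
        return reply                      # valid JSON but not a tool call
    for server in servers:
        if call["tool"] in server.tool_names():
            # Execute the tool and feed the result back for final presentation.
            result = server.execute(call["tool"], call.get("arguments", {}))
            return llm.complete(build_system_prompt(tools), f"Tool result: {result}")
    return f"Tool {call['tool']} not found."
```

The key design point is that the LLM never talks to a server directly: it only emits a structured request, and the client mediates discovery and execution.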
How to install
Prerequisites:
- Python 3.10+ (tested with Python 3.10)
- pip (Python package installer)
- git
- Optional: a compatible LLM provider key (e.g., OpenAI-compatible API key)
Installation steps:

- Clone the repository:

    git clone https://github.com/3choff/mcp-chatbot.git
    cd mcp-chatbot

- Create a virtual environment (recommended) and install requirements:

    python -m venv venv
    # Windows
    venv\Scripts\activate
    # macOS/Linux
    source venv/bin/activate
    pip install -r requirements.txt

- Set up environment variables: create a .env file in the project root with your API key(s). Example:

    LLM_API_KEY=your_api_key_here

- Configure MCP servers: edit or create servers_config.json to define the MCP servers you want to use. Example:

    {
      "mcpServers": {
        "sqlite": {
          "command": "uvx",
          "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
        },
        "puppeteer": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
        }
      }
    }

  You may add environment variables under each server using the env field as needed.

- Run the client:

    python main.py
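Under the hood, the client needs to turn servers_config.json into launchable subprocess commands. A minimal sketch of that parsing step (the function name and return shape are assumptions, not the project's actual code) might look like:

```python
import json
import shutil

def load_server_commands(config_text):
    """Parse servers_config.json text into {name: (command, args, env)}."""
    config = json.loads(config_text)
    commands = {}
    for name, spec in config.get("mcpServers", {}).items():
        command = spec["command"]
        # The launcher executable (e.g. uvx, npx) must be on PATH.
        if shutil.which(command) is None:
            raise FileNotFoundError(f"{command!r} not found for server {name!r}")
        commands[name] = (command, spec.get("args", []), spec.get("env", {}))
    return commands
```

Each resulting tuple can then be handed to a subprocess launcher that speaks MCP over the child's stdin/stdout.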
Additional notes
Tips and considerations:
- The system prompt auto-includes tool descriptions for LLM awareness; ensure your MCP servers expose recognizable tools for best results.
- If a tool call fails, check that the corresponding MCP server is running and reachable, and inspect any environment variables or command-line arguments required by that server.
- You can customize environment variables per server in servers_config.json by adding an env object, e.g., "env": { "API_KEY": "your_api_key_here" }.
- The project is Python-based and launches MCP servers as subprocesses, communicating with them over stdio; if you switch to a different server implementation, ensure it adheres to MCP conventions so that tool discovery and formatting continue to work correctly.
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data.
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP