fastchat
fastchat-mcp is a simple way to interact with MCP servers through custom natural-language chats.
claude mcp add --transport stdio rb58853-fastchat-mcp python -m fastchat_mcp \
  --env OPENAI_API_KEY="OpenAI API key" \
  --env CRIPTOGRAFY_KEY="Cryptographic key for token data storage"
How to use
Fastchat MCP provides a Python-based client that connects to MCP servers over multiple transport protocols (primarily stdio and HTTP Stream). It is designed to work with integrated language models (notably OpenAI-compatible models) and offers a modular interface for configuring and communicating with MCP servers. The project exposes a command-line client via the mcp[cli] integration, letting you interact with configured MCP servers, send prompts, and stream responses through standard protocols. Through its configuration you can route conversations to a chosen server, manage model selection, and handle streaming or standard I/O interactions as supported by the backend server.
How to install
Prerequisites:
- Python 3.11 or newer
- Access to install Python packages (pip)
- OpenAI API key (if using OpenAI models)
Installation steps:
- Create and activate a Python environment (optional but recommended):
python -m venv venv
source venv/bin/activate # on Unix/macOS
venv\Scripts\activate # on Windows
- Install the MCP client package:
pip install fastchat-mcp
- Prepare required environment variables (example):
export OPENAI_API_KEY=your-openai-key
export CRIPTOGRAFY_KEY=your-cryptography-key
- Run the MCP server client (as configured in mcp_config):
python -m fastchat_mcp
(Adjust as needed if you use a different entry point or environment setup.)
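Before launching the client, it can help to verify the prerequisites above from Python. Below is a small, stdlib-only check; the function name is ours for illustration and is not part of fastchat-mcp:

```python
import os
import sys

def check_prerequisites() -> list[str]:
    """Return a list of problems that would prevent fastchat-mcp from starting."""
    problems = []
    if sys.version_info < (3, 11):
        problems.append("Python 3.11 or newer is required")
    # These are the environment variables the installation steps export.
    for var in ("OPENAI_API_KEY", "CRIPTOGRAFY_KEY"):
        if not os.environ.get(var):
            problems.append(f"environment variable {var} is not set")
    return problems

if __name__ == "__main__":
    for problem in check_prerequisites():
        print(f"warning: {problem}")
```

Running this in the same shell session you will launch the client from also confirms that the variables were exported into the right process.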
Additional notes
- Ensure Python 3.11+ is used to meet the dependencies of fastchat-mcp.
- Set OPENAI_API_KEY in your environment if you plan to access OpenAI models; OpenAI is the default provider.
- CRIPTOGRAFY_KEY is used for token data storage and related cryptographic operations; keep it secure and do not expose it in logs.
- The MCP configuration file fastchat.config.json should be placed at the repository root and define your available MCP servers (protocols: httpstream or stdio).
- The project currently emphasizes stdio and HTTPStream protocols; SSE is not supported.
- If you encounter issues, verify that the environment variables are loaded in the same process that runs the MCP client and that network access to OpenAI endpoints is allowed.
- For debugging, run with verbose logging if available in the CLI to inspect MCP server connection details and protocol handshakes.
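The fastchat.config.json file mentioned above might look like the following. The exact schema is defined by fastchat-mcp; the server names and field layout here follow a common MCP client convention (a command/args pair for stdio servers, a URL for httpstream servers) and should be treated as an illustrative assumption, not the authoritative format:

```json
{
  "mcpServers": {
    "local-tools": {
      "protocol": "stdio",
      "command": "python",
      "args": ["-m", "my_mcp_server"]
    },
    "remote-api": {
      "protocol": "httpstream",
      "url": "https://example.com/mcp"
    }
  }
}
```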
Related MCP Servers
fastmcp
🚀 The fast, Pythonic way to build MCP servers and clients.
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
nerve
The Simple Agent Development Kit.
python-client
MCP server for querying the technical documentation of mainstream agent frameworks (supports the stdio and sse transport protocols); covers langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai
zin-client
MCP client that serves as a bridge between MCP servers and local LLMs running on Ollama. Created for the author's own MCP servers, though other MCP servers may work as well
lc2mcp
Convert LangChain tools to FastMCP tools