mcp-agent
A modular Python framework implementing the Model Context Protocol (MCP). It provides a standardized client-server architecture over stdio that connects LLMs to external tools, including real-time weather data fetching and a RAG (Retrieval-Augmented Generation) knowledge system.
claude mcp add --transport stdio haohao-end-mcp-agent python rag_server.py \
  --env MODEL="qwen-plus" \
  --env API_KEY="your_llm_api_key" \
  --env BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1" \
  --env EMBED_MODEL="sentence-transformers/all-MiniLM-L6-v2"
How to use
The MCP Agent Orchestrator implements the Model Context Protocol (MCP) to coordinate multiple specialized MCP servers. In this repository, there are two server implementations: a Weather service and a RAG (Retrieval-Augmented Generation) knowledge server. The agent (client) discovers available tools from these servers, presents them to the LLM via a JSON function schema, and executes tool calls as the model proposes them. To use the system, first ensure the Weather server and the RAG server are running, then run the MCP client scripts (e.g., rag_agent.py or client.py) to establish a session and begin interactive tool usage. The client handles asynchronous lifecycles, converts tool schemas to OpenAI-compatible function calls, and maintains conversation state across turns for multi-step reasoning.
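The schema-conversion step described above can be sketched as a small helper that wraps an MCP tool definition (name, description, JSON Schema for inputs) in the OpenAI-compatible function-calling format. The tool name and schema below are illustrative placeholders, not taken from the repository:

```python
def mcp_tool_to_openai(name: str, description: str, input_schema: dict) -> dict:
    """Wrap an MCP tool definition in the OpenAI function-calling format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            # An MCP tool's inputSchema is already JSON Schema, so it can be
            # passed through as the function's parameters.
            "parameters": input_schema,
        },
    }

# Hypothetical weather tool, for illustration only:
tool = mcp_tool_to_openai(
    "query_weather",
    "Fetch current weather for a city",
    {"type": "object", "properties": {"city": {"type": "string"}}, "required": ["city"]},
)
```

A list of such dicts can then be passed as the `tools` argument of an OpenAI-compatible chat-completion call, letting the model propose tool calls that the client executes against the MCP server.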
How to install
Prerequisites:
- Python 3.10+ installed on your system
- A virtual environment is recommended
- Access to external APIs used by the servers (e.g., WeatherAPI) and your LLM API key
Installation steps:
- Clone the repository:
  git clone <repository-url>
  cd mcp-agent

- Create and activate a virtual environment:
  python -m venv venv
  On Windows: venv\Scripts\activate.bat
  On macOS/Linux: source venv/bin/activate

- Install dependencies:
  pip install mcp langchain langchain-community langchain-openai chromadb httpx python-dotenv openai

- Create and configure environment variables. Create a .env file with contents similar to:
  API_KEY=your_llm_api_key
  BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
  MODEL=qwen-plus
  EMBED_MODEL=sentence-transformers/all-MiniLM-L6-v2

- Run the MCP servers (examples):
  Weather server: python server.py
  RAG knowledge server: python rag_server.py

- Run the MCP client/agent to connect to a server:
  python rag_agent.py --server_script rag_server.py
  or
  python client.py server.py
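Under the hood, MCP's stdio transport exchanges JSON-RPC 2.0 messages between the client and the spawned server process. The helper below is a minimal illustration of that framing, not the SDK's actual API (in practice the mcp package handles this for you):

```python
import json
from itertools import count

_ids = count(1)  # monotonically increasing request ids

def jsonrpc_request(method: str, params: dict) -> str:
    """Serialize a JSON-RPC 2.0 request as a single line of JSON,
    ready to be written to the server process's stdin."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method, "params": params}
    return json.dumps(msg)

# For example, a client's first message after connecting is an initialize
# request (the protocolVersion value here is illustrative):
init = jsonrpc_request("initialize", {"protocolVersion": "2024-11-05"})
```

The server replies with its capabilities, after which the client can issue requests such as tools/list and tools/call over the same pipe.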
Additional notes
Environment and runtime tips:
- Ensure the .env file is loaded when running the servers to provide API keys and model configuration.
- If you encounter connectivity issues between the client and servers, verify that the server scripts are running and listening on expected interfaces.
- The RAG server relies on LangChain, ChromaDB, and embedding models; ensure data directories (e.g., data/rag_db) exist and contain or can create the required indices.
- For OpenAI-compatible function calling, the client converts server tool definitions into JSON schemas; keep the tool manifests up to date in your server implementations.
- If you modify environment variables, restart the affected services to apply changes.
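As a fallback sketch, the .env format shown above can also be loaded with the standard library alone; the repository presumably uses python-dotenv for this, and load_env below is a hypothetical helper, not part of the project:

```python
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env file into os.environ.

    Blank lines and lines starting with '#' are ignored; existing
    environment variables are not overwritten.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```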
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
mcp-aktools
📈 An MCP server that provides data queries and analysis for stocks and cryptocurrencies
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
BinAssistMCP
Binary Ninja plugin to provide MCP functionality.
mcp-docy
A Model Context Protocol server that provides documentation access capabilities. This server enables LLMs to search and retrieve content from documentation websites by scraping them with crawl4ai. Built with FastMCP v2.
the-company
TheMCPCompany: Creating General-purpose Agents with Task-specific Tools