
mcp-agent

A modular Python framework implementing the Model Context Protocol (MCP). It provides a standardized client-server architecture over stdio that integrates LLMs with external tools, real-time weather data fetching, and a RAG (Retrieval-Augmented Generation) system.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio haohao-end-mcp-agent python rag_server.py \
  --env MODEL="qwen-plus" \
  --env API_KEY="your_llm_api_key" \
  --env BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1" \
  --env EMBED_MODEL="sentence-transformers/all-MiniLM-L6-v2"

How to use

The MCP Agent Orchestrator implements the Model Context Protocol (MCP) to coordinate multiple specialized MCP servers. This repository contains two server implementations: a Weather service and a RAG (Retrieval-Augmented Generation) knowledge server. The agent (client) discovers available tools from these servers, presents them to the LLM as JSON function schemas, and executes tool calls as the model proposes them.

To use the system, first ensure the Weather server and the RAG server are running, then run the MCP client scripts (e.g., rag_agent.py or client.py) to establish a session and begin interactive tool usage. The client handles asynchronous lifecycles, converts tool schemas to OpenAI-compatible function calls, and maintains conversation state across turns for multi-step reasoning.
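The schema conversion mentioned above can be sketched as follows. This is an illustrative helper, not code from this repository; it assumes MCP tool listings expose name, description, and inputSchema fields, which map directly onto OpenAI's function-calling format:

```python
# Sketch: convert an MCP-style tool definition into an
# OpenAI-compatible function-calling schema. The helper and the
# example tool are hypothetical, not code from this repository.

def mcp_tool_to_openai(tool: dict) -> dict:
    """Wrap an MCP tool definition as an OpenAI 'function' tool."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is already JSON Schema, so it maps
            # directly onto OpenAI's 'parameters' field.
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Example: a weather tool as a server might advertise it.
weather_tool = {
    "name": "get_weather",
    "description": "Fetch current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

openai_tools = [mcp_tool_to_openai(weather_tool)]
```

The resulting list can be passed as the `tools` parameter of an OpenAI-compatible chat completion request.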

How to install

Prerequisites:

  • Python 3.10+ installed on your system
  • A virtual environment is recommended
  • Access to external APIs used by the servers (e.g., WeatherAPI) and your LLM API key

Installation steps:

  1. Clone the repository

    git clone <repository-url>
    cd mcp-agent

  2. Create and activate a virtual environment

    python -m venv venv

    On Windows

    venv\Scripts\activate.bat

    On macOS/Linux

    source venv/bin/activate

  3. Install dependencies

    pip install mcp langchain langchain-community langchain-openai chromadb httpx python-dotenv openai

  4. Create and configure environment variables. Create a .env file with contents similar to:

    API_KEY=your_llm_api_key
    BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
    MODEL=qwen-plus
    EMBED_MODEL=sentence-transformers/all-MiniLM-L6-v2
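Since both servers read their configuration from the environment, here is a minimal sketch of what loading such a .env file amounts to. The parser is illustrative only; in practice the servers are assumed to use python-dotenv's load_dotenv() for this:

```python
# Sketch: minimal .env loader, approximating what python-dotenv's
# load_dotenv() does for simple KEY=VALUE files. Illustrative only.
import os

def load_env_file(path: str) -> dict:
    """Parse KEY=VALUE lines into os.environ, skipping blanks and comments."""
    values = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"')
    os.environ.update(values)
    return values
```

After calling load_env_file(".env"), settings such as os.environ["MODEL"] are available to the server process.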

  5. Run the MCP servers (examples)

    Weather server

    python server.py

    RAG knowledge server

    python rag_server.py

  6. Run the MCP client/agent to connect to the servers

    python rag_agent.py --server_script rag_server.py

    or

    python client.py server.py
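The agent's multi-turn tool-calling loop described in "How to use" can be sketched as below. call_llm and execute_tool are hypothetical stubs standing in for the LLM request and the MCP tool invocation; none of the names are from this repository:

```python
# Sketch of the agent's tool-calling loop: keep a running message
# list, let the model propose tool calls, execute them, and feed
# results back until the model answers in plain text.
# call_llm and execute_tool are hypothetical stubs, not repo code.

def run_agent(user_input, call_llm, execute_tool, max_turns=5):
    messages = [{"role": "user", "content": user_input}]
    for _ in range(max_turns):
        reply = call_llm(messages)          # OpenAI-style response dict
        messages.append(reply)
        tool_calls = reply.get("tool_calls")
        if not tool_calls:                  # plain answer: we are done
            return reply["content"]
        for call in tool_calls:
            result = execute_tool(call["name"], call["arguments"])
            messages.append({
                "role": "tool",
                "tool_call_id": call.get("id", ""),
                "content": str(result),
            })
    return "Max turns reached"
```

Appending each tool result to the message list is what lets the model do multi-step reasoning: every subsequent call_llm sees the full history of calls and results.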

Additional notes

Environment and runtime tips:

  • Ensure the .env file is loaded when running the servers to provide API keys and model configuration.
  • If you encounter connectivity issues between the client and servers, verify that the server scripts are running and listening on expected interfaces.
  • The RAG server relies on LangChain, ChromaDB, and embedding models; ensure data directories (e.g., data/rag_db) exist and contain or can create the required indices.
  • For OpenAI-compatible function calling, the client converts server tool definitions into JSON schemas; keep the tool manifests up to date in your server implementations.
  • If you modify environment variables, restart the affected services to apply changes.
