python-mcp-server-client
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and SSE transports). Supported libraries: langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.
claude mcp add --transport stdio gobinfan-python-mcp-server-client uv --directory <path-to-your-project> run main.py
How to use
This project implements a Python-based MCP server/client setup that enables large language models to access structured documents and web content through a unified MCP interface. The server exposes a get_docs tool that takes a query and a library name, validates the library, performs a site-specific search, and returns text extracted from the relevant documentation pages of various libraries (e.g., LangChain, llama-index, AutoGen, and the MCP docs). Clients can connect via the stdio transport for local usage or via SSE for remote deployments, and the server is started with uv run main.py. This lets the model reference up-to-date documentation and API surfaces when answering user questions.
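The validation and site-restricted search step described above can be sketched as follows. Note that the docs_urls keys, URLs, and function name here are illustrative assumptions based on the description, not necessarily the repository's exact code.

```python
# Illustrative mapping of library keys to their documentation sites.
# The actual repository may map more libraries or different URLs.
docs_urls = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "autogen": "microsoft.github.io/autogen",
    "mcp": "modelcontextprotocol.io",
}

def build_search_query(query: str, library: str) -> str:
    """Validate the library key, then restrict the web search to its docs site."""
    if library not in docs_urls:
        raise ValueError(f"Library not supported: {library}")
    return f"site:{docs_urls[library]} {query}"
```

For example, `build_search_query("how to create a tool", "langchain")` yields a query scoped to python.langchain.com/docs, while an unknown library key raises a ValueError before any network call is made.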
How to install
Prerequisites:
- Python 3.8+ installed on your system
- uv (Astral's uv CLI) installed or available in PATH
- Basic familiarity with Python packaging and virtual environments
- Install UV (if not already installed):
- macOS/Linux: curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows: powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
- Create and activate a project directory for the MCP server client:
- uv init gobinfan-python-mcp-server-client
- cd gobinfan-python-mcp-server-client
- uv venv
- source .venv/bin/activate # Windows: .venv\Scripts\activate
- Install MCP CLI and dependencies:
- uv add "mcp[cli]" httpx
- Prepare your server script (main.py) in the project root. This repository provides the MCP server/client code as described in the README. Ensure main.py contains the MCP server setup (stdio transport) and exposes the get_docs tool if using the provided example.
Run the server:
- uv run main.py
Note: If you prefer using a boilerplate without UV, you can adapt the server to a standard Python environment by installing required dependencies from the project (for example, pip install httpx beautifulsoup4). The MCP server will listen for commands via the configured transport (stdio by default in the examples).
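The fetch-and-extract step mentioned above can be illustrated with the sketch below. The sample code uses httpx and BeautifulSoup; this version substitutes the standard library's html.parser so it runs with no extra dependencies, and the class and function names are assumptions, not the repository's exact code.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from an HTML page, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # > 0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

def extract_text(html: str) -> str:
    """Return the visible text of a fetched documentation page."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)
```

In a non-UV setup, the response body from a plain HTTP fetch of each search-result page would be passed through extract_text before being returned to the model.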
Additional notes
Tips and common considerations:
- For local development, using the stdio transport is simplest. For cloud or remote deployments, consider SSE transport and a compatible hosting setup.
- You may need a SERPER (Google search) API key if you enable the web-doc tooling; set SERPER_API_KEY in your environment before running the server with doc searching enabled.
- The get_docs tool relies on network access to fetch and parse documentation pages; occasional timeouts may occur due to network or site protections. The sample code handles timeouts gracefully.
- If you modify docs URLs or libraries, update the docs_urls mapping accordingly and ensure the library key exists in the validation step.
- When configuring MCP clients (e.g., Cline or Cursor), provide the correct directory path to your project and ensure the Python script (main.py) can be launched with the specified command sequence.
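The SERPER tip above can be sketched as follows. The endpoint and header come from Serper's public API documentation, but treat this as an illustration rather than the repository's exact code; the request is built but not sent, so no key is consumed.

```python
import json
import os
import urllib.request

def build_serper_request(query: str) -> urllib.request.Request:
    """Build (but do not send) a Serper web-search request.

    Expects SERPER_API_KEY to be set in the environment, matching
    the note above about enabling doc searching.
    """
    api_key = os.environ.get("SERPER_API_KEY", "")
    return urllib.request.Request(
        "https://google.serper.dev/search",
        data=json.dumps({"q": query}).encode("utf-8"),
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )
```

Sending the request (e.g., with urllib.request.urlopen or httpx) returns a JSON body whose organic results supply the page URLs that get_docs then fetches and parses.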
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
multimodal-agents-course
An MCP Multimodal AI Agent with eyes and ears!
AgentChat
AgentChat is an LLM-based agent communication platform with a built-in default Agent and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Call, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, and uses FastAPI to build a high-performance backend service.
mcp-use-ts
mcp-use is the framework for MCP with the best DX - Build AI agents, create MCP servers with UI widgets, and debug with built-in inspector. Includes client SDK, server SDK, React hooks, and powerful dev tools.
mcp-in-action
The new Geektime (极客时间) MCP course is now live! Over 2,000 students have already begun their MCP learning journey together!
openai-agent-dotnet
Sample to create an AI Agent using OpenAI models with any MCP server running on Azure Container Apps