DeepMCPAgent
Model-agnostic plug-n-play LangChain/LangGraph agents powered entirely by MCP tools over HTTP/SSE.
claude mcp add --transport stdio cryxnet-deepmcpagent python -m deepmcpagent \
  --env OPENAI_API_KEY="Your OpenAI API key if using OpenAI models" \
  --env DEEPMCPAGENT_HTTP_TIMEOUT="HTTP timeout in seconds (optional)"
How to use
DeepMCPAgent is a Python framework for building model-agnostic agents that discover and invoke MCP tools over HTTP/SSE. It acts as a runtime for LangChain/LangGraph agents that you can connect to any MCP server. You can start by running a sample MCP server from the repository’s examples, or connect to remote MCP servers to fetch and call tools dynamically without hardcoding tool integrations. The CLI and bring-your-own-model (BYOM) support let you test and deploy agents quickly with any LangChain-compatible model, while DeepMCPAgent handles tool discovery and JSON-Schema-based argument validation.
With DeepMCPAgent, tools are discovered at runtime via MCP endpoints, and tool calls are validated against typed argument schemas. If you have DeepAgents installed, you get an enhanced loop with automatic planning and execution; otherwise the library falls back to a robust LangGraph ReAct agent. The Cross-Agent Communication feature lets agents act as peers, delegating tasks to other agents through MCP as if they were tools, so collaborative reasoning flows need no extra orchestration.
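As a rough sketch, wiring an agent to an HTTP MCP endpoint might look like the following. The entry-point names (HTTPServerSpec, build_deep_agent) follow the project’s README; verify them against your installed version, and note this requires a running MCP server plus model credentials:

```python
import asyncio

# Entry-point names as shown in the project README; check your installed version.
from deepmcpagent import HTTPServerSpec, build_deep_agent

async def main() -> None:
    # Point the agent at one or more MCP servers; tools are discovered at runtime.
    servers = {
        "math": HTTPServerSpec(
            url="http://127.0.0.1:8000/mcp",
            transport="http",
        ),
    }

    # BYOM: pass a LangChain model instance or a provider string.
    graph, _ = await build_deep_agent(
        servers=servers,
        model="openai:gpt-4.1",
        instructions="Use the MCP tools to answer precisely.",
    )

    result = await graph.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 7?"}]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```

The same shape works for several servers at once: add more named HTTPServerSpec entries to the servers mapping and the discovered tools are merged into one agent.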
How to install
Prerequisites:
- Python 3.10 or newer
- pip (comes with Python)
Install the package from PyPI (recommended):
pip install "deepmcpagent[deep]"
Optional extras:
- dev: linting, typing, tests
- docs: MkDocs + Material + mkdocstrings
- examples: dependencies used by bundled examples
Install with extra tooling (e.g., DeepAgents + dev tooling):
pip install "deepmcpagent[deep,dev]"
If you’re using zsh, remember to quote extras:
pip install "deepmcpagent[deep,dev]"
To run a quick local MCP server example (from repository examples):
python examples/servers/math_server.py
This will serve an MCP endpoint (for example at http://127.0.0.1:8000/mcp) that your agent can discover and use.
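For orientation, a minimal math server in the spirit of the bundled example might look like this. This is a sketch assuming the FastMCP library; the actual examples/servers/math_server.py may differ in names and transport options:

```python
# Sketch of a minimal MCP math server (FastMCP assumed; not the bundled file verbatim).
from fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

if __name__ == "__main__":
    # Serve over HTTP so agents can discover the tools at /mcp.
    mcp.run(transport="http", host="127.0.0.1", port=8000)
```

The type hints and docstrings matter: they become the tool’s JSON Schema, which is what DeepMCPAgent uses to validate arguments before invocation.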
Additional notes
Tips and notes:
- DeepMCPAgent discovers tools dynamically from MCP servers over HTTP/SSE; you don’t need to wire tools manually.
- You can connect to remote MCP servers by supplying headers for authentication when starting the agent or via the CLI.
- If you plan to use external APIs (e.g., OpenAI), set corresponding environment variables such as OPENAI_API_KEY before running the agent.
- The BYOM section shows how to pass any LangChain-compatible model to the agent; you can supply a model instance or a provider string like "openai:gpt-4.1" and let the library initialize it from your environment.
- The CLI supports multiple --http blocks to query several MCP endpoints in one session; you can attach headers using header.X=Y pairs.
- For best results, ensure your MCP servers expose a well-typed tool schema (JSON-Schema → Pydantic) so arguments are validated before tool invocation.
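The last tip, validating arguments against a tool’s JSON Schema before invocation, can be illustrated with a small stdlib-only checker. This is purely illustrative; the library itself builds Pydantic models from each tool’s schema rather than using code like this:

```python
# Minimal illustration of JSON-Schema-style argument checking before a tool call.
TYPE_MAP = {"integer": int, "number": (int, float), "string": str, "boolean": bool}

def validate_args(schema: dict, args: dict) -> list[str]:
    """Return a list of validation errors (empty means the args are valid)."""
    errors = []
    props = schema.get("properties", {})
    # Every required property must be present.
    for name in schema.get("required", []):
        if name not in args:
            errors.append(f"missing required argument: {name}")
    # Every supplied argument must be declared and have the declared type.
    for name, value in args.items():
        if name not in props:
            errors.append(f"unexpected argument: {name}")
            continue
        expected = TYPE_MAP.get(props[name].get("type"))
        if expected and not isinstance(value, expected):
            errors.append(
                f"{name}: expected {props[name]['type']}, got {type(value).__name__}"
            )
    return errors

# Example schema, as an MCP server might advertise for an 'add' tool.
add_schema = {
    "type": "object",
    "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
    "required": ["a", "b"],
}

print(validate_args(add_schema, {"a": 3, "b": 5}))  # valid: []
print(validate_args(add_schema, {"a": "three"}))    # missing "b", wrong type for "a"
```

Catching a bad argument this way turns a confusing downstream tool failure into a clear error the agent can react to.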
Related MCP Servers
composio
Composio powers 1000+ toolkits, tool search, context management, authentication, and a sandboxed workbench to help you build AI agents that turn intent into action.
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
aci
ACI.dev is the open source tool-calling platform that hooks up 600+ tools into any agentic IDE or custom AI agent through direct function calling or a unified MCP server. The birthplace of VibeOps.
sre
The SmythOS Runtime Environment (SRE) is an open-source, cloud-native runtime for agentic AI. Secure, modular, and production-ready, it lets developers build, run, and manage intelligent agents across local, cloud, and edge environments.
tools
A set of tools that gives agents powerful capabilities.
samples
Agent samples built using the Strands Agents SDK.