
DeepMCPAgent

Model-agnostic plug-n-play LangChain/LangGraph agents powered entirely by MCP tools over HTTP/SSE.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio cryxnet-deepmcpagent python -m deepmcpagent \
  --env OPENAI_API_KEY="Your OpenAI API key if using OpenAI models" \
  --env DEEPMCPAGENT_HTTP_TIMEOUT="HTTP timeout in seconds (optional)"

How to use

DeepMCPAgent is a Python framework for building model-agnostic agents that discover and invoke MCP-enabled tools over HTTP/SSE. The server acts as a runtime for LangChain/LangGraph agents that you can connect to any MCP server. You can start with a sample MCP server from the repository’s examples, or connect to remote MCP servers to fetch and call tools dynamically, with no hardcoded tool integrations. The CLI and bring-your-own-model (BYOM) support let you test and deploy agents quickly with any LangChain-compatible model, while DeepMCPAgent handles tool validation and JSON-Schema-based parameters.

With DeepMCPAgent, tools are discovered at runtime via MCP endpoints, and tool calls are validated against typed argument schemas. If you have DeepAgents installed, you get an enhanced loop with automatic planning and execution; otherwise the agent falls back to a robust LangGraph ReAct loop. The Cross-Agent Communication feature lets agents act as peers, delegating tasks to other agents through MCP as if they were tools, which enables collaborative reasoning flows without extra orchestration.
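As a minimal sketch of the flow described above: the names `HTTPServerSpec` and `build_deep_agent` follow the repository’s examples, but the URL, model string, and message shape here are assumptions, not a definitive API reference. The run call is left commented out because it needs a live MCP server and model credentials.

```python
import asyncio

async def main() -> None:
    # Assumed API surface per the repository examples; requires
    # `pip install "deepmcpagent[deep]"` and an MCP server at the URL below.
    from deepmcpagent import HTTPServerSpec, build_deep_agent

    # Tools are discovered at runtime from each listed MCP endpoint.
    servers = {
        "math": HTTPServerSpec(url="http://127.0.0.1:8000/mcp", transport="http"),
    }

    # BYOM: pass a provider string (or any LangChain-compatible model instance)
    # and let the library initialize it from your environment.
    graph, _ = await build_deep_agent(
        servers=servers,
        model="openai:gpt-4.1",
        instructions="Use the MCP math tools and answer precisely.",
    )

    result = await graph.ainvoke(
        {"messages": [{"role": "user", "content": "What is (3 + 5) * 7?"}]}
    )
    print(result["messages"][-1].content)

# asyncio.run(main())  # uncomment once a server is running and keys are set
```

If DeepAgents is installed, the same entry point gives you the planning-enabled loop; without it, the ReAct fallback handles the tool-calling cycle.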

How to install

Prerequisites:

  • Python 3.10 or newer
  • pip (comes with Python)

Install the package from PyPI (recommended):

pip install "deepmcpagent[deep]"

Optional extras:

  • dev: linting, typing, tests
  • docs: MkDocs + Material + mkdocstrings
  • examples: dependencies used by bundled examples

Install with extra tooling (e.g., DeepAgents + dev tooling):

pip install "deepmcpagent[deep,dev]"

If you’re using zsh, remember to quote extras:

pip install "deepmcpagent[deep,dev]"

To run a quick local MCP server example (from repository examples):

python examples/servers/math_server.py

This will serve an MCP endpoint (for example at http://127.0.0.1:8000/mcp) that your agent can discover and use.
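For orientation, here is a rough sketch of what such an example server might look like. The tool functions below are plain Python; the wiring lines are commented out because the `fastmcp` dependency and its exact run signature are assumptions, not taken from the repository.

```python
# Sketch of a minimal math MCP server in the spirit of
# examples/servers/math_server.py (tool names and transport are illustrative).

def add(a: float, b: float) -> float:
    """Add two numbers; the typed signature becomes the tool's JSON Schema."""
    return a + b

def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

# Assumed wiring via the `fastmcp` package; uncomment if it is installed:
# from fastmcp import FastMCP
# mcp = FastMCP("math")
# mcp.tool()(add)
# mcp.tool()(multiply)
# mcp.run(transport="http", port=8000)  # would serve http://127.0.0.1:8000/mcp
```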

Additional notes

Tips and notes:

  • DeepMCPAgent discovers tools dynamically from MCP servers over HTTP/SSE; you don’t need to wire tools manually.
  • You can connect to remote MCP servers by supplying headers for authentication when starting the agent or via the CLI.
  • If you plan to use external APIs (e.g., OpenAI), set corresponding environment variables such as OPENAI_API_KEY before running the agent.
  • The BYOM section shows how to pass any LangChain-compatible model to the agent; you can supply a model instance or a provider string like "openai:gpt-4.1" and let the library initialize it from your environment.
  • The CLI supports multiple --http blocks to query several MCP endpoints in one session; you can attach headers using header.X=Y pairs.
  • For best results, ensure your MCP servers expose a well-typed tool schema (JSON-Schema → Pydantic) so arguments are validated before tool invocation.
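To illustrate the last point, here is a toy, stdlib-only sketch of validating call arguments against a JSON-Schema-style tool description. DeepMCPAgent itself converts schemas to Pydantic models; this sketch only mirrors why a well-typed schema lets bad arguments be rejected before the tool is invoked.

```python
# Toy validator mirroring the JSON-Schema -> typed-arguments idea.
# (Illustrative only; not DeepMCPAgent's actual Pydantic-based machinery.)

TYPE_MAP = {"string": str, "number": (int, float), "integer": int, "boolean": bool}

def validate_args(schema: dict, args: dict) -> dict:
    """Check required keys and primitive types before a tool call."""
    props = schema.get("properties", {})
    for name in schema.get("required", []):
        if name not in args:
            raise ValueError(f"missing required argument: {name}")
    for name, value in args.items():
        expected = TYPE_MAP.get(props.get(name, {}).get("type"))
        if expected is not None and not isinstance(value, expected):
            raise TypeError(f"argument {name!r} must be {props[name]['type']}")
    return args

# A schema like an MCP server might expose for an `add` tool:
add_schema = {
    "type": "object",
    "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
    "required": ["a", "b"],
}

print(validate_args(add_schema, {"a": 3, "b": 4}))  # valid: returned unchanged
```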
