langgraph-ai
LangGraph AI Repository
claude mcp add --transport stdio piyushagni5-langgraph-ai python -m langgraph_ai.mcp_server \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key" \
  --env LANGCHAIN_API_KEY="optional-langchain-api-key" \
  --env LANGCHAIN_PROJECT="optional-project-name" \
  --env LANGCHAIN_TRACING_V2="optional-enabled-true-or-false"
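For clients configured through a JSON file (for example, Claude Desktop's `claude_desktop_config.json`), the same registration might look like the sketch below; the server name and environment values mirror the command above, and the exact file location depends on your client:

```json
{
  "mcpServers": {
    "piyushagni5-langgraph-ai": {
      "command": "python",
      "args": ["-m", "langgraph_ai.mcp_server"],
      "env": {
        "ANTHROPIC_API_KEY": "your-anthropic-api-key",
        "LANGCHAIN_API_KEY": "optional-langchain-api-key",
        "LANGCHAIN_PROJECT": "optional-project-name",
        "LANGCHAIN_TRACING_V2": "optional-enabled-true-or-false"
      }
    }
  }
}
```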
How to use
LangGraph AI ships an MCP (Model Context Protocol) server as part of its LangGraph tooling. The server exposes a structured, context-aware interface that coordinates model execution, retrieval-augmented generation (RAG) workflows, and agent orchestration for advanced AI tasks. With it, you can run and test MCP clients against a locally hosted endpoint and build multi-component AI pipelines that use LangGraph's RAG strategies, agent routing, and human-in-the-loop capabilities. The repository focuses on agentic RAG patterns, workflow orchestration, and MCP-based client/server interactions, so you can experiment with adaptive routing, evaluation, and optimization loops across modules.
To use the server, start the Python-based MCP server entry point defined in the project. Once running, you can connect MCP clients from your applications to perform context-aware tool calls, request execution across multiple servers or services, and exchange structured messages that include metadata, context fragments, and evaluation results. This setup supports end-to-end scenarios such as multi-hop reasoning, retrieval-augmented prompts, and orchestration patterns that coordinate multiple AI components in a single workflow.
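Under the hood, MCP messages are JSON-RPC 2.0 exchanged over the chosen transport (stdio here). As a rough illustration of the wire format — not this project's own client code — the first handshake a client sends looks roughly like this; the client name and version are placeholders:

```python
import json

def make_jsonrpc_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the envelope MCP messages use."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# The initialize handshake an MCP client sends before any tool calls.
init = make_jsonrpc_request(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
})
```

In practice you would use an MCP client SDK rather than hand-building messages; this only shows what travels over the transport.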
How to install
Prerequisites:
- Python 3.10 or higher
- Git
- Optional: UV package manager (recommended) or standard Python tooling
Step 1: Clone the repository
git clone https://github.com/piyushagni5/langgraph-ai.git
cd langgraph-ai
Step 2: Install UV (recommended) or skip to plain Python install
- macOS/Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows:
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
Step 3: Install Python dependencies (prefer UV environment if using UV)
uv venv --python 3.10
uv pip install -r requirements.txt
If not using UV, use standard Python environments:
python -m venv .venv
source .venv/bin/activate # macOS/Linux
.venv\Scripts\activate # Windows
pip install -r requirements.txt
Step 4: Run the MCP server
# From the repository root, using the configured entrypoint
uv run python -m langgraph_ai.mcp_server
Or, if not using UV:
python -m langgraph_ai.mcp_server
Step 5: Verify server is running
- Check console logs for MCP server startup messages
- Connect an MCP client to the server. With the stdio transport the client launches the server process directly; for SSE/HTTP transports, use the host and port defined by the server configuration.
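The verification step can be scripted with a generic smoke-test helper that spawns the server and watches its output for a startup line; the exact log text is an assumption, so check the real console message first:

```python
import subprocess
import sys

def wait_for_startup(cmd, marker):
    """Spawn `cmd` and return True once `marker` appears on its stdout."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    try:
        for line in proc.stdout:
            if marker in line:
                return True
        return False
    finally:
        proc.terminate()

# For the real server this would be something like:
#   wait_for_startup([sys.executable, "-m", "langgraph_ai.mcp_server"], "<startup line>")
# Demo with a stand-in process that just prints a startup message:
ok = wait_for_startup([sys.executable, "-c", "print('MCP server started')"], "started")
```

Note that stdio-transport servers often log to stderr (stdout carries the protocol), so you may need `stderr=subprocess.PIPE` and to read that stream instead.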
Additional notes
Tips and common issues:
- Ensure API keys are provided via environment variables if the server relies on external services (ANTHROPIC_API_KEY, LANGCHAIN_API_KEY).
- If you enable tracing (LANGCHAIN_TRACING_V2), ensure the tracing backend is accessible and the project key is configured.
- The MCP server may be modular with multiple submodules (e.g., server-client, SSE-based client). Refer to the specific project README under mcp/ for module-specific notes.
- When using UV, you can manage dependencies per project to avoid cross-project conflicts; consider creating per-project venvs.
- If you run into port conflicts, check your environment for running services and adjust the server configuration accordingly (port binding, host).
- For testing MCP interactions, you can use the provided tests (uv run pytest ...) to validate client-server contract behavior.
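The environment-variable checks above can be automated with a small preflight helper. The variable names follow the install command at the top; which ones are strictly required may vary by module, so the split below is an assumption:

```python
import os

REQUIRED = ["ANTHROPIC_API_KEY"]  # assumed mandatory for model calls
OPTIONAL = ["LANGCHAIN_API_KEY", "LANGCHAIN_PROJECT", "LANGCHAIN_TRACING_V2"]

def preflight(env=None):
    """Return the list of required variables missing or empty in `env`."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Example: a fully configured environment reports nothing missing.
missing = preflight({"ANTHROPIC_API_KEY": "sk-ant-example"})  # -> []
```

Running `preflight()` with no argument checks the real process environment before starting the server.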
Related MCP Servers
mcp-victoriametrics
The implementation of Model Context Protocol (MCP) server for VictoriaMetrics
MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System
This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Model Context Protocol) servers. The architecture enables a flexible and scalable multi-agent system where each tool can be hosted independently (via SSE or STDIO), offering modularity and cloud-deployable execution.
ai-learning
AI Learning: a comprehensive repository of Artificial Intelligence and Machine Learning resources, primarily using Jupyter Notebooks and Python. Explore tutorials, projects, and guides covering foundational to advanced concepts in AI, ML, DL, and Generative/Agentic AI.
AI-web mode
An intelligent conversational assistant web application based on MCP (Model Context Protocol), supporting real-time chat, tool calling, and conversation history management.
langchain-client
🦜🔗 LangChain Model Context Protocol (MCP) Client
Ai_agents
A collection of agents built on top of LangChain, LangGraph, MCP, and many other tools.