
langgraph-mcp-agents

LangGraph-powered ReAct agent with Model Context Protocol (MCP) integration. A Streamlit web interface for dynamically configuring, deploying, and interacting with AI agents capable of accessing various data sources and APIs through MCP tools.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio teddynote-lab-langgraph-mcp-agents uvx LangGraph-MCP-Agents \
  --env USER_ID="admin (if USE_LOGIN is true)" \
  --env USE_LOGIN="true|false (optional, default false)" \
  --env USER_PASSWORD="admin123 (if USE_LOGIN is true)" \
  --env OPENAI_API_KEY="your_openai_api_key (optional)" \
  --env ANTHROPIC_API_KEY="your_anthropic_api_key (optional)" \
  --env LANGSMITH_API_KEY="your_langsmith_api_key (optional)" \
  --env LANGSMITH_PROJECT="LangGraph-MCP-Agents (optional)" \
  --env LANGSMITH_TRACING="true|false (optional)" \
  --env LANGSMITH_ENDPOINT="https://api.smith.langchain.com (optional)"

How to use

LangGraph Agents + MCP provides a Streamlit-based UI that lets you configure and run MCP-enabled tools alongside a ReAct agent. The app exposes a dynamic tool-management interface where you can add, remove, and configure MCP tools in Smithery JSON format, then run the agent that orchestrates those tools via MCP.

To get started, install the Python package for LangGraph MCP Agents, set up API keys in a .env file, and run the application through the uvx entry point. You can then open the web UI, select tools, apply the configuration, and interact with the agent in the chat interface. The system streams tool calls and agent responses in real time, and you can review conversation history to fine-tune tool usage. The project emphasizes modular MCP tool integration through Smithery-compatible configurations, enabling flexible tool composition without restarting the server.
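As an illustration of the Smithery JSON format mentioned above, a stdio tool entry might look like the sketch below. The server name and script path are hypothetical, and the exact schema may differ slightly from the in-app example, so treat this as a shape to adapt rather than a definitive template:

```json
{
  "get_current_time": {
    "command": "python",
    "args": ["./mcp_server_time.py"],
    "transport": "stdio"
  }
}
```

Paste a block like this into the tool-management panel and apply the configuration to make the tool available to the agent without restarting the server.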

How to install

Prerequisites: Python 3.12+ installed, access to the repository, and, optionally, Docker if you prefer containerized deployment.

Direct install from source (uvx):

  1. Clone the repository:

    git clone https://github.com/teddylee777/langgraph-mcp-agents.git
    cd langgraph-mcp-agents

  2. Create and activate a Python virtual environment, then install dependencies:

    uv venv
    source .venv/bin/activate  # On Windows: .venv\Scripts\activate
    uv pip install -r requirements.txt

  3. Create and populate .env with your API keys (copy from .env.example):

    cp .env.example .env

    Edit .env to include the keys you need, e.g. for OpenAI, Anthropic, or LangSmith.

  4. Install the uvx runner (if not already installed) and run the MCP server:

    Example of running via uvx (the package name might be LangGraph-MCP-Agents; substitute the correct distribution name if it differs):

    uvx LangGraph-MCP-Agents

  5. Access the UI in your browser (default port 8585) and begin configuring MCP tools through the Streamlit interface.
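The environment variables listed in the installation command above can serve as a starting point for step 3. A minimal .env sketch (values shown are placeholders; supply only the keys for services you actually use):

```
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
LANGSMITH_API_KEY=your_langsmith_api_key
LANGSMITH_PROJECT=LangGraph-MCP-Agents
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
USE_LOGIN=false
```

If you enable USE_LOGIN, also set USER_ID and USER_PASSWORD as described in the notes below.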

If you prefer Docker: follow the Quick Start Docker instructions in the README to run the pre-built containers and point them to your .env file.
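The compose file shipped with the repository is authoritative; as a rough sketch of the shape such a setup takes (the image tag is an assumption, not taken from the README):

```yaml
services:
  app:
    image: langgraph-mcp-agents:latest  # hypothetical tag; use the one from the README
    env_file:
      - .env                            # API keys loaded at container start
    ports:
      - "8585:8585"                     # Streamlit UI on the default port noted above
```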

Prerequisites recap:

  • Python 3.12+ (or a compatible environment)
  • Internet access to install dependencies
  • Optional API keys for services you plan to use (OpenAI, Anthropic, LangSmith)
  • The uvx runner, or Docker, for deployment

Additional notes

Tips and caveats:

  • Not all API keys are required; only supply those you intend to use.
  • Use the login feature by setting USE_LOGIN=true and providing USER_ID and USER_PASSWORD if your deployment requires authentication.
  • The UI enables dynamic tool management without restarting the server; add tools via Smithery JSON and apply changes to update the agent configuration.
  • If you run locally, ensure the .env file is correctly referenced by the uvx environment so that API keys are loaded at runtime.
  • If you encounter port or networking issues, adjust the Streamlit or server configuration accordingly and ensure environment variables do not collide with other services.
  • Refer to the Hands-on tutorial notebook MCP-HandsOn-KOR.ipynb for deep-dive examples of MCP client/server setup, RAG integration, and mixed transport methods.
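For the port-collision case mentioned in the tips above, Streamlit's standard configuration can move the UI off the default port without code changes. These are stock Streamlit options, not project-specific settings, and assume you are launching the Streamlit app directly:

```toml
# .streamlit/config.toml
[server]
port = 8586
```

Equivalently, pass --server.port 8586 on the command line when invoking streamlit run.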
