
fastapi-mcp-langgraph-template

A modern template for agentic orchestration, designed for rapid iteration and scalable deployment, built on highly customizable, community-supported tools such as MCP and LangGraph.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio rahulsamant37-fastapi-mcp-langgraph-template python -m fastapi_mcp_langgraph_template \
  --env POSTGRES_DSN="postgresql://postgres:password@localhost:5432/dbname" \
  --env LANGFUSE_HOST="https://cloud.langfuse.com" \
  --env OPENAI_API_KEY="Your OpenAI API key" \
  --env LANGFUSE_PUBLIC_KEY="Your LangFuse public key" \
  --env LANGFUSE_SECRET_KEY="Your LangFuse secret key"

How to use

This MCP server template provides a modern backend built with FastAPI to enable agentic orchestration using LangGraph and MCP. It exposes an API-based server that can interact with a database (via SQLModel/ORM) and integrate observability and context propagation through the MCP protocol. With LangGraph, you get native support for streaming, persistent chat history, and state management to orchestrate complex agent workflows. After starting the server, you can use the MCP client to send context, queries, and control messages to agents, inspect their state, and retrieve derived observations or actions. The included architecture supports a reverse proxy and a modular template setup, allowing you to plug in additional servers (e.g., custom agents or data sources) while maintaining a consistent MCP interface.
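To make the MCP interaction concrete, the sketch below builds the JSON-RPC 2.0 messages a client sends over the stdio transport: an `initialize` request to open the session, then a `tools/list` request to enumerate the server's tools. This is a minimal illustration of the protocol shapes defined by the MCP specification, not code from this template; the protocol version string and client name are assumptions you should adjust to match your server.

```python
import json

# An MCP protocol revision date; adjust to the version your server reports.
PROTOCOL_VERSION = "2024-11-05"


def make_initialize_request(request_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 `initialize` request that opens an MCP session."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": PROTOCOL_VERSION,
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }


def make_tools_list_request(request_id: int = 2) -> dict:
    """Build the request asking the server to enumerate its tools."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list"}


def encode(message: dict) -> bytes:
    # The stdio transport frames each JSON-RPC message as one
    # newline-terminated line of JSON.
    return (json.dumps(message) + "\n").encode("utf-8")


if __name__ == "__main__":
    # In practice you would write these frames to the stdin of
    # `python -m fastapi_mcp_langgraph_template` and read replies from stdout.
    print(encode(make_initialize_request()))
```

In day-to-day use an MCP client library or the MCP Inspector handles this framing for you; the raw messages are shown here only to clarify what travels over the stdio pipe.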

How to install

Prerequisites:

  • Python 3.9+ installed on your system
  • Git installed
  • Optional: Docker and Docker Compose for local development and deployment
  1. Clone the template repository

    git clone https://github.com/nicholasgoh/fastapi-mcp-langgraph-template.git
    cd fastapi-mcp-langgraph-template

  2. (Recommended) Set up a Python virtual environment and install dependencies

    python -m venv venv
    source venv/bin/activate   # on Windows use venv\Scripts\activate
    pip install --upgrade pip
    if [ -f requirements.txt ]; then pip install -r requirements.txt; fi

    If a different dependency file exists (e.g., pyproject.toml), adjust accordingly.

  3. Configure environment variables

    Create a .env file or export the required variables, for example:

    OPENAI_API_KEY=sk-...
    POSTGRES_DSN=postgresql://postgres:password@localhost:5432/dbname
    LANGFUSE_PUBLIC_KEY=pk-...
    LANGFUSE_SECRET_KEY=sk-...
    LANGFUSE_HOST=https://cloud.langfuse.com
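A quick sanity check before starting the server can save a confusing stack trace later. The sketch below validates that the variables listed above are present; the helper name is ours, not part of the template.

```python
import os

# The variables required by this template, per the configuration step above.
REQUIRED_VARS = (
    "OPENAI_API_KEY",
    "POSTGRES_DSN",
    "LANGFUSE_PUBLIC_KEY",
    "LANGFUSE_SECRET_KEY",
    "LANGFUSE_HOST",
)


def missing_vars(env=os.environ) -> list:
    """Return the required variables that are unset or empty, sorted by name."""
    return sorted(name for name in REQUIRED_VARS if not env.get(name))


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print("Missing environment variables:", ", ".join(missing))
    else:
        print("All required environment variables are set.")
```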

  4. Run the server

    Using Python module execution (as per mcp_config)

    python -m fastapi_mcp_langgraph_template

    or launch directly with uvicorn (adjust the module path to the app's actual entry point):

    uvicorn fastapi_mcp_langgraph_template:app --reload --host 0.0.0.0 --port 8000
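Once the server is up, you can confirm it is accepting connections before wiring up a client. This is a generic TCP reachability check using only the standard library; the host and port mirror the uvicorn invocation above and are assumptions if you changed them.

```python
import socket


def server_listening(host: str = "127.0.0.1", port: int = 8000,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP listener accepts connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False


if __name__ == "__main__":
    print("server up:", server_listening())
```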

  5. Optional: start development services with Docker Compose

    docker compose -f docker-compose.yml up -d
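If the repository's compose file differs or you are assembling your own, a minimal layout pairs the API with a PostgreSQL service. Everything below (service names, image tag, ports, build context) is an illustrative sketch, not the template's actual file:

```yaml
services:
  api:
    build: .
    ports:
      - "8000:8000"
    env_file: .env
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_DB: dbname
    ports:
      - "5432:5432"
```

Note that inside the compose network the DSN host becomes the service name (e.g., `db`), not `localhost`.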

  6. Validate MCP readiness

    • Ensure the inspector can communicate with the server via the MCP protocol
    • Verify that the LangGraph integration is active and capable of streaming
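One way to validate readiness programmatically is to inspect the server's `initialize` response. The helpers below check the response shape the MCP specification defines (a `result` carrying `protocolVersion` and `capabilities`); the example payload's values are illustrative, not captured from this template.

```python
def handshake_ok(response: dict) -> bool:
    """Check a JSON-RPC `initialize` response for a usable MCP result."""
    result = response.get("result") or {}
    return ("error" not in response
            and "protocolVersion" in result
            and "capabilities" in result)


def supports_tools(response: dict) -> bool:
    """True if the server advertised the `tools` capability during initialize."""
    return "tools" in (response.get("result", {}).get("capabilities") or {})


# Example response shape (field names per the MCP spec; values illustrative).
EXAMPLE = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "protocolVersion": "2024-11-05",
        "capabilities": {"tools": {}},
        "serverInfo": {"name": "fastapi-mcp-langgraph-template",
                       "version": "0.1.0"},
    },
}
```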

Additional notes

Tip: Keep environment variables in a secure .env file and do not commit them.

  • The template relies on a SQL database (via SQLModel/SQLAlchemy) and LangFuse for observability; ensure your database is reachable and properly configured.
  • If you encounter import or path errors, verify that the Python module name used in the mcp_config matches the actual entry point of the FastAPI app.
  • For local development, Docker Compose can simplify starting the API server, database, and any required auxiliary services together.
  • If you run into protocol handshake issues, check the MCP Inspector connection status.
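When the database is not reachable, the first thing to confirm is that the DSN points where you think it does. A small standard-library helper can split it apart for inspection; the function name is ours, and the DSN shown is the placeholder from the configuration step:

```python
from urllib.parse import urlsplit


def dsn_parts(dsn: str) -> dict:
    """Split a PostgreSQL DSN into the pieces needed for a connectivity check."""
    parts = urlsplit(dsn)
    return {
        "host": parts.hostname,
        "port": parts.port or 5432,  # default PostgreSQL port when omitted
        "user": parts.username,
        "database": parts.path.lstrip("/"),
    }


if __name__ == "__main__":
    print(dsn_parts("postgresql://postgres:password@localhost:5432/dbname"))
```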
