fastapi-mcp-langgraph-template
A modern template for agentic orchestration — built for rapid iteration and scalable deployment using highly customizable, community-supported tools like MCP, LangGraph, and more.
claude mcp add --transport stdio nicholasgoh-fastapi-mcp-langgraph-template python -m fastapi_mcp_langgraph_template \
  --env ENV="development" \
  --env DATABASE_URL="postgresql://user:password@localhost:5432/dbname" \
  --env MCP_API_BASE_URL="http://localhost:8000" \
  --env LANGGRAPH_API_KEY="your-langgraph-api-key"
How to use
This MCP server template provides a FastAPI-based implementation designed for agentic orchestration with LangGraph and MCP. It exposes a REST API that follows the MCP protocol, so inspectors and clients can interact with the server to obtain context, manage state, and drive agent workflows. The LangGraph integration supports customizable agent orchestration pipelines, a streaming UX, and persisted chat history and state. To use it, start the server (see installation) and interact with the MCP endpoints via HTTP clients or the provided SDKs. You can also use LangGraph-backed agents to compose complex workflows, combine multiple context sources, and observe LLM metrics through LangFuse-compatible tooling if you enable those features in your environment.
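For a sense of what "adhering to the MCP protocol" means on the wire: MCP sessions are exchanged as JSON-RPC 2.0 messages, and a client opens a session with an `initialize` request. The sketch below shows the shape of that message; the `protocolVersion` and `clientInfo` values are illustrative assumptions, not values taken from this template.

```python
import json

# Sketch of the MCP JSON-RPC 2.0 "initialize" request a client sends to
# begin a session. Field values here are illustrative, not template-specific.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialized payload as it would travel over stdio or HTTP to the server.
payload = json.dumps(initialize_request)
print(payload)
```

After the server responds with its own capabilities, the client sends an `initialized` notification and the normal request flow (tools, resources, prompts) begins.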
How to install
Prerequisites:
- Python 3.9+ installed on your system
- Access to a terminal/command prompt
- Optional: Docker and Docker Compose if you prefer containerized runs
Installation steps:
- Clone the repository or download the template code:
  git clone https://github.com/nicholasgoh/fastapi-mcp-langgraph-template.git
  cd fastapi-mcp-langgraph-template
- Set up a Python virtual environment and install dependencies:
  python -m venv venv
  source venv/bin/activate  # On Windows use: venv\Scripts\activate
  pip install -r requirements.txt
- Configure environment variables (example): create a .env file or export variables in your shell:
  export ENV=development
  export DATABASE_URL=postgresql://user:password@localhost:5432/dbname
  export LANGGRAPH_API_KEY=your-langgraph-api-key
  export MCP_API_BASE_URL=http://localhost:8000
- Run the server (development mode):
  uvicorn fastapi_mcp_langgraph_template.main:app --reload --host 0.0.0.0 --port 8000
Optional containerized run with Docker Compose (if provided in the repo):
  docker compose -f docker-compose.yaml up --build
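The environment variables from the configuration step can be read with the standard library alone. This sketch mirrors the example values above; the `parse_database_url` helper is hypothetical, shown only to illustrate how the pieces of DATABASE_URL map to connection parameters.

```python
import os
from urllib.parse import urlparse

# Defaults mirror the example configuration; in a real deployment these
# come from the shell, a .env loader, or the container orchestrator.
os.environ.setdefault("ENV", "development")
os.environ.setdefault(
    "DATABASE_URL", "postgresql://user:password@localhost:5432/dbname"
)

def parse_database_url(url: str) -> dict:
    """Split a DATABASE_URL into connection parameters (hypothetical helper)."""
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,
        "user": parts.username,
        "password": parts.password,
        "host": parts.hostname,
        "port": parts.port,
        "dbname": parts.path.lstrip("/"),
    }

db = parse_database_url(os.environ["DATABASE_URL"])
print(db["host"], db["port"], db["dbname"])
```

Parsing the URL once at startup and failing fast on a malformed value makes misconfiguration visible immediately, rather than surfacing later as an opaque driver error.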
Notes:
- If you are using a database, ensure the database server is running and accessible via DATABASE_URL.
- The server exposes MCP endpoints; you can use MCP client libraries to interact according to the MCP protocol.
- For production deployments, consider configuring a reverse proxy (Nginx) and a process manager (gunicorn/uvicorn workers) as appropriate.
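As a rough illustration of interacting with MCP endpoints (the protocol's generic shape, not this template's specific API), a client enumerates the server's tools with a JSON-RPC `tools/list` request and reads tool names from the result. The response below is a fabricated example used only to demonstrate parsing.

```python
import json

# JSON-RPC 2.0 request an MCP client sends to enumerate the server's tools.
tools_request = json.dumps({"jsonrpc": "2.0", "id": 2, "method": "tools/list"})

# A hypothetical response, shown only to illustrate the parsing step; a real
# client would receive this over stdio or HTTP from the running server.
tools_response = json.dumps({
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {"name": "query_database", "description": "Run a SQL query"},
        ]
    },
})

# Extract tool names from the result envelope.
tool_names = [t["name"] for t in json.loads(tools_response)["result"]["tools"]]
print(tool_names)
```

MCP client libraries wrap this request/response cycle for you, but knowing the underlying message shape helps when debugging with raw HTTP tools or the MCP Inspector.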
Additional notes
Tips and common issues:
- Ensure the Python environment matches the project’s required versions (Python 3.9+).
- If LangGraph integration is optional in your setup, you can disable or skip related features via environment configuration.
- When using Docker, map ports appropriately and set DATABASE_URL and API keys as secrets or environment variables in your orchestrator.
- If you encounter MCP protocol negotiation errors, verify that the Inspector and server versions are compatible and that the MCP handshake endpoints are reachable.
- Check logs for database connection errors; these are a common startup failure when DATABASE_URL is not configured correctly.
- Consider enabling reloads during development to reflect code changes quickly, and switch to a production ASGI server setup for deployments.
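When debugging the database-connection failures mentioned above, a quick TCP reachability check against the host and port from DATABASE_URL can separate network problems (wrong host, closed port, firewall) from credential problems. This helper is an illustrative sketch, not part of the template.

```python
import socket
from urllib.parse import urlparse

def database_reachable(database_url: str, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the database host/port succeeds."""
    parts = urlparse(database_url)
    host = parts.hostname or "localhost"
    port = parts.port or 5432  # default PostgreSQL port
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures alike.
        return False

# 192.0.2.1 is a reserved documentation address (TEST-NET-1), so this
# probe is expected to fail rather than reach a real database.
print(database_reachable("postgresql://user:password@192.0.2.1:5432/dbname", timeout=0.5))
```

If the probe succeeds but the application still fails to start, the problem is more likely credentials, the database name, or the driver, not the network.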
Related MCP Servers
mcp-langfuse
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
k6
k6 MCP server
mcp-raganything
API/MCP wrapper for RagAnything
docker-swarm
MCP server for Docker Swarm orchestration using FastAPI and Docker SDK
posebusters
Unofficial MCP server for PoseBusters – validate molecular poses via HTTP or Spaces using the Model Context Protocol (MCP).
agentxsuite
AgentxSuite is an open-source platform to connect, manage, and monitor AI Agents and Tools across multiple MCP servers — in one unified interface.