

Graphiti MCP Server

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio gifflet-graphiti-mcp-server \
  --env NEO4J_URI="bolt://neo4j:7687" \
  --env NEO4J_USER="neo4j" \
  --env NEO4J_PASSWORD="demodemo" \
  --env OPENAI_API_KEY="your_openai_api_key_here" \
  --env MODEL_NAME="gpt-4.1-mini" \
  -- uv run graphiti_mcp_server.py
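
For clients configured through a JSON MCP file instead of the claude CLI (Cursor, mentioned in the notes below, is one), an equivalent stdio entry might look like the sketch below. The server name key and file location vary by client, so treat this as illustrative rather than the project's canonical config:

```json
{
  "mcpServers": {
    "graphiti": {
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py"],
      "env": {
        "NEO4J_URI": "bolt://neo4j:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "demodemo",
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "MODEL_NAME": "gpt-4.1-mini"
      }
    }
  }
}
```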

How to use

Graphiti MCP Server exposes a knowledge-graph-powered MCP endpoint that integrates Neo4j with OpenAI models. It supports the standard MCP transports (SSE, WebSocket, and stdio), so clients such as Cursor and other MCP-capable agents can request contextual data, run queries, and obtain LLM-assisted responses within a graph-based context. The server runs via the uv wrapper, which executes the Python script graphiti_mcp_server.py, and can also be deployed with Docker as part of a Docker Compose setup. A typical workflow is: start the server, connect through an MCP endpoint (e.g., http://localhost:8000/sse for Server-Sent Events or ws://localhost:8000/ws for WebSocket), then query or push context to the Neo4j-backed knowledge graph. The integration supports configurable OpenAI models, embedding models, and optional Azure OpenAI settings.
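
The endpoint layout above can be probed from the command line. The helper below simply assembles the URLs quoted in this section (localhost, port 8000, and the /sse and /ws paths are the defaults mentioned above); the curl line at the end assumes the server is already running:

```shell
#!/bin/sh
# Build a Graphiti MCP endpoint URL for a given transport.
# Defaults (localhost:8000, /sse, /ws) match the endpoints quoted above.
mcp_endpoint() {
  transport="$1"
  host="${MCP_HOST:-localhost}"
  port="${MCP_PORT:-8000}"
  case "$transport" in
    sse) echo "http://$host:$port/sse" ;;
    ws)  echo "ws://$host:$port/ws" ;;
    *)   echo "unknown transport: $transport" >&2; return 1 ;;
  esac
}

mcp_endpoint sse   # -> http://localhost:8000/sse
mcp_endpoint ws    # -> ws://localhost:8000/ws

# With a running server, open the SSE stream (Ctrl-C to stop):
# curl -N "$(mcp_endpoint sse)"
```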

How to install

Prerequisites:

  • Docker and Docker Compose (for containerized deployment)
  • Python 3.10 or higher
  • OpenAI API key (or Azure OpenAI credentials if using Azure)
  • Minimum 4 GB RAM (8 GB recommended)
  • 2 GB free disk space
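
A quick way to sanity-check the Python prerequisite is a POSIX version comparison. The helper below is a generic sketch (not part of the project) that reports whether the installed interpreter meets the 3.10 minimum; it relies on `sort -V` for version ordering:

```shell
#!/bin/sh
# version_ge A B -> succeeds when version A >= version B.
# Uses `sort -V` (GNU coreutils / BusyBox) for version-aware ordering.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

required="3.10"
installed="$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])')"
if version_ge "$installed" "$required"; then
  echo "Python $installed OK (>= $required)"
else
  echo "Python $installed is too old, need >= $required" >&2
fi
```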

Installation steps:

  1. Clone the repository:

    git clone https://github.com/gifflet/graphiti-mcp-server.git
    cd graphiti-mcp-server

  2. Install or prepare the runtime:

    • Install uv (recommended):

      curl -LsSf https://astral.sh/uv/install.sh | sh
      uv sync
    • Or install Python dependencies directly: pip install -r requirements.txt
  3. Prepare environment variables (example): cp .env.sample .env

    Edit .env with your configuration; at minimum, set OPENAI_API_KEY and the Neo4j settings.

    Example values inside .env:

    OPENAI_API_KEY=your_openai_api_key
    MODEL_NAME=gpt-4.1-mini
    NEO4J_URI=bolt://neo4j:7687
    NEO4J_USER=neo4j
    NEO4J_PASSWORD=demodemo

  4. Run the MCP server (via uv, as shown in mcp_config): uv run graphiti_mcp_server.py

    Or use the single-command Docker-based workflow:

    docker compose up -d
  5. Verify the server is listening and healthy: curl http://localhost:8000/health

    Check MCP endpoint availability:

    curl http://localhost:8000/sse

  6. Optional: if using Docker Compose, start the services and check their status:

    docker compose up -d
    docker compose ps
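
After `docker compose up -d`, the services usually need a few seconds before the health endpoint responds. The small retry helper below is a generic sketch; the endpoint URL and timing are assumptions based on the defaults above:

```shell
#!/bin/sh
# retry N CMD... : run CMD up to N times, sleeping 2s between attempts.
retry() {
  attempts="$1"; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@"; then return 0; fi
    i=$((i + 1))
    sleep 2
  done
  return 1
}

# Wait up to ~30s for the server to report healthy:
# retry 15 curl -fsS http://localhost:8000/health
```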

Additional notes

Tips and notes:

  • The server uses Neo4j as the knowledge graph backend; ensure the Neo4j service is accessible at the configured URI and credentials.
  • SEMAPHORE_LIMIT controls concurrent LLM calls; adjust based on rate limits and throughput needs.
  • OPENAI_BASE_URL can be used to point the OpenAI client to a proxy or custom endpoint.
  • If you plan to deploy Azure OpenAI, provide the AZURE_OPENAI_* settings and avoid mixing with standard OpenAI configuration.
  • For Cursor IDE integration, you can specify the transport and environment in the Graphiti integration examples, including using Docker-based endpoints (http://localhost:8000/sse).
  • If you encounter Docker-related issues, ensure required ports (7474, 7687, 8000) are reachable and adequate memory is allocated to Neo4j within Docker.
  • Logs can be inspected via docker compose logs -f graphiti-mcp or by tailing the container logs in uv-based runs for debugging.
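
The tuning knobs mentioned above can live alongside the required settings in .env. The values below are illustrative assumptions, not recommendations from the project, and the AZURE_OPENAI_* variable names are examples of the pattern rather than a confirmed list:

```shell
# Optional tuning in .env (illustrative values)
SEMAPHORE_LIMIT=10                              # cap on concurrent LLM calls
OPENAI_BASE_URL=https://proxy.example.com/v1    # route OpenAI traffic via a proxy or custom endpoint

# Azure OpenAI (use instead of the standard OpenAI settings; do not mix the two)
# AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
# AZURE_OPENAI_API_KEY=your_azure_key
# AZURE_OPENAI_DEPLOYMENT_NAME=your_deployment
```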
