
Graphiti MCP Demo

This repo demonstrates Graphiti MCP integration with Cursor and Neo4j, storing AI prompts as real-time knowledge graphs for persistent memory.

Installation
Run this command in your terminal to add the MCP server to Claude Code. Note that the --env flags belong to claude mcp add and must come before the -- separator that introduces the server command:

claude mcp add --transport stdio kartikk-26-graphiti-mcp-demo \
  --env NEO4J_URI="bolt://localhost:7687" \
  --env NEO4J_USER="neo4j" \
  --env NEO4J_PASSWORD="demodemo" \
  --env OPENAI_API_KEY="<your_openai_api_key>" \
  --env MODEL_NAME="gpt-4.1-mini" \
  -- docker compose up

How to use

This MCP server (Graphiti MCP Demo) enables AI agents to discover and utilize tools that interact with Graphiti memory stored in Neo4j. It acts as a bridge between MCP clients (like Cursor or Claude) and the Graphiti memory graph, allowing the agent to query, store, and retrieve contextual data across episodes, entities, and communities. The server supports tool calls such as add_episode, search_nodes, and clear_graph to manage the knowledge graph, enabling context continuity and persistent memory for AI agents. As a result, agents can select optimal tools for a given query, retrieve relevant context from Neo4j, and persist enriched interactions back into the graph.
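Under the hood, tool calls like these travel as MCP `tools/call` JSON-RPC requests from the client to the server. The sketch below builds such a request by hand to show its shape; the tool name comes from the list above, but the argument keys are illustrative assumptions, not the server's exact schema:

```python
import json

def tool_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call JSON-RPC request as a JSON string.

    In practice an MCP client library constructs and sends this for you;
    this helper only illustrates the wire format.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# "query" is an assumed argument key for illustration.
print(tool_call("search_nodes", {"query": "project deadlines"}))
```

A real client (Cursor, Claude) discovers the available tools via `tools/list` first and fills in arguments according to each tool's declared input schema.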

How to install

Prerequisites:

  • Docker (recommended) or Python with uv (uvx) installed
  • Node.js is not required for this demo
  • Git installed
  • Access to a terminal

Installation steps:

  1. Clone the repository and navigate to the mcp_server directory:
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server
  2. Prepare environment variables in a .env file or export them in your shell. Example:
# Neo4j
export NEO4J_URI=bolt://localhost:7687
export NEO4J_USER=neo4j
export NEO4J_PASSWORD=demodemo

# OpenAI
export OPENAI_API_KEY=<your_openai_api_key>
export MODEL_NAME=gpt-4.1-mini
  3. If you plan to run locally with Python, install Python and uv (which provides uvx) by following the uv installation instructions; uvx then manages the server's dependencies for you.
  4. Run the server via Docker (recommended):
docker compose up

Or run with Python/uvx for debugging:

uv run graphiti_mcp_server.py --model gpt-4.1-mini --transport sse
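Before starting the server either way, it can help to sanity-check the exported connection settings. A minimal sketch using only the standard library (the helper name and the set of accepted URI schemes are our assumptions):

```python
import os
from urllib.parse import urlparse

def check_neo4j_uri(uri: str):
    """Parse a Neo4j URI and return (scheme, host, port), rejecting odd schemes."""
    parsed = urlparse(uri)
    if parsed.scheme not in ("bolt", "bolt+s", "neo4j", "neo4j+s"):
        raise ValueError(f"unexpected Neo4j URI scheme: {parsed.scheme!r}")
    # Fall back to the default Bolt port when none is given.
    return parsed.scheme, parsed.hostname, parsed.port or 7687

# Uses the same default as the docs above if NEO4J_URI is unset.
print(check_neo4j_uri(os.environ.get("NEO4J_URI", "bolt://localhost:7687")))
```

A malformed or misspelled URI fails here immediately, which is easier to diagnose than a connection timeout from the server.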

Notes:

  • Ensure Neo4j is running and accessible at the URI you configure.
  • Replace placeholder keys with real credentials before starting.

Additional notes

Tips and common issues:

  • If you encounter connection issues to Neo4j, verify NEO4J_URI, NEO4J_USER, and NEO4J_PASSWORD are correct and that Neo4j is running.
  • The model name may need to be compatible with your OpenAI API access; update MODEL_NAME accordingly.
  • When using Docker, ensure you have enough resources allocated (CPU/RAM) for the Neo4j graph and model inference.
  • For debugging, running the Python/uvx path can help you view SSE output and live console logs; use the --transport flag to switch transport methods as needed.
  • The MCP clients (Cursor, Claude) expect the Graphiti MCP server to expose endpoints like /sse; configure client mcp.json or claude_desktop_config.json accordingly.
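Following the last point, a client-side mcp.json entry for the SSE transport might look like the fragment below. The server name, port (8000), and endpoint path are assumptions based on the server's defaults; adjust them to match your setup:

```json
{
  "mcpServers": {
    "graphiti-memory": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```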
