
cloudflare-grafitti

A Cloudflare Container Worker that serves as a proxy for the Graphiti MCP (Model Context Protocol) server, providing scalable, serverless access to AI agent memory capabilities through Neo4j-backed knowledge graphs.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
Run in terminal:
Command
claude mcp add --transport stdio adam-paterson-cloudflare-grafitti-mcp-server -- docker run -i \
  --env NEO4J_URI="Neo4j database connection string (e.g., neo4j://host:7687 or neo4j+s://...)" \
  --env NEO4J_USER="Neo4j username" \
  --env NEO4J_PASSWORD="Neo4j password" \
  --env OPENAI_API_KEY="OpenAI API key" \
  --env BEARER_TOKEN="Bearer token for client authentication" \
  knowledge-graph-mcp:0.4.0

How to use

This MCP server acts as a Cloudflare-proxied, serverless gateway for the Graphiti MCP stack. It runs as a container image and exposes an MCP-compliant endpoint that any MCP protocol client can address. Behind the Cloudflare Worker container sit a Neo4j-backed knowledge graph and OpenAI integration, enabling scalable, memory-enabled agents across global edge locations.

To use it, configure your MCP client to send requests to the worker endpoint. The container proxy translates and forwards each request to the underlying Graphiti MCP container, which in turn interacts with Neo4j and OpenAI to read or store agent memory. For development or testing, you can inspect MCP protocol traffic with the MCP Inspector pointed at the worker URL. Any MCP-compliant client works: it simply sends standard MCP requests to the deployed endpoint.
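As a rough sketch of what a client request to the proxy could look like, the following Python builds an MCP `initialize` message as a JSON-RPC 2.0 payload with a bearer-token header. The worker URL and token values here are hypothetical placeholders, and the assumption that the proxy accepts JSON-RPC MCP messages over HTTP with an `Authorization` header is inferred from the `BEARER_TOKEN` variable, not confirmed by the project docs:

```python
import json

# Hypothetical values -- replace with your deployed worker endpoint and token.
WORKER_URL = "https://your-worker.example.workers.dev/mcp"
BEARER_TOKEN = "your-secure-token"  # must match the container's BEARER_TOKEN


def build_mcp_request(method: str, params: dict, request_id: int = 1):
    """Build HTTP headers and a JSON-RPC 2.0 body for an MCP call."""
    headers = {
        "Authorization": f"Bearer {BEARER_TOKEN}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    }).encode("utf-8")
    return headers, body


# The standard MCP handshake message.
headers, body = build_mcp_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1.0"},
})
```

You could POST `body` with these `headers` to `WORKER_URL` using any HTTP client; in practice, an off-the-shelf MCP client or the MCP Inspector handles this framing for you.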

How to install

Prerequisites:

  • Docker installed on your machine or hosting environment
  • Access to the knowledge-graph-mcp:0.4.0 container image (publicly available or via your registry)
  • A running Neo4j database instance and an OpenAI API key (the container's environment requires both)

Installation steps:

  1. Pull and run the container image locally or in your deployment environment:

     docker run -i \
       -e NEO4J_URI=neo4j://your-neo4j-instance:7687 \
       -e NEO4J_USER=your-username \
       -e NEO4J_PASSWORD=your-password \
       -e OPENAI_API_KEY=your-openai-api-key \
       -e BEARER_TOKEN=your-secure-token \
       knowledge-graph-mcp:0.4.0

  2. Verify the MCP endpoint is reachable. You can test with an MCP Inspector or a simple MCP client against the deployed worker URL.

  3. If deploying in production behind Cloudflare Workers, configure the environment in your deployment pipeline to inject the required secrets (NEO4J_URI, NEO4J_USER, NEO4J_PASSWORD, OPENAI_API_KEY, BEARER_TOKEN) via your chosen secret management method.
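Since the steps above depend on five secrets all being injected correctly, a small startup check can fail fast when one is missing. This is a minimal sketch, not part of the project; the variable names come from the installation command above:

```python
import os
import sys

REQUIRED_VARS = ("NEO4J_URI", "NEO4J_USER", "NEO4J_PASSWORD",
                 "OPENAI_API_KEY", "BEARER_TOKEN")


def check_required_env(env) -> list:
    """Return the names of required variables that are missing or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


if __name__ == "__main__":
    missing = check_required_env(os.environ)
    if missing:
        # Exit with a non-zero status so the deployment pipeline notices.
        sys.exit("Missing required environment variables: " + ", ".join(missing))
```

Running a check like this as an entrypoint wrapper (or a pre-deploy pipeline step) surfaces misconfigured secrets immediately instead of as opaque Neo4j or OpenAI connection errors at request time.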

Additional notes

Notes and tips:

  • The container exposes a Graphiti MCP server backed by a Neo4j knowledge graph and OpenAI integration.
  • Ensure Neo4j access is secured (use proper credentials and network access) and that your OpenAI API key is kept secret.
  • The MCP protocol relies on standard request/response payloads; you can debug with the MCP Inspector to verify handshake, queries, and responses.
  • The Cloudflare deployment supports auto-scaling up to 5 containers; consider setting up proper rate limits and authentication to prevent abuse.
  • If you modify environment variables, restart the container so changes take effect.

