mcp-neuralmemory
Persistent Knowledge Graph Memory for AI Coding Agents. Adds long-term context (goals, strategies, preferences) to Cursor, Windsurf, & VS Code via MCP.
```bash
claude mcp add --transport stdio hexecu-mcp-neuralmemory python -m kg_mcp --transport stdio \
  --env LLM_MODE="gemini_direct" \
  --env LOG_LEVEL="INFO" \
  --env NEO4J_URI="bolt://127.0.0.1:7687" \
  --env NEO4J_USER="neo4j" \
  --env KG_MCP_TOKEN="your-secure-token" \
  --env GEMINI_API_KEY="YOUR_GOOGLE_AI_STUDIO_KEY" \
  --env NEO4J_PASSWORD="YOUR_NEO4J_PASSWORD"
```
How to use
kg-memory is a persistent memory layer for AI coding agents built on the MCP protocol. It maintains a Knowledge Graph of your project to provide context across sessions, tracking goals, constraints, strategies, and user preferences while linking code relationships to specific files.

The server exposes tools for active context injection and semantic graph search, so your agents can retrieve relevant past attempts, successful approaches, and coding guidelines before making decisions. Use kg_autopilot to automatically pull relevant context and past outcomes for your current task, and kg_track_changes to ensure file modifications are reflected in the Knowledge Graph. The system supports semantic traversal (k-hop search) to find connected context, such as related user models, utilities, and security constraints, rather than relying on simple keyword matching.

Depending on your editor setup, configure the MCP bridge (mcp_config.json) to point to kg-memory so your editor can request context during development and maintain a cohesive memory across sessions.
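A minimal sketch of an mcp_config.json entry pointing an editor at the server. The exact schema varies by editor, and the `mcpServers` key shown here is an assumption, not taken from the project docs; the environment variable names come from the install command above.

```json
{
  "mcpServers": {
    "kg-memory": {
      "command": "python",
      "args": ["-m", "kg_mcp", "--transport", "stdio"],
      "env": {
        "LLM_MODE": "gemini_direct",
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "YOUR_NEO4J_PASSWORD",
        "GEMINI_API_KEY": "YOUR_GOOGLE_AI_STUDIO_KEY",
        "KG_MCP_TOKEN": "your-secure-token"
      }
    }
  }
}
```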
How to install
Prerequisites:
- Python 3.11+ installed on your system
- Docker (for running Neo4j locally, if desired)
- Gemini API Key from Google AI Studio
Option A: One-Line Install (Recommended)
```bash
pipx install kg-mcp
kg-mcp-setup
```
The setup wizard will guide you through:
- Verifying Docker and Neo4j availability
- Providing your Gemini API Key
- Choosing the LLM mode (Direct vs LiteLLM)
- Creating a secure .env configuration
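The wizard's output can be sketched as a .env file like the following; the variable names are taken from the `claude mcp add` command above, and all values are placeholders:

```bash
# Example .env (values are placeholders; substitute your own credentials)
LLM_MODE="gemini_direct"
LOG_LEVEL="INFO"
NEO4J_URI="bolt://127.0.0.1:7687"
NEO4J_USER="neo4j"
NEO4J_PASSWORD="YOUR_NEO4J_PASSWORD"
GEMINI_API_KEY="YOUR_GOOGLE_AI_STUDIO_KEY"
KG_MCP_TOKEN="your-secure-token"
```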
If you do not have pipx installed, install it first (`python -m pip install --user pipx && pipx ensurepath`) or use the standard pip path:
Option B: Standard Pip (no pipx)
```bash
pip install kg-mcp
kg-mcp-setup
```
Option C: Manual Development Setup
```bash
# Clone the repository
git clone https://github.com/Hexecu/mcp-neuralmemory.git
cd mcp-neuralmemory

# Set up environment
cp .env.example .env
# (Edit .env with your credentials)

# Install dependencies
cd server
pip install -e .

# Start Neo4j (if using a local graph database)
docker compose up -d

# Initialize the schema
python -m kg_mcp.kg.apply_schema
```
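For reference, a local Neo4j service for the `docker compose up -d` step can look roughly like this. This is a generic sketch, not the repository's actual docker-compose.yml; the image tag, ports, and password are placeholders:

```yaml
# Minimal Neo4j service sketch (generic example; adjust to match the repo)
services:
  neo4j:
    image: neo4j:5
    ports:
      - "7687:7687"   # bolt, used by the MCP server
      - "7474:7474"   # browser UI
    environment:
      NEO4J_AUTH: neo4j/YOUR_NEO4J_PASSWORD
    volumes:
      - neo4j_data:/data
volumes:
  neo4j_data:
```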
Prerequisites in more detail:
- Python 3.11+ installed
- A running Neo4j instance (local via Docker or remote)
- Gemini API Key for the LLM access
- Network access for API-based LLM calls
Additional notes
Set the environment variables in mcp_config.json to secure values: replace placeholders such as YOUR_NEO4J_PASSWORD and YOUR_GOOGLE_AI_STUDIO_KEY with your actual credentials. If you use a remote Neo4j, ensure network access to the bolt port (7687). When using the Antigravity IDE or similar editors, you may need to adapt mcp_config.json to point to your Python virtual environment path. If you encounter connectivity issues, verify that Docker is running (for local Neo4j) and that your Gemini API Key is valid. The KG_MCP_TOKEN restricts access to the MCP server; rotate it periodically. For development, enable verbose logging (LOG_LEVEL=DEBUG) to troubleshoot tool calls and graph queries.
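One way to generate a strong KG_MCP_TOKEN is with standard tooling; this is a generic sketch using openssl, since the server does not appear to mandate a particular token format:

```shell
# Generate a 32-byte random token, hex-encoded (64 characters)
KG_MCP_TOKEN=$(openssl rand -hex 32)
echo "${#KG_MCP_TOKEN}"   # prints 64
```

Store the value in your .env (or secret manager) rather than committing it to the repository.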
Related MCP Servers
seline
Seline is a local-first AI desktop application that brings together conversational AI, visual generation tools, vector search, and multi-channel connectivity in one place.
mem0
✨ mem0 MCP Server: a memory system using mem0 for AI applications with Model Context Protocol (MCP) integration. Enables long-term memory for AI agents as a drop-in MCP server.
omega-memory
Persistent memory for AI coding agents
browserai
A powerful Model Context Protocol (MCP) server that provides access to a serverless browser for AI agents and apps
serper
A Serper MCP Server
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.