
NeoCoder-neo4j-ai-workflow

An MCP server that lets AI assistants use a Neo4j knowledge graph as their primary, dynamic instruction manual and long-term project memory, with adaptive templating and autonomous tool-development capabilities.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio \
  --env LOG_LEVEL="INFO" \
  --env NEO4J_URL="bolt://localhost:7687" \
  --env MCP_TRANSPORT="stdio" \
  --env NEO4J_DATABASE="neo4j" \
  --env NEO4J_PASSWORD="your-neo4j-password-here" \
  --env NEO4J_USERNAME="neo4j" \
  --env PYTHONUNBUFFERED="1" \
  angrysky56-neocoder-neo4j-ai-workflow \
  -- uvx --directory /path/to/NeoCoder-neo4j-ai-workflow run mcp_neocoder

How to use

NeoCoder-neo4j-ai-workflow is an MCP server that ties a Neo4j knowledge graph to a Claude-style AI assistant, enabling context-enhanced coding workflows. It orchestrates data across a Neo4j graph, a Qdrant vector store, and other knowledge sources to provide structured reasoning, workflow templates, and provenance-aware insights. The server exposes a configurable MCP entry point named neocoder, which you run via the MCP runner (uvx in this setup) to start the Python-based MCP module that bridges the graph with AI-driven tasks. Users can leverage the system to query code-centric workflows, extract and synthesize knowledge with source-attribution, and route queries to the most appropriate data source (graph or vector store).
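The environment variables from the install command above drive the server's Neo4j connection. As a hedged sketch (the actual `mcp_neocoder` module defines its own configuration loading; `load_neo4j_settings` is a hypothetical helper, not part of the project), the mapping from environment to driver settings looks roughly like this:

```python
import os

def load_neo4j_settings(env=os.environ):
    """Hypothetical helper: map the install command's environment
    variables onto a Neo4j connection config, with the same defaults."""
    return {
        "url": env.get("NEO4J_URL", "bolt://localhost:7687"),
        "username": env.get("NEO4J_USERNAME", "neo4j"),
        "password": env.get("NEO4J_PASSWORD", ""),
        "database": env.get("NEO4J_DATABASE", "neo4j"),
    }

# Only NEO4J_PASSWORD is set here, so the other values fall back to defaults.
settings = load_neo4j_settings({"NEO4J_PASSWORD": "secret"})
print(settings["url"])       # bolt://localhost:7687
print(settings["database"])  # neo4j
```

Any variable you omit falls back to the defaults shown in the install command, so in practice only `NEO4J_PASSWORD` typically needs changing for a local setup.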

How to install

Prerequisites:

  • Python 3.11 (the steps below install 3.11.12 via pyenv)
  • uv (used for the virtual environment and dependency installation)
  • A running Neo4j instance (bolt URL, username, and password)
  • Optional: a Qdrant instance for vector search

Installation steps:

  1. Clone the repository
git clone https://github.com/angrysky56/NeoCoder-neo4j-ai-workflow.git
cd NeoCoder-neo4j-ai-workflow
  2. Set up the Python environment
pyenv install 3.11.12  # if not already installed
pyenv local 3.11.12
uv venv
source .venv/bin/activate
  3. Install dependencies
uv pip install -e '.[dev,docs,gpu]'
  4. Start required services (examples):
  • Neo4j: ensure the bolt URL, username, and password are configured
  • Qdrant: optional but recommended for vector search
  5. Run the MCP server using the MCP runner (as configured in mcp_config)
# If using the example config above, start via uvx with the given directory and entrypoint
uvx --directory /path/to/NeoCoder-neo4j-ai-workflow run mcp_neocoder
  6. Optional: configure Claude Desktop integration by editing claude-app-config.json as shown in the Quick Start section of the README.
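For the Claude Desktop step, the project's actual claude-app-config.json may differ; the sketch below is an assumption following the common Claude Desktop `mcpServers` shape, reusing the server name `neocoder`, the uvx command, and the environment variables from the install command above:

```json
{
  "mcpServers": {
    "neocoder": {
      "command": "uvx",
      "args": ["--directory", "/path/to/NeoCoder-neo4j-ai-workflow", "run", "mcp_neocoder"],
      "env": {
        "NEO4J_URL": "bolt://localhost:7687",
        "NEO4J_USERNAME": "neo4j",
        "NEO4J_PASSWORD": "your-neo4j-password-here",
        "NEO4J_DATABASE": "neo4j",
        "LOG_LEVEL": "INFO",
        "PYTHONUNBUFFERED": "1"
      }
    }
  }
}
```

Check the Quick Start section of the README for the authoritative keys and file location before using this.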

Notes:

  • Ensure environment variables match your Neo4j instance credentials.
  • If using Qdrant, start it with appropriate persistence and network ports open.
  • The project expects a Python-based MCP module named mcp_neocoder to be discoverable under the specified directory.
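The first note above can be turned into a quick preflight check. This is a hypothetical helper, not part of the project, assuming the variable names from the install command:

```python
import os

# Variables the server's Neo4j connection depends on (per the install command).
REQUIRED = ["NEO4J_URL", "NEO4J_USERNAME", "NEO4J_PASSWORD", "NEO4J_DATABASE"]

def missing_neo4j_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# With only the URL set, the other three are reported as missing.
print(missing_neo4j_vars({"NEO4J_URL": "bolt://localhost:7687"}))
# ['NEO4J_USERNAME', 'NEO4J_PASSWORD', 'NEO4J_DATABASE']
```

Running such a check before launching the server turns a silent connection failure into an actionable error message.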

Additional notes

Tips and common considerations:

  • The server emphasizes provenance with citation-based analysis and full audit trails for knowledge synthesis and workflow execution.
  • Make sure to manage resource cleanup, especially Neo4j connections and background tasks, to avoid leaks.
  • If you run into connection issues, verify NEO4J_URL, NEO4J_USERNAME, NEO4J_PASSWORD, and NEO4J_DATABASE environment variables.
  • For production deployments, consider configuring proper signal handling (SIGTERM/SIGINT) and logging levels (LOG_LEVEL).
  • The setup references optional tools and integrations (e.g., WolframAlpha API for computation, arxiv-mcp-server for literature ingestion). Ensure API keys or access are configured if you plan to use those features.
  • The Lotka-Volterra ecosystem mention implies extended capabilities; ensure any dependent modules or services are properly installed and initialized if you enable those modes.
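The signal-handling and resource-cleanup tips above can be sketched as follows. This is an illustrative pattern, not the project's actual shutdown code; `make_shutdown_handler` and the cleanup callback are assumptions:

```python
import signal
import sys

def make_shutdown_handler(cleanup):
    """Build a signal handler that runs cleanup (e.g. closing the Neo4j
    driver and cancelling background tasks) before exiting."""
    def handler(signum, frame):
        cleanup()
        sys.exit(0)
    return handler

closed = []
handler = make_shutdown_handler(lambda: closed.append("neo4j driver closed"))

# Register the same graceful handler for SIGTERM and SIGINT.
signal.signal(signal.SIGTERM, handler)
signal.signal(signal.SIGINT, handler)
print(signal.getsignal(signal.SIGTERM) is handler)  # True
```

With this in place, a `kill` or Ctrl-C triggers the cleanup callback instead of leaving connections open, which addresses the leak concern noted above.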
