
Adaptive-Graph-of-Thoughts

LLM Reasoning Framework for Scientific Research

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio saptadey-adaptive-graph-of-thoughts-mcp-server python -m adaptive_graph_of_thoughts_mcp_server \
  --env NEO4J_URI="bolt://localhost:7687" \
  --env NEO4J_USER="neo4j" \
  --env NEO4J_PASSWORD="password"

How to use

Adaptive Graph of Thoughts (AGoT) is a high-performance MCP server that exposes an 8-stage reasoning pipeline built on top of a Neo4j knowledge graph. It implements the ASR-GoT framework, enabling iterative question decomposition, evidence integration from PubMed and other sources, pruning, subgraph extraction, synthesis, and reflection to produce a coherent, evidence-backed answer. The server is accessible via MCP-compatible JSON-RPC endpoints (e.g., /mcp) and can be integrated with clients like Claude Desktop or VS Code MCP extensions for interactive reasoning sessions.

To use the server, run it with the recommended Python command and configure the MCP client to point to the /mcp endpoint. The MCP workflow accepts a scientific question, runs through the eight stages (initialization, decomposition, hypothesis, evidence, pruning, subgraph extraction, synthesis, and reflection), and returns a reasoned answer along with the associated graph data. You can leverage the included endpoints for querying the graph, inspecting reasoning traces, and validating snapshots of the evidence graph as you interact with a client.
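As a sketch of what a client interaction might look like, the snippet below builds a JSON-RPC 2.0 payload of the kind that would be POSTed to the /mcp endpoint. The `tools/call` method is the standard MCP way to invoke a tool, but the tool name and argument keys shown here are illustrative assumptions, not the server's confirmed schema; check the repository for the exact contract.

```python
import json

# Hypothetical JSON-RPC 2.0 request for the /mcp endpoint. The tool name
# "scientific_reasoning_query" and the "question" argument are assumptions
# for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",  # standard MCP method for invoking a tool
    "params": {
        "name": "scientific_reasoning_query",  # assumed tool name
        "arguments": {
            "question": "What role does TP53 play in tumor suppression?",
        },
    },
}

# Serialize the payload as it would appear in the HTTP request body.
body = json.dumps(request)
print(body)
```

The response would carry the synthesized answer plus the associated graph data described above.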

Key capabilities include: dynamic confidence scoring and uncertainty quantification, real-time evidence integration from PubMed, Scholar, and Exa search, and an async FastAPI-based API layer that supports high-throughput requests. The system is designed to be cloud-ready with Docker and Kubernetes (Helm) support, allowing you to deploy scalable MCP-powered reasoning services in production environments.

How to install

Prerequisites:

  • Python 3.11+ installed on your system
  • Access to a running Neo4j instance (default bolt URL bolt://localhost:7687)
  • Node.js (optional, for MCP client usage) if you plan to test with Node-based tooling
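Before installing, it can help to confirm the Neo4j prerequisite is actually reachable. This is a minimal, hedged sketch: it parses the Bolt URI and does a best-effort TCP check on the port, which verifies network exposure but not credentials.

```python
import socket
from urllib.parse import urlparse

def bolt_endpoint(uri: str) -> tuple[str, int]:
    """Extract host and port from a Bolt URI such as bolt://localhost:7687."""
    parsed = urlparse(uri)
    return parsed.hostname or "localhost", parsed.port or 7687

def neo4j_reachable(uri: str, timeout: float = 2.0) -> bool:
    """Best-effort TCP check that the Bolt port is open (does not verify auth)."""
    host, port = bolt_endpoint(uri)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(bolt_endpoint("bolt://localhost:7687"))
```

If this check fails, see the connectivity tips under Additional notes below.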

Install steps:

  1. Create and activate a Python virtual environment (recommended):
python -m venv venv
source venv/bin/activate  # on macOS/Linux
venv\Scripts\activate     # on Windows
  2. Install requirements (adjust as needed if a requirements.txt exists in the repo):
pip install -r requirements.txt
  3. Ensure Neo4j is running and accessible with the configured credentials. Update the environment variables if you use a custom setup (see the mcp_config.env section).
  4. Start the MCP server using the recommended command:
python -m adaptive_graph_of_thoughts_mcp_server
  5. Verify the server is healthy via the /health endpoint (or by running a minimal MCP RPC call from your client).
  6. If you plan to run with Docker, build and deploy per the repository’s Dockerfile and Helm charts (see Docker + Kubernetes support in the README).

Additional notes

Environment and configuration tips:

  • The server uses a Neo4j graph database as its knowledge store. Ensure Neo4j is reachable and that the credentials in mcp_config.env match your setup.
  • The MCP endpoints typically include /mcp for JSON-RPC, /health for health checks, and /graph for graph exploration. Configure your MCP client to target the /mcp endpoint with proper authentication tokens if required by your deployment.
  • If you encounter connectivity issues to Neo4j, verify network access, Bolt port exposure, and authentication settings. Consider running Neo4j inside a container with appropriate data persistence.
  • For production deployments, enable TLS/HTTPS, set proper bearer-auth permissions in your MCP client, and monitor the 8-stage pipeline latency to tune performance.
  • You can customize the 8-stage pipeline behavior by adjusting parameters in the config YAML and environment variables referenced by the Core/Application layer.
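As one way to keep client code and mcp_config.env in sync, the helper below reads the same three Neo4j variables used in the install command, falling back to the defaults shown there. The helper itself is hypothetical, not part of the server's API; in production, override the defaults rather than relying on them.

```python
import os

# Hypothetical helper: reads the Neo4j settings referenced by mcp_config.env.
# The variable names match the install command; the fallback defaults are
# assumptions and should be overridden in any real deployment.
def neo4j_settings() -> dict:
    return {
        "uri": os.environ.get("NEO4J_URI", "bolt://localhost:7687"),
        "user": os.environ.get("NEO4J_USER", "neo4j"),
        "password": os.environ.get("NEO4J_PASSWORD", "password"),
    }

print(neo4j_settings()["uri"])
```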
