knowledge-graph-system

Kappa Graph — κ(G). A semantic knowledge graph where knowledge has weight. Extracts concepts, measures grounding strength, preserves disagreement, traces everything to source.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio aaronsb-knowledge-graph-system \
  --env KG_LOG_LEVEL="info" \
  --env KG_CONFIG_PATH="config/kg-config.yaml" \
  -- node server.js
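If you manage MCP servers by editing Claude Code's project config directly, the equivalent `.mcp.json` entry would look roughly like this (the server name and paths simply mirror the command above; adjust them to your checkout):

```json
{
  "mcpServers": {
    "aaronsb-knowledge-graph-system": {
      "command": "node",
      "args": ["server.js"],
      "env": {
        "KG_LOG_LEVEL": "info",
        "KG_CONFIG_PATH": "config/kg-config.yaml"
      }
    }
  }
}
```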

How to use

The Knowledge Graph System exposes a semantic knowledge graph with grounding scores, provenance tracing, and AI-assisted memory. The MCP server lets AI assistants (such as Claude or other LLM-based agents) query and explore the graph, and persist concepts for long-running conversations. You can ingest documents, search by meaning, explore connections between concepts, and gauge the confidence of results using grounding scores. The MCP integration enables stateful interactions in which the assistant stores and references graph concepts across sessions.

Once the server is running, you can interact with it through the provided CLI and API endpoints. Use the kg-cli tooling to ingest documents, run concept extraction, and query the graph. The server supports operations such as ingesting PDFs or text, performing semantic search, traversing relationships, and retrieving the source evidence tied to each concept. This makes it suitable for AI-assisted research, technical documentation, and agent-memory use cases where persistent, grounded context is required.
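MCP servers speak JSON-RPC 2.0 over their transport (stdio here). As a minimal sketch of what a client sends when invoking one of the server's tools, the snippet below builds a `tools/call` request; the tool name `search_concepts` and its arguments are assumptions for illustration, not the server's actual schema, so list the server's tools first:

```javascript
// Build the JSON-RPC 2.0 "tools/call" message an MCP client would write
// to the server's stdin. Tool name and arguments below are hypothetical.
function buildToolCall(id, name, args) {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  });
}

const request = buildToolCall(1, "search_concepts", {
  query: "grounded reasoning",
  limit: 5,
});
console.log(request);
```

In practice a client library (such as an MCP SDK) handles this framing for you; the sketch only shows the wire shape so that raw logs are easier to read.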

How to install

Prerequisites:

  • Node.js (LTS version) installed on your machine
  • npm (comes with Node.js)
  • Git for cloning the repository (optional if you already have a copy)

Install steps:

  1. Clone the repository (or download the code):
     git clone https://github.com/aaronsb/knowledge-graph-system.git
     cd knowledge-graph-system

  2. Install dependencies for the MCP server (and CLI tooling if needed):
     npm install

  3. Build or prepare the server if the project uses a build step:
     npm run build || true

  4. Start the MCP server locally (example):
     node server.js

  5. Verify the server is running by checking its logs or by hitting the MCP server's API endpoint (e.g., localhost:3000, or whatever port is set in your config file).

Optional: If you prefer using the CLI globally as described in the Quick Start, install it with:
  npm install -g @aaronsb/kg-cli

Note: If your deployment uses Docker or a PM2 process manager, adjust the start commands accordingly and ensure environment variables (KG_LOG_LEVEL, KG_CONFIG_PATH) are set as needed.

Additional notes

Tips and common issues:

  • Ensure Node.js version is compatible with the knowledge-graph-system codebase; upgrade/downgrade if you encounter dependency issues.
  • Set KG_CONFIG_PATH to point at your configuration file if you customize where the MCP server loads its settings.
  • If ingesting large documents, monitor memory usage and adjust graph_accel or database settings accordingly.
  • Grounding scores range from negative to positive values; use them to prioritize well-supported concepts over contested ones.
  • For AI-assisted memory, ensure the MCP server is reachable by your LLM agent and that authentication is configured if exposed to external clients.
  • When using the FUSE filesystem integration, ensure the host supports FUSE and that the user has appropriate permissions.
  • Check logs for any issues related to ingestion pipelines (FastAPI, extraction, or AGE graph store) to isolate where a failure occurs.
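As a sketch of how grounding scores might be used to prioritize well-supported concepts, the helper below sorts results so strongly grounded concepts come first and contested (negative-score) ones sink to the bottom; the field names are illustrative, not the server's actual response schema:

```javascript
// Rank concepts by grounding score, highest first. A copy is sorted so
// the input array is left untouched. Field names here are hypothetical.
function rankByGrounding(concepts) {
  return [...concepts].sort((a, b) => b.grounding - a.grounding);
}

const ranked = rankByGrounding([
  { name: "entropy", grounding: 0.8 },
  { name: "phlogiston", grounding: -0.4 },
  { name: "enthalpy", grounding: 0.3 },
]);
console.log(ranked.map((c) => c.name)); // → [ 'entropy', 'enthalpy', 'phlogiston' ]
```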
