madeinoz-knowledge-system
Knowledge pack for PAI
claude mcp add --transport stdio madeinoz67-madeinoz-knowledge-system -- docker run -i \
  --env LOG_LEVEL="info" \
  --env NEO4J_URL="bolt://localhost:7687" \
  --env NEO4J_USER="neo4j" \
  --env NEO4J_PASSWORD="your-neo4j-password" \
  --env FalkorDB_URL="http://localhost:3000" \
  --env GRAPH_BACKEND="neo4j" \
  --env MEMORY_RETENTION_POLICY="ACTIVE,DORMANT,ARCHIVED,EXPIRED" \
  ghcr.io/madeinoz67/madeinoz-knowledge-system:latest
How to use
This MCP server provides a persistent personal knowledge management system built on a Graphiti knowledge graph with a choice of backends (Neo4j or FalkorDB). It automatically extracts entities, discovers relationships, and enables semantic search across conversations and documents. With this MCP pack, you can capture knowledge through natural language prompts, perform semantic queries to locate related concepts, and run investigative searches that explore interconnected memories within a configurable number of hops. Temporal tracking lets you see how knowledge evolves over time, and memory decay scoring helps prioritize important information. The included server tooling supports management tasks such as start, stop, status, and logs, making it easy to operate within your AI infrastructure.
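The decay scoring and retention tiers mentioned above can be pictured with a small sketch. This is illustrative only: the pack's actual decay model is not documented here, and the thresholds below are assumptions; only the tier names come from the MEMORY_RETENTION_POLICY value shown in the install example.

```python
# Illustrative sketch of memory decay scoring mapped onto retention tiers.
# The half-life and tier thresholds are assumptions, not the pack's real model.

def decay_score(age_days: float, half_life_days: float = 30.0) -> float:
    """Exponential decay: the score halves every `half_life_days`."""
    return 0.5 ** (age_days / half_life_days)

def tier_for(score: float) -> str:
    """Map a decay score onto one of the retention tiers from
    MEMORY_RETENTION_POLICY (threshold values are hypothetical)."""
    if score >= 0.5:
        return "ACTIVE"
    if score >= 0.1:
        return "DORMANT"
    if score > 0.01:
        return "ARCHIVED"
    return "EXPIRED"

print(tier_for(decay_score(0)))    # a fresh memory -> ACTIVE
print(tier_for(decay_score(90)))   # ~3 months old -> DORMANT
```

Under this kind of scheme, memories that are never touched drift down through DORMANT and ARCHIVED until they hit EXPIRED and become candidates for cleanup.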
How to install
Prerequisites:
- Docker and Docker Compose installed on your host
- Basic familiarity with running containers
Installation steps:
- Pull and run the MCP server container (example):
# Ensure Docker is running
docker version
# Run the knowledge system container (configured for Neo4j as backend by default)
docker run -d --name madeinoz-knowledge-system \
-e GRAPH_BACKEND=neo4j \
-e NEO4J_URL=bolt://localhost:7687 \
-e NEO4J_USER=neo4j \
-e NEO4J_PASSWORD=your-neo4j-password \
ghcr.io/madeinoz67/madeinoz-knowledge-system:latest
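If you prefer FalkorDB as the backend, an equivalent run might look like the following. This is a sketch: the FalkorDB_URL variable name is taken from the listing above, and the GRAPH_BACKEND value "falkordb" is an assumption to verify against the server's documentation.

```shell
# Sketch: run with FalkorDB as the graph backend (assumes a FalkorDB
# instance is already reachable at the given URL; GRAPH_BACKEND value
# is assumed, not confirmed by the pack's docs)
docker run -d --name madeinoz-knowledge-system \
  -e GRAPH_BACKEND=falkordb \
  -e FalkorDB_URL=http://localhost:3000 \
  -e LOG_LEVEL=info \
  ghcr.io/madeinoz67/madeinoz-knowledge-system:latest
```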
- If you prefer Docker Compose, create a docker-compose.yml with equivalent environment variables and run:
version: '3.8'
services:
  knowledge-system:
    image: ghcr.io/madeinoz67/madeinoz-knowledge-system:latest
    environment:
      GRAPH_BACKEND: neo4j
      NEO4J_URL: bolt://neo4j:7687
      NEO4J_USER: neo4j
      NEO4J_PASSWORD: your-neo4j-password
      LOG_LEVEL: info
    ports:
      - "7474:7474" # Neo4j UI
      - "3000:3000" # FalkorDB UI
- Verify installation:
docker ps
curl -i http://localhost:7474/
curl -i http://localhost:3000/
- Configure environment specifics per your deployment (database credentials, endpoints, and backends) before production use.
Additional notes
Tips and common considerations:
- Backends: Neo4j is the recommended default for rich querying, while FalkorDB offers simpler setup and lower resource usage. Ensure the chosen backend is reachable from the MCP container.
- Environment variables: Securely manage credentials (prefer secrets management in production). Example vars include NEO4J_PASSWORD and GRAPH_BACKEND.
- Monitoring: Prometheus and Grafana dashboards are supported via the included architecture references; ensure ports are exposed if you want to visualize metrics.
- Networking: When running behind a NAT or in Kubernetes, adjust URLs (NEO4J_URL, FalkorDB endpoints) accordingly and consider using TLS.
- Data safety: Regular backups of the graph database are recommended; coordinate backup scripts with your backend choice.
- Troubleshooting: If the container fails to start, check container logs (docker logs <container>) and verify that the backend service (Neo4j/FalkorDB) is up and accessible.
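As a sketch of the data-safety point above: for a Neo4j backend, an offline dump can be taken with neo4j-admin. The container name and backup path here are illustrative, and the exact command syntax depends on your Neo4j version (this example uses the 5.x form).

```shell
# Neo4j 5.x: the database must be stopped (or the server offline)
# before taking an offline dump; adjust container name and path
# to match your deployment
docker exec neo4j neo4j-admin database dump neo4j --to-path=/backups
```

For FalkorDB, follow that project's own persistence and snapshot guidance instead.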
Related MCP Servers
buildwithclaude
A single hub to find Claude Skills, Agents, Commands, Hooks, Plugins, and Marketplace collections to extend Claude Code, Claude Desktop, Agent SDK and OpenClaw
skrills
Coordinate skills between Codex, Copilot, and Claude Code. Validates, analyzes, and syncs skills, subagents, commands, and configuration between multiple CLIs.
ai-software-architect
AI-powered architecture documentation framework with ADRs, reviews, and pragmatic mode. Now available as Claude Code Plugin for easiest installation.
omega-memory
Persistent memory for AI coding agents
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
claude-image-gen
AI-powered image generation using Google Gemini, integrated with Claude Code via Skills or Claude.ai via MCP (Model Context Protocol).