medulla
A free, open-source, git-native knowledge engine for software projects.
claude mcp add --transport stdio skeletor-js-medulla -- docker run -i \
  --env MEDULLA_HTTP_PORT=3000 \
  --env MEDULLA_LOG_LEVEL=info \
  skeletor-js/medulla
How to use
Medulla is a free, open-source, project-scoped knowledge engine that exposes your repository data to AI tooling via the Model Context Protocol (MCP). It runs locally in your project and can be accessed through MCP tools to create, read, update, and query entities like decisions, tasks, notes, prompts, and components. By default, Medulla serves over stdio for local AI assistants, with an optional HTTP mode for web UIs and remote clients. This makes it easy to plug AI copilots, chat assistants, or other MCP-enabled tools into your development workflow and keep project knowledge semantically searchable, up-to-date, and in sync across branches.
With Medulla you can perform MCP operations such as entity_create, entity_update, entity_delete, entity_get, entity_list, search_fulltext, search_semantic, search_query, and graph_relations, among others. It also auto-generates human-readable markdown snapshots of your knowledge base, which helps you review decisions, tasks, notes, and prompts in a GitHub-friendly format. You can access your data via MCP URIs like medulla://decisions, medulla://tasks/active, medulla://entity/{id}, and medulla://context/{topic}, enabling seamless integration with AI tooling and custom dashboards.
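The tool calls and resource URIs above travel as JSON-RPC messages over the MCP transport. The sketch below builds two such messages in Python; the method names (tools/call, resources/read) follow the MCP specification, but the search_semantic argument names ("query", "limit") are illustrative assumptions, not documented Medulla parameters.

```python
import json

# Hypothetical MCP requests: a semantic search tool call and a read of
# the medulla://tasks/active resource. Argument names are assumptions.
search_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_semantic",
        "arguments": {"query": "why did we pick SQLite?", "limit": 5},
    },
}

read_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "resources/read",
    "params": {"uri": "medulla://tasks/active"},
}

# Serialized as newline-delimited JSON, as an stdio MCP client would send them.
payload = "\n".join(json.dumps(m) for m in (search_request, read_request))
print(payload)
```

An MCP-aware assistant issues these calls for you; the shapes are shown only to make the tool and URI surface concrete.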
How to install
Prerequisites:
- Docker installed and running, or a compatible container runtime
- Optional: Git and a Rust toolchain if you plan to run Medulla from source
Using Docker (recommended for MCP integration):
- Pull and run the Medulla image: docker run -d --name medulla -p 3000:3000 skeletor-js/medulla
  (If you prefer stdio mode only, omit the port mapping and use the default stdio transport.)
- Verify the container starts and exposes MCP endpoints (HTTP mode): curl http://localhost:3000/health
- (Optional) Configure environment variables for tuning:
  - MEDULLA_HTTP_PORT: port for the HTTP API (default 3000)
  - MEDULLA_LOG_LEVEL: log level (info, debug, warn, error)
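The curl check above can also be scripted, for example as a readiness probe before wiring Medulla into other tooling. This is a minimal sketch assuming the /health endpoint from the step above simply returns HTTP 200 when the container is up.

```python
import urllib.error
import urllib.request


def health_url(port: int = 3000) -> str:
    """Build the health endpoint URL for an HTTP-mode Medulla container."""
    return f"http://localhost:{port}/health"


def check_health(port: int = 3000, timeout: float = 2.0) -> bool:
    """Return True if Medulla answers on /health, False otherwise."""
    try:
        with urllib.request.urlopen(health_url(port), timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


print(health_url())  # http://localhost:3000/health
```

If you changed MEDULLA_HTTP_PORT, pass the same port to check_health.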
From source (advanced):
- Ensure Rust is installed: https://www.rust-lang.org/tools/install
- Clone the repository and build:
  git clone https://github.com/skeletor-js/medulla.git
  cd medulla
  cargo build --release
  The binary will be at ./target/release/medulla
- Run the binary directly (stdio mode): ./target/release/medulla serve
- If you want HTTP, run with an HTTP option (adjust as needed): ./target/release/medulla serve --http 3000
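In stdio mode an MCP client spawns the binary and speaks JSON-RPC over its stdin/stdout, starting with an initialize handshake. The sketch below shows that flow, assuming the binary path and "serve" subcommand from the steps above; the protocol version string and client metadata are illustrative values, not Medulla requirements.

```python
import json
import subprocess


def initialize_message() -> str:
    """MCP initialize request (shape per the MCP spec; the protocol
    version string here is an assumption)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "readme-sketch", "version": "0.0.1"},
        },
    })


def handshake(binary: str = "./target/release/medulla") -> str:
    """Spawn the stdio server, send initialize, and return its first reply."""
    proc = subprocess.Popen(
        [binary, "serve"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    proc.stdin.write(initialize_message() + "\n")
    proc.stdin.flush()
    return proc.stdout.readline()
```

In practice your MCP client (e.g., claude mcp add) performs this handshake for you; the sketch is only to show what "stdio mode" means.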
Prerequisites recap:
- Docker or Rust toolchain installed
- Basic familiarity with MCP concepts (entities, relations, searches)
Additional notes
Tips and common issues:
- Ensure your project repository is initialized as a Medulla workspace (CRDT store, config.json, etc.) to enable full MCP functionality.
- Add Medulla's SQLite cache and embeddings cache to your repo's .gitignore to avoid bloating commits.
- If you upgrade Medulla, re-build or re-pull the image to get the latest MCP tooling and fixes.
- When using HTTP mode, consider adding a reverse proxy and TLS in front of Medulla for production deployments.
- Environment variables can tune logging, HTTP port, and feature flags; review the Medulla docs for a full list of options.
Related MCP Servers
grepai
Semantic Search & Call Graphs for AI Agents (100% Local)
OpenContext
A personal context store for AI agents and assistants—reuse your existing coding agent CLI (Codex/Claude/OpenCode) with built‑in Skills/tools and a desktop GUI to capture, search, and reuse project knowledge across agents and repos.
vsync
Sync MCP servers, Skills, Agents & Commands across Claude Code, Cursor, OpenCode, Codex. One config, all tools.
mcp-interactive-terminal
MCP server that gives AI agents (Claude Code, Cursor, Windsurf) real interactive terminal sessions — REPLs, SSH, databases, Docker, and any interactive CLI with clean output and smart completion detection
speclock
AI Constraint Engine — memory + enforcement for AI coding tools. The only solution that stops AI from breaking what you locked. Works with Bolt.new, Lovable, Claude Code, Cursor. Free & open source.
ai-control-framework
Stop shipping non-deployable AI code. Framework with DRS scoring, contract freezing, and 30-min mock timeout. Works with Claude Code, Cursor, Copilot.