
medulla

A free, open-source, git-native knowledge engine for software projects.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio skeletor-js-medulla -- docker run -i \
  --env MEDULLA_HTTP_PORT=3000 \
  --env MEDULLA_LOG_LEVEL=info \
  skeletor-js/medulla

Both environment variables are optional: MEDULLA_HTTP_PORT sets the HTTP port (default 3000, only used in HTTP mode) and MEDULLA_LOG_LEVEL sets the logging level (e.g., info, debug).

How to use

Medulla is a free, open-source, project-scoped knowledge engine that exposes your repository data to AI tooling via the Model Context Protocol (MCP). It runs locally in your project and can be accessed through MCP tools to create, read, update, and query entities like decisions, tasks, notes, prompts, and components. By default, Medulla serves over stdio for local AI assistants, with an optional HTTP mode for web UIs and remote clients. This makes it easy to plug AI copilots, chat assistants, or other MCP-enabled tools into your development workflow and keep project knowledge semantically searchable, up-to-date, and in sync across branches.

With Medulla you can perform MCP operations such as entity_create, entity_update, entity_delete, entity_get, entity_list, search_fulltext, search_semantic, search_query, and graph_relations, among others. It also auto-generates human-readable markdown snapshots of your knowledge base, which helps you review decisions, tasks, notes, and prompts in a GitHub-friendly format. You can access your data via MCP URIs like medulla://decisions, medulla://tasks/active, medulla://entity/{id}, and medulla://context/{topic}, enabling seamless integration with AI tooling and custom dashboards.
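As a sketch of what these operations look like on the wire: MCP speaks JSON-RPC 2.0, fetching resources with resources/read and invoking tools with tools/call. The URI and tool names below come from the list above; the argument fields are illustrative guesses, not a documented schema.

```python
import json

# Read an MCP resource by URI (URI taken from the docs above).
read_active_tasks = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "resources/read",
    "params": {"uri": "medulla://tasks/active"},
}

# Invoke an MCP tool; the "arguments" shape is an assumption.
semantic_search = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_semantic",
        "arguments": {"query": "why did we choose a CRDT store?"},
    },
}

# Over the stdio transport, each message is serialized as one line of JSON.
wire = json.dumps(semantic_search)
```

MCP-enabled clients (Claude Code, custom dashboards) build these messages for you; the point here is only that every listed operation maps onto one of these two JSON-RPC methods.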

How to install

Prerequisites:

  • Docker installed and running, or a compatible container runtime
  • Optional: Git and a Rust toolchain if you plan to run Medulla from source

Using Docker (recommended for MCP integration):

  1. Pull and run the Medulla image: docker run -d --name medulla -p 3000:3000 skeletor-js/medulla

    If you prefer stdio mode only, omit the port mapping and run with -i (interactive) so the MCP client can talk to the container over stdin/stdout.

  2. Verify the container starts and exposes MCP endpoints (HTTP mode): curl http://localhost:3000/health

  3. (Optional) Configure environment variables for tuning:

    • MEDULLA_HTTP_PORT: port for HTTP API (default 3000)
    • MEDULLA_LOG_LEVEL: log level (info, debug, warn, error)
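A minimal Python sketch of how a launcher or wrapper script might resolve these variables; the 3000 fallback is the documented default, while the "info" fallback for the log level is an assumption.

```python
import os

# Resolve Medulla's documented environment variables with fallbacks.
port = int(os.environ.get("MEDULLA_HTTP_PORT", "3000"))
log_level = os.environ.get("MEDULLA_LOG_LEVEL", "info")

# Handy for the health check in step 2 of the Docker instructions.
health_url = f"http://localhost:{port}/health"
```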

From source (advanced):

  1. Ensure Rust is installed: https://www.rust-lang.org/tools/install
  2. Clone the repository and build:

     git clone https://github.com/skeletor-js/medulla.git
     cd medulla
     cargo build --release

    The binary will be at ./target/release/medulla

  3. Run the binary directly (stdio mode): ./target/release/medulla serve
  4. If you want HTTP, run with an HTTP option (adjust as needed): ./target/release/medulla serve --http 3000
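Once built, an MCP client talks to the binary over stdin/stdout using newline-delimited JSON-RPC, starting with an initialize request. A hedged Python sketch of that handshake, assuming the binary path and serve subcommand from the steps above (the clientInfo values are made up):

```python
import json
import subprocess

# Per the MCP spec, the first message a client sends is "initialize".
INIT_REQUEST = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "medulla-probe", "version": "0.1.0"},
    },
}

def handshake(binary: str = "./target/release/medulla") -> dict:
    """Spawn Medulla in stdio mode and return its initialize response."""
    proc = subprocess.Popen(
        [binary, "serve"],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    try:
        proc.stdin.write(json.dumps(INIT_REQUEST) + "\n")
        proc.stdin.flush()
        return json.loads(proc.stdout.readline())
    finally:
        proc.terminate()
```

In practice your MCP client performs this handshake automatically; the sketch is only to show what "stdio mode" means at the protocol level.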

Prerequisites recap:

  • Docker or Rust toolchain installed
  • Basic familiarity with MCP concepts (entities, relations, searches)

Additional notes

Tips and common issues:

  • Ensure your project repository is initialized as a Medulla workspace (CRDT store, config.json, etc.) to enable full MCP functionality.
  • For best MCP performance, list the SQLite cache and embeddings cache in your repo’s .gitignore to avoid bloating commits.
  • If you upgrade Medulla, re-build or re-pull the image to get the latest MCP tooling and fixes.
  • When using HTTP mode, consider adding a reverse proxy and TLS in front of Medulla for production deployments.
  • Environment variables can tune logging, HTTP port, and feature flags; review the Medulla docs for a full list of options.
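For the cache tip above, a .gitignore stanza might look like the following; the file and directory names here are assumptions, so verify where your Medulla workspace actually writes its caches before copying this.

```
# Medulla local caches (names assumed; check your workspace layout)
medulla.sqlite
.medulla/embeddings/
```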
