mie

Persistent memory graph for AI agents. Facts, decisions, entities, and relationships that survive across sessions, tools, and providers. MCP server — works with Claude, Cursor, ChatGPT, and any MCP client.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
Run in terminal:
Command
claude mcp add --transport stdio kraklabs-mie mie --mcp

How to use

MIE (Memory Intelligence Engine) acts as a shared, persistent knowledge graph for your AI agents. It lets Claude, Cursor, ChatGPT, and other MCP clients read from and write to a single, evolving graph, so context, decisions, and relationships survive across sessions and tools. The server exposes its toolset over the MCP protocol: agents can store facts, decisions, entities, and events; perform semantic search and graph traversals; invalidate or update content; manage conflicts; and export or repair the graph as needed. You can query the graph to retrieve structured context before an agent responds, and import knowledge from repositories or files using batch operations. Tools are invoked per MCP client through the standard JSON-RPC interface over stdio, so MIE integrates with your existing agent workflows.
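Under the hood, each tool invocation is an ordinary JSON-RPC 2.0 `tools/call` request written to the server's stdin, one JSON message per line. A minimal sketch of how a client might frame such a call — the `mie_store` argument shape below is a hypothetical illustration; consult the server's `tools/list` response for the real schema:

```python
import json

def mcp_tool_call(request_id, tool, arguments):
    """Build a JSON-RPC 2.0 tools/call request as sent by MCP clients."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical arguments -- the real mie_store schema may differ.
req = mcp_tool_call(1, "mie_store", {
    "kind": "decision",
    "content": "Use PostgreSQL for the billing service",
})

# The stdio transport sends one JSON message per line.
line = json.dumps(req)
```

The server's reply arrives the same way: one JSON-RPC response object per line on stdout, matched to the request by `id`.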

How to install

Prerequisites:

  • macOS or Linux environment
  • Homebrew installed (for macOS users) or an equivalent package manager
  • Basic familiarity with command line operations

Install steps:

  1. Tap the MIE Homebrew repository and install the CLI:

    • brew tap kraklabs/mie
    • brew install mie
  2. Initialize MIE (creates default config and local store):

    • mie init
    • You can also run an interactive setup with: mie init --interview
  3. Run the MIE MCP server (example):

    • mie --mcp
  4. Connect MCP clients (examples):

    • Claude / Cursor / etc. can reference the server via an .mcp.json (or .cursor/mcp.json) file:

        {
          "mcpServers": {
            "mie": {
              "command": "mie",
              "args": ["--mcp"]
            }
          }
        }
  5. Optional: manage the daemon (if using a shared daemon workflow)

    • mie daemon
    • Ensure the daemon has the proper permissions and the graph database is accessible.

Note: If you’re not on macOS or prefer other installation methods, you can typically build from source or use the prebuilt binaries from the project’s releases.
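The client wiring in step 4 can also be generated programmatically. A small sketch that writes the same `.mcp.json` shown above — the output path is an assumption; point it at your project root or wherever your client expects the file:

```python
import json
import pathlib

# Hypothetical location; adjust to your project root or client config dir.
config_path = pathlib.Path(".mcp.json")

config = {
    "mcpServers": {
        "mie": {
            "command": "mie",   # the installed CLI
            "args": ["--mcp"],  # run it as an MCP stdio server
        }
    }
}

config_path.write_text(json.dumps(config, indent=2) + "\n")
```

Any MCP client that reads this file will then spawn `mie --mcp` itself and talk to it over stdio.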

Additional notes

Tips and considerations:

  • The server exposes 12 MCP tools, including mie_store, mie_query, mie_bulk_store, mie_update, mie_export, mie_repair, and mie_status. Use mie_status to monitor graph health and usage metrics.
  • MIE uses an agent-as-evaluator pattern: there is no server-side inference cost for storing; your agent decides what to persist.
  • For import workflows, you can leverage mie_bulk_store to batch-import knowledge from files or git histories.
  • If running multiple MCP clients, the daemon ensures an exclusive DB lock and multiplexes access. Use mie daemon to manage the lifecycle in multi-client setups.
  • When configuring clients, point them to the MCP interface exposed by the mie server (JSON-RPC over stdio), as shown in the example .mcp.json and .cursor/mcp.json files.
  • If you encounter issues with data consistency, use mie_conflicts to detect contradictions and mie_repair to rebuild indices and clean embeddings.
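A batch import via mie_bulk_store boils down to a single `tools/call` request carrying a list of items. A sketch, assuming a hypothetical item shape — check the tool's advertised schema (via the MCP `tools/list` response) for the real field names:

```python
import json

# Hypothetical item shape; the real mie_bulk_store schema may differ.
facts = [
    {"kind": "fact", "content": "API rate limit is 100 req/min"},
    {"kind": "entity", "content": "billing-service", "tags": ["backend"]},
]

request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "mie_bulk_store", "arguments": {"items": facts}},
}

# One JSON message per line over the stdio transport.
payload = json.dumps(request)
```

Batching like this keeps imports of large files or git histories to a handful of round trips instead of one request per fact.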
