
memorix

Cross-Agent Memory Bridge: persistent memory for AI coding agents across Cursor, Windsurf, Claude Code, Codex, Copilot, and Kiro via MCP. Never re-explain your project again.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

    claude mcp add --transport stdio avids2-memorix memorix serve

How to use

Memorix provides a persistent memory layer for AI coding agents. It stores decisions, observations, and context across sessions, enabling agents to recall past work without re-explaining or re-creating prior context. The server exposes a suite of MCP tools under the memorix namespace:

  • Memory queries: memorix_store, memorix_search, memorix_detail, memorix_timeline, memorix_resolve, memorix_deduplicate, memorix_suggest_topic_key
  • Session handling: memorix_session_start, memorix_session_end, memorix_session_context
  • Knowledge graph (compatible with the MCP Official Memory Server): create_entities, create_relations, add_observations, delete_entities, delete_observations, delete_relations, search_nodes, open_nodes, read_graph
  • Workspace synchronization: memorix_workspace_sync, memorix_rules_sync, memorix_skills
  • Maintenance: memorix_retention, memorix_consolidate, memorix_export, memorix_import
  • Web UI dashboard: memorix_dashboard
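Over MCP, every tool in the list above is invoked through the protocol's standard tools/call JSON-RPC request. The sketch below shows what a client might send to memorix_search; the query argument name is an assumption for illustration, as the actual parameter names are defined by the server's published tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memorix_search",
    "arguments": { "query": "auth token refresh decision" }
  }
}
```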

To use Memorix, install and run the server, then point your MCP clients at memorix serve: in each client's MCP config, register a memorix server with the command memorix and the argument serve. Hybrid search is supported: BM25 keyword search is enabled by default, and you can optionally enable embedding-based search via MEMORIX_EMBEDDING, with the options api (OpenAI-compatible embedding API), fastembed (local ONNX), or transformers (local JS/WASM). When embeddings are enabled, you can customize the credentials, endpoint, and vector dimensions with MEMORIX_EMBEDDING_API_KEY, MEMORIX_EMBEDDING_BASE_URL, and MEMORIX_EMBEDDING_DIMENSIONS, depending on your provider.
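For clients that read a JSON MCP configuration file, the entry would look roughly like this. This is a sketch: the file name and location vary by client, many clients use an mcpServers map as shown, and the env values are placeholders for the variables described above:

```json
{
  "mcpServers": {
    "memorix": {
      "command": "memorix",
      "args": ["serve"],
      "env": {
        "MEMORIX_EMBEDDING": "api",
        "MEMORIX_EMBEDDING_API_KEY": "your-key",
        "MEMORIX_EMBEDDING_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```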

Memorix also supports auto-memory hooks that capture decisions, errors, and gotchas as you work, with language support in English and Chinese. This enables seamless memory growth across sessions and tools, reducing repetitive context switching and improving agent performance over time.

How to install

Prerequisites:

  • Node.js and npm installed on your system (Memorix is published as an npm package).
  • A supported MCP environment or agent setup to consume MCP servers.

Step-by-step installation:

  1. Install Memorix globally via npm:

    npm install -g memorix

  2. Verify installation:

    memorix --version

  3. Start the Memorix server (default development mode):

    memorix serve

  4. Optional: configure environment for embeddings (if you plan to use embedding-based search):

    • Set embedding mode, e.g. MEMORIX_EMBEDDING=api
    • Provide API keys and endpoints as needed, e.g. MEMORIX_EMBEDDING_API_KEY=your-key MEMORIX_EMBEDDING_BASE_URL=https://api.openai.com/v1
  5. Connect your MCP clients by updating their mcp_config to point to the memorix server with the command memorix and args ["serve"].

  6. (Optional) Follow the project-specific full setup guide for additional tuning and troubleshooting.
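The environment setup from step 4 can be sketched as a shell snippet. All values are placeholders, and this is only needed if you enable embedding-based search:

```shell
# Select the embedding backend: api, fastembed, or transformers
export MEMORIX_EMBEDDING=api

# Credentials and endpoint for the api backend (placeholder values)
export MEMORIX_EMBEDDING_API_KEY=your-key
export MEMORIX_EMBEDDING_BASE_URL=https://api.openai.com/v1

# Then (re)start the server so it picks up the new settings:
# memorix serve
```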

Additional notes

Tips and considerations:

  • Do NOT use npx to run Memorix on every invocation; install globally and use memorix serve to avoid re-downloading on each start.
  • If you plan to use embedding features, choose MEMORIX_EMBEDDING=api for OpenAI-compatible APIs or one of the local options (fastembed, transformers) based on your resource availability.
  • When deploying in multi-agent environments, ensure memory and workspace synchronization tools are configured consistently across agents to maintain coherent context.
  • Review MEMORIX_EMBEDDING_API_KEY and MEMORIX_EMBEDDING_BASE_URL in cloud-based setups; secure handling of keys is essential.
  • The Memorix dashboard (memorix_dashboard) provides a visual knowledge graph and observation browser for easier debugging and memory management.
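The key-handling tip above can be sketched as follows. The key file path is a hypothetical convention, not something Memorix prescribes; the point is to keep the secret out of shell history and out of dotfiles that end up in version control:

```shell
# Hypothetical key file location; keep it mode 600 and out of version control
KEY_FILE="$HOME/.config/memorix/api_key"
mkdir -p "$(dirname "$KEY_FILE")"
printf 'your-key\n' > "$KEY_FILE"   # placeholder value for illustration
chmod 600 "$KEY_FILE"

# Load the key into the environment only for the server process
export MEMORIX_EMBEDDING_API_KEY="$(cat "$KEY_FILE")"
```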
