
agentic-memory

Persistent cognitive graph memory for AI agents — facts, decisions, reasoning chains, corrections. 16 query types, sub-millisecond. Rust core + Python SDK + MCP server.

Installation
Run these commands in your terminal to install the server binary and then register the MCP server with Claude Code. (The original listing passed `cargo install agentic-memory` as the server command, which only installs the crate; the `amem` invocation after `--` below is the assumed server entrypoint — confirm the exact invocation with `amem --help`.)
Run in terminal:
Command
cargo install agentic-memory
claude mcp add --transport stdio agentralabs-agentic-memory -- amem

How to use

The AgenticMemory MCP server exposes a persistent, graph-based memory store designed for AI agents. When running, it provides tools for working with an immortal memory graph: adding facts and decisions, traversing decision chains, and running advanced queries to retrieve context, revisions, and timelines. The tooling includes a CLI (named amem) for pushing and updating memory elements, running quality checks, and synchronizing memory across compatible clients. Typical use cases are maintaining long-term agent context, auditing decisions, and tracing how beliefs evolve across sessions. The server works with multiple clients (Claude, Cursor, Windsurf, Cody) and supports multi-index querying to assemble precise context quickly. To start, install the server binary and use the CLI to interact with your agent’s memory file (a single .amem file).
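As background, MCP clients talk to a stdio server like this one by exchanging newline-delimited JSON-RPC 2.0 messages. The sketch below builds an `initialize` handshake and a tool call; the tool name `add_fact` and its arguments are assumptions for illustration only — list the server's real tools with a `tools/list` request.

```python
import json

def jsonrpc(method: str, params: dict, msg_id: int) -> str:
    """Serialize a JSON-RPC 2.0 request as one newline-terminated line,
    the framing MCP uses over the stdio transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    }) + "\n"

# Handshake any MCP client sends after spawning the server process
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
}, msg_id=1)

# Hypothetical tool call -- `add_fact` and its arguments are illustrative
call = jsonrpc("tools/call", {
    "name": "add_fact",
    "arguments": {"subject": "deploy", "fact": "v2 rollout approved"},
}, msg_id=2)
```

Clients such as Claude Code handle this framing for you; it is shown here only to make the stdio transport concrete.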

How to install

Prerequisites:

  • Rust toolchain with Cargo installed (needed to install the agentic-memory server).
  • Internet access to fetch crates.

Step-by-step installation:

  1. Ensure Rust and Cargo are installed. On most systems:
    • macOS: brew install rust
    • Debian/Ubuntu: sudo apt-get update; sudo apt-get install -y build-essential curl; curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    • Windows: install Rust via rustup-init from https://rustup.rs
  2. Install the MCP server binary via Cargo:
    cargo install agentic-memory
    
  3. Verify installation (example invocation may vary by version):
    amem --help
    
  4. Prepare the memory file if needed and run the server according to the binary’s built-in guidance (see `amem --help`).

Note: If you prefer Python-based tooling or other entrypoints, refer to the project’s specific installation notes for alternative install paths or wrappers.
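After step 2, the binary should land on your Cargo bin path (usually `~/.cargo/bin`). A quick sanity check, sketched in Python — this only resolves the binary on PATH, assuming it installs under the name `amem`:

```python
import shutil

def find_amem():
    """Return the resolved path of the `amem` binary, or None if it is
    not on PATH (e.g. ~/.cargo/bin is missing from PATH)."""
    return shutil.which("amem")

path = find_amem()
if path is None:
    print("amem not found; add ~/.cargo/bin to your PATH")
else:
    print(f"amem installed at {path}")
```

The equivalent one-liner in a shell is simply `which amem`.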

Additional notes

Tips and considerations:

  • The memory store is a single binary file (.amem). Back it up regularly; all agent state lives in this one file.
  • If integrating with multiple LLM clients, check each client’s MCP and JSON-RPC validation settings so that malformed messages fail loudly rather than silently falling back.
  • Environment variables may control the memory file path, log level, and client synchronization options. If you run into sync issues, first confirm that the memory file path is writable by every client.
  • When upgrading, confirm the memory file is compatible with the new binary version before pointing it at real data, to avoid migration pitfalls.
  • For debugging, run the CLI with verbose logging (e.g., amem --verbose; check amem --help for the exact flag) to trace actions on the memory graph.
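Since all state lives in one .amem file, a timestamped copy before each upgrade is a simple, adequate backup routine. A minimal sketch (the `agent.amem` filename is illustrative):

```python
import shutil
import time
from pathlib import Path

def backup_memory(mem_path: str) -> Path:
    """Copy the .amem file next to itself with a timestamp suffix,
    preserving metadata, and return the backup path."""
    src = Path(mem_path)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dst = src.with_name(f"{src.name}.{stamp}.bak")
    shutil.copy2(src, dst)
    return dst

# Example: backup = backup_memory("agent.amem")
```

Because the file is opaque binary, copy it while the server is stopped (or quiescent) to avoid capturing a mid-write state.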
