agentic-memory
Persistent cognitive graph memory for AI agents — facts, decisions, reasoning chains, corrections. 16 query types, sub-millisecond. Rust core + Python SDK + MCP server.
cargo install agentic-memory
claude mcp add --transport stdio agentralabs-agentic-memory
How to use
AgenticMemory MCP server exposes a persistent, graph-based memory store designed for AI agents. When run, it provides tools to interact with an immortal memory graph, enabling operations such as adding facts and decisions, traversing decision chains, and performing advanced queries to retrieve context, revisions, and timelines. The MCP tooling includes a set of commands (for example, a CLI named amem) that let you: push and update memory elements, run quality checks, and synchronize memory across compatible clients. Use cases include maintaining long-term agent context, auditing decisions, and tracing the evolution of beliefs across sessions. This server is designed to work with multiple clients (Claude, Cursor, Windsurf, Cody) and supports multi-index querying to assemble precise context quickly. To start, install the server binary and run the provided CLI to interact with your agent’s memory file (a single .amem file).
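Many MCP clients are configured through a JSON file that lists stdio servers. A minimal registration entry might look like the sketch below; the amem command name comes from the project's own description, but the exact arguments are an assumption, so follow the binary's guidance for the actual entrypoint:

```json
{
  "mcpServers": {
    "agentic-memory": {
      "command": "amem",
      "args": []
    }
  }
}
```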
How to install
Prerequisites:
- Rust toolchain with Cargo installed (needed to install the agentic-memory server).
- Internet access to fetch crates.
Step-by-step installation:
- Ensure Rust and Cargo are installed. On most systems:
  - macOS: brew install rust (or install via rustup)
  - Debian/Ubuntu: sudo apt-get update; sudo apt-get install -y build-essential curl; curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  - Windows: install Rust via rustup-init from https://rustup.rs
- Install the MCP server binary via Cargo: cargo install agentic-memory
- Verify the installation (the exact invocation may vary by version): amem --help
- Prepare the memory file if needed and run the server tool in your environment as directed by the binary's guidance.
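The steps above can be consolidated into a single sketch. This script defaults to a dry run that only prints each command, since installing a Rust toolchain and fetching crates are environment-specific; set DRY_RUN=0 to execute for real. The amem invocation is taken from this page and may vary by version:

```shell
#!/bin/sh
# Dry-run sketch of the install flow described above.
# DRY_RUN=1 (the default) prints each command instead of running it.
set -eu
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    printf '+ %s\n' "$*"
  else
    sh -c "$*"
  fi
}

# 1. Install the Rust toolchain via rustup (skip if cargo is already on PATH).
run "curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y"
# 2. Install the MCP server binary from crates.io.
run "cargo install agentic-memory"
# 3. Verify the CLI is reachable (invocation may vary by version).
run "amem --help"
```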
Note: If you prefer Python-based tooling or other entrypoints, refer to the project’s specific installation notes for alternative install paths or wrappers.
Additional notes
Tips and considerations:
- The memory file format is a single binary (.amem). Ensure proper backups of this file for durability.
- If integrating with multiple LLM clients, check the MCP hardening and JSON-RPC validation settings to avoid silent fallback issues.
- Common environment variables may include paths to your memory file, log levels, and client synchronization options. If you run into sync issues, verify that the Ghost Writer integration is enabled and that client paths are writable.
- When upgrading, re-check the compatibility of the memory file with the new binary version to avoid migration pitfalls.
- For debugging, run the CLI with verbose logging (e.g., amem --verbose) to trace actions on the immortal memory graph.
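As a concrete illustration of the backup tip above, here is a minimal shell sketch. The file and directory names are placeholders, not paths the project defines; the script also creates an empty placeholder .amem file when none exists, purely so the demo runs end to end:

```shell
#!/bin/sh
# Hypothetical backup sketch for a single-file .amem memory store.
# MEMORY_FILE and BACKUP_DIR are placeholders -- adjust for your setup.
set -eu
MEMORY_FILE="${MEMORY_FILE:-agent.amem}"
BACKUP_DIR="${BACKUP_DIR:-amem-backups}"

# Demo placeholder only: create an empty memory file if none exists yet.
[ -f "$MEMORY_FILE" ] || : > "$MEMORY_FILE"

mkdir -p "$BACKUP_DIR"
STAMP="$(date +%Y%m%d-%H%M%S)"
cp "$MEMORY_FILE" "$BACKUP_DIR/$(basename "$MEMORY_FILE").$STAMP.bak"
echo "Backed up $MEMORY_FILE to $BACKUP_DIR"
```

A timestamped copy keeps earlier snapshots around, which matters for an append-style memory file where a corrupted write would otherwise overwrite your only backup.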
Related MCP Servers
awesome-claude-skills
A curated list of awesome Claude Skills, resources, and tools for customizing Claude AI workflows
cursor-talk-to-figma
TalkToFigma: MCP integration between AI Agent (Cursor, Claude Code) and Figma, allowing Agentic AI to communicate with Figma for reading designs and modifying them programmatically.
mcp-memory-service
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.
shodh-memory
Cognitive memory for AI agents — learns from use, forgets what's irrelevant, strengthens what matters. Single binary, fully offline.
cortex-scout
An advanced web extraction and meta-search engine for AI agents. It features native parallel searching, Human-in-the-Loop (HITL) authentication fallback, and LLM-optimized data synthesis for deep web research.
last9
Last9 MCP Server