mcp-memory-service
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.
claude mcp add --transport stdio doobidoo-mcp-memory-service --env MCP_ALLOW_ANONYMOUS_ACCESS=true -- memory server
How to use
mcp-memory-service provides a persistent, shared memory backend for AI agent pipelines. It exposes a framework-agnostic REST API (about 15 endpoints) that lets agents store memories, query context, and share a typed knowledge graph of causal relationships. Memories are tagged with an agent identity via the X-Agent-ID header, and a conversation_id can optionally bypass deduplication for incremental storage. The server runs locally by default (http://localhost:8000) and pushes real-time updates over Server-Sent Events (SSE) whenever memories are stored or deleted. Embeddings are computed locally via ONNX, so memory contents never leave your infrastructure. This makes it a drop-in, self-hosted alternative to cloud memory stores, with zero per-call API costs and full control over data retention and privacy.
To use it, install the package, start the server, and interact with the REST API from any HTTP client. A typical workflow is to store a memory under an agent ID, then search that agent's memories by query and tags. Because the API is framework-agnostic, you can integrate from LangGraph, CrewAI, AutoGen, Claude Desktop, or any HTTP client without a dedicated MCP client library.
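The store-then-search workflow can be sketched as follows. Note that the endpoint paths (`/api/memories`, `/api/search`) and payload field names are assumptions for illustration; check the project's API reference for the exact routes.

```python
import json

BASE = "http://localhost:8000"  # default local server address

def store_request(agent_id, content, tags, conversation_id=None):
    """Build URL, headers, and body for storing one memory.

    The X-Agent-ID header scopes the memory to a single agent; passing a
    conversation_id (field name assumed here) bypasses deduplication so
    incremental turns of a conversation are all kept.
    """
    headers = {"X-Agent-ID": agent_id, "Content-Type": "application/json"}
    payload = {"content": content, "tags": list(tags)}
    if conversation_id is not None:
        payload["conversation_id"] = conversation_id
    return f"{BASE}/api/memories", headers, json.dumps(payload)

def search_request(agent_id, query, tags=()):
    """Build a scoped search: same agent header, query plus optional tag filter."""
    headers = {"X-Agent-ID": agent_id}
    payload = {"query": query, "tags": list(tags)}
    return f"{BASE}/api/search", headers, json.dumps(payload)

url, headers, body = store_request(
    "planner", "numpy 2.0 upgrade broke the CI build", ["build", "numpy"]
)
# Send with any HTTP client, e.g. requests.post(url, headers=headers, data=body)
```

The same header/payload shapes work unchanged from LangGraph, CrewAI, or AutoGen nodes, since each is just an HTTP call.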
Key capabilities to leverage:
- REST API endpoints for storing, retrieving, and querying memories across agents and runs
- Knowledge graph support with typed edges (causes, fixes, contradicts) to capture relationships between memories
- X-Agent-ID header for per-agent scoping of memories
- conversation_id to disable deduplication for incremental conversations
- SSE notifications for real-time memory activity
- Local embeddings via ONNX for privacy and performance
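On the client side, the SSE notifications can be consumed with a small parser like the sketch below. It assumes the server emits standard `event:`/`data:` frames; the event name `memory_stored` and the payload shape are illustrative, not confirmed by the source.

```python
import json

def parse_sse_event(raw):
    """Parse one Server-Sent Events frame into (event_name, data).

    SSE frames are 'field: value' lines terminated by a blank line; this
    minimal parser handles only the 'event' and 'data' fields and decodes
    the data as JSON.
    """
    event, data_lines = "message", []
    for line in raw.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    data = json.loads("\n".join(data_lines)) if data_lines else None
    return event, data

# Example frame a client might receive when a memory is stored
# (event name and payload fields are assumptions):
frame = 'event: memory_stored\ndata: {"id": "m-123", "tags": ["build"]}\n\n'
event, data = parse_sse_event(frame)
```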
How to install
Prerequisites
- Python 3.8+ installed on your system
- pip available in your PATH
- Basic knowledge of running Python packages from source or PyPI
Option A: Install from PyPI and run locally
- Install the package
pip install mcp-memory-service
- Start the memory server
MCP_ALLOW_ANONYMOUS_ACCESS=true memory server
The REST API will be available at http://localhost:8000 by default.
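Before wiring up agents, you can probe the server from Python. This is a minimal sketch using only the standard library; the root path used here is an assumption, and any endpoint you know responds will do.

```python
import urllib.error
import urllib.request

def server_is_up(base="http://localhost:8000", timeout=2.0):
    """Return True if an HTTP server answers at the given base URL."""
    try:
        with urllib.request.urlopen(base + "/", timeout=timeout) as resp:
            return resp.status < 500
    except (urllib.error.URLError, OSError):
        return False
```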
Option B: Install from source (for development or custom backends)
- Clone the repository
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service
- Install dependencies
pip install -e .
- Run the server locally
MCP_ALLOW_ANONYMOUS_ACCESS=true memory server
Notes
- If you want to customize storage backends (SQLite, Cloudflare, Hybrid), follow the Advanced guidance in the README to install the appropriate components.
- Ensure port 8000 is accessible or configure a different port if needed (via server arguments or environment settings in your deployment).
Additional notes
Tips and common issues:
- Environment variable MCP_ALLOW_ANONYMOUS_ACCESS controls whether the API can be accessed without authentication. Enable it only in trusted environments.
- Memories are tagged with X-Agent-ID; include this header to scope storage and retrieval to a specific agent.
- If you run behind a proxy, ensure that CORS and SSE endpoints are properly configured for client access.
- The knowledge graph is typed (e.g., causes, fixes, contradicts); use the API to create and traverse relationships to support reasoning over memories.
- For production deployments, consider the Advanced: Custom Backends & Team Setup path to enable SQLite for local single-user use, Cloudflare for cloud sync, or Hybrid for best of both.
- If you need to override the default host/port, check your deployment environment or pass appropriate command-line flags in place of or in addition to MCP_ALLOW_ANONYMOUS_ACCESS.
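A typed edge between two stored memories can be expressed as a small payload like the sketch below. The relation names come from the source (causes, fixes, contradicts), but the `/api/graph/edges` route and field names are hypothetical, for illustration only.

```python
import json

VALID_RELATIONS = {"causes", "fixes", "contradicts"}  # typed edges named above

def link_request(source_id, target_id, relation, base="http://localhost:8000"):
    """Build URL and body for a typed edge between two stored memories.

    Rejects relation types outside the documented set so that malformed
    edges fail fast on the client side.
    """
    if relation not in VALID_RELATIONS:
        raise ValueError(f"unknown relation type: {relation!r}")
    payload = {"source": source_id, "target": target_id, "relation": relation}
    return f"{base}/api/graph/edges", json.dumps(payload)

# e.g. record that one memory describes the fix for another:
url, body = link_request("m-err-42", "m-fix-57", "fixes")
```

Traversing these edges at query time is what lets agents reason over chains such as "this error was caused by that change, which this memory fixes."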
Related MCP Servers
EverMemOS
Long-term memory OS for your agents across LLMs and platforms.
evo-ai
Evo AI is an open-source platform for creating and managing AI agents, enabling integration with different AI models and services.
nocturne_memory
A lightweight, rollback-capable, visual MCP memory store for AI, based on URIs rather than RAG. Gives your AI persistent, structured memory across models, sessions, and tools.
oreilly-ai-agents
An introduction to the world of AI Agents
eion
Shared Memory Storage for Multi-Agent Systems
rtfmbro
rtfmbro provides always-up-to-date, version-specific package documentation as context for coding agents. An alternative to context7