mcp-local-memory
A lightweight, powerful local memory server for AI agents supporting text, entities, and relations. Enables persistent codebase understanding and user preference management.
claude mcp add --transport stdio beledarian-mcp-local-memory --env ARCHIVIST_STRATEGY=nlp -- npx -y @beledarian/mcp-local-memory
How to use
The mcp-local-memory server provides a private, local, zero-docker long-term memory store for AI agents. It combines semantic (vector) search with keyword-based full-text search (FTS5) and a knowledge graph to connect memories, entities, and relations. Its cross-agent capability lets multiple MCP-enabled tools—such as Claude Desktop integrations, IDE extensions, or custom CLIs—share a unified memory pool on your machine. This enables consistent facts and recall across tools, with a persistent knowledge base that matures as you interact with it. Typical workflows include storing memories with remember_fact or remember_facts, querying with recall to retrieve context, and traversing the knowledge graph to explore entity relationships and friends-of-friends through Deep Graph queries.
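As a rough sketch of what a client exchange looks like, the MCP `tools/call` request below invokes the `recall` tool. The tool names come from this document; the argument name (`query`) is an assumption and may differ from the server's actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "recall",
    "arguments": { "query": "user's preferred code style" }
  }
}
```

A `remember_fact` call follows the same shape, with the fact text passed in `arguments`.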
How to install
Prerequisites:
- Node.js v18 or higher
- Build tools required by native dependencies (Python and C++ toolchain for better-sqlite3)
Installation steps:
- Install via NPX (recommended). Use NPX to run the server without global installation:
  - npx -y @beledarian/mcp-local-memory
  - Then configure your MCP client to point at the NPX invocation (see mcp_config below).
- Optional global installation to provide a memory command:
  - npm install -g @beledarian/mcp-local-memory
  - memory --help
- If you want to contribute or run from source:
  - git clone https://github.com/Beledarian/mcp-local-memory.git
  - cd mcp-local-memory
  - npm install
  - npm run build
  - npm start
Notes:
- Windows users may need C++ build tools. If you encounter gyp errors, install Windows Build Tools or enable Desktop development with C++ in Visual Studio.
- Ensure your environment variables (like ARCHIVIST_STRATEGY) are set according to your desired setup before starting the server.
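A minimal client configuration (mcp_config) might look like the snippet below, following the standard MCP client config shape used by Claude Desktop and similar tools. The server key name and env value are illustrative:

```json
{
  "mcpServers": {
    "mcp-local-memory": {
      "command": "npx",
      "args": ["-y", "@beledarian/mcp-local-memory"],
      "env": { "ARCHIVIST_STRATEGY": "nlp" }
    }
  }
}
```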
Additional notes
Tips and notes:
- ARCHIVIST_STRATEGY controls automatic entity extraction behavior and can be a comma-separated list (e.g., nlp,llm). Use nlp for offline extraction and llm for AI-powered extraction when Ollama is available.
- MEMORY_DB_PATH sets the SQLite DB location; override it if you want a custom path.
- CONTEXT_WINDOW_LIMIT, CONTEXT_MAX_ENTITIES, and CONTEXT_MAX_MEMORIES tune how much context is returned in recall contexts.
- ENABLE_CONSOLIDATE_TOOL toggles the consolidated context tool useful for retrospective memory extraction.
- For fully offline operation, ensure local embedding and NLP support is available (embeddings run via transformers.js ONNX models; entity extraction uses the bundled NLP tooling).
- The server is designed to be privacy-first and lightweight (RAM usage typically in the tens to hundreds of MB).
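The tips above can be combined into an environment setup before launching the server. The variable names come from this document; the values (and the DB path) are illustrative, not defaults:

```shell
# Combined NLP + LLM extraction (LLM path used when Ollama is available)
export ARCHIVIST_STRATEGY="nlp,llm"
# Hypothetical custom SQLite location
export MEMORY_DB_PATH="$HOME/.mcp-local-memory/memory.db"
# Illustrative cap on memories returned per recall
export CONTEXT_MAX_MEMORIES="20"
```

With these exported, start the server (e.g. `npx -y @beledarian/mcp-local-memory`) from the same shell so it inherits the settings.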
Related MCP Servers
MiniMax-JS
Official MiniMax Model Context Protocol (MCP) JavaScript implementation that provides seamless integration with MiniMax's powerful AI capabilities including image generation, video generation, text-to-speech, and voice cloning APIs.
cursor10x
The Cursor10x MCP is a persistent multi-dimensional memory system for Cursor that enhances AI assistants with conversation context, project history, and code relationships across sessions.
ThinkMem
AI Memory Management MCP System for LLMs - helping LLMs make good use of thinking and of memory
capa
CAPA is a powerful package manager for AI agents that allows you to define skills and tools, manage credentials, and seamlessly integrate with MCP clients like Cursor and Claude.
cf-worker-template
A complete skeleton to get you started quickly with Cloudflare Workers: routes, utils, services, bindings (KV/D1/R2), Durable Objects, cron.
slimcontext
MCP Server for SlimContext - AI chat history compression tools