long-term-memory
A persistent memory module for large language models that lives outside the model itself
claude mcp add --transport stdio rotoslider-long-term-memory-mcp python "D:\a.i. apps\long_term_memory_mcp\LongTermMemoryMCP.py"
How to use
The long_term_memory MCP provides a persistent, human-like long-term memory system for LM Studio. It combines a structured SQLite database for metadata with a semantic search layer powered by ChromaDB, backed up with portable JSON exports. This enables cross‑chat, cross‑model, and cross‑machine continuity so your AI companion can recall prior conversations, preferences, and events over decades. Tools are exposed to the MCP runtime (and used internally by the AI) to store, retrieve, update, and manage memories without exposing the underlying database to the user. The core capabilities are designed to feel invisible to users while delivering natural recall and context across sessions.
Key tools include remember for storing new memories with metadata and importance, search_memories for semantic recall using natural language queries, and a suite of helpers like search_by_type, search_by_tags, get_recent_memories, update_memory, delete_memory, get_memory_stats, and create_backup. These tools empower the AI to store facts, conversations, preferences, and events, retrieve relevant memories on demand, monitor memory health, and trigger manual backups when needed. For best results, use search_memories with natural language questions (e.g., "What did I say about John's birthday last week?") and rely on remember to capture important context for later recall.
To use the tools inside LM Studio, the MCP automatically wires these operations into the model’s decision loop. Developers or power users can test operations via internal tool calls or, if exposed, via the LM Studio MCP Tool Loader by selecting the long_term_memory tool set and issuing the standard parameter payloads described in the README.
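As an illustration, a tool call to remember might carry a payload along these lines (the field names here are assumptions inferred from the tool descriptions above, not a verified schema; consult the repository README for the actual parameters):

```json
{
  "tool": "remember",
  "arguments": {
    "content": "John's birthday is March 12th",
    "memory_type": "fact",
    "tags": ["john", "birthday"],
    "importance": 8
  }
}
```

A search_memories call would be analogous, typically taking a natural-language query string and an optional result limit.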
How to install
Prerequisites:
- Python 3.8 or newer
- git
- internet access for initial package downloads
Step 1: Clone the repository
git clone https://github.com/Rotoslider/long-term-memory-mcp.git
cd long-term-memory-mcp
Step 2: Install requirements
pip install -r requirements.txt
Notes:
- The requirements include chromadb, sentence-transformers, and fastmcp.
- sqlite3 ships with Python's standard library, so do not install it separately via pip.
Step 3: Optional performance improvement for HuggingFace models
pip install "huggingface_hub[hf_xet]"
Step 4: Run the MCP (an example configuration is shown in the config snippet below; adjust paths as needed)
Make sure the Python interpreter and the path to LongTermMemoryMCP.py are correct for your system.
You can also adapt the launch command to your environment or use an environment manager such as venv or conda.
Step 5: Configure in LM Studio
- Edit your LM Studio mcp.json to point to the python interpreter and the LongTermMemoryMCP.py script as shown in the example in the README.
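Step 5 might look like the following mcp.json fragment (the server name, interpreter path, script path, and optional env block are illustrative; check the repository README for the exact schema LM Studio expects):

```json
{
  "mcpServers": {
    "long_term_memory": {
      "command": "python",
      "args": ["D:\\a.i. apps\\long_term_memory_mcp\\LongTermMemoryMCP.py"],
      "env": {
        "AI_COMPANION_DATA_DIR": "D:\\a.i. apps\\long_term_memory_mcp\\data"
      }
    }
  }
}
```

Note that backslashes in Windows paths must be escaped inside JSON strings.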
Additional notes
Environment variables and data location:
- You can customize the data directory via environment variables, e.g. set AI_COMPANION_DATA_DIR to point to your data folder. On Windows PowerShell: $env:AI_COMPANION_DATA_DIR="D:\a.i. apps\long_term_memory_mcp\data". On Linux/macOS: export AI_COMPANION_DATA_DIR="/home/username/ai_companion_data".
- Backups are created automatically: every 24 hours and after every 100 memories (configurable), stored under memory_backups/ with timestamped folders. Only the last 10 backups are kept.
- If you move to another machine, copy memory_db/ and memory_backups/ along with the system prompt to preserve continuity.
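The migration step above can be sketched as follows (the mktemp paths are throwaway stand-ins for your real install directory; on Windows you would use robocopy or Explorer instead):

```shell
# Simulate an existing install with the two memory stores (illustrative paths)
SRC=$(mktemp -d)
mkdir -p "$SRC/memory_db" "$SRC/memory_backups"

# Destination on the new machine
DEST="$(mktemp -d)/long-term-memory-mcp"
mkdir -p "$DEST"

# Copy both stores; the system prompt file travels with them the same way
cp -r "$SRC/memory_db" "$SRC/memory_backups" "$DEST/"

ls "$DEST"   # lists memory_backups and memory_db
```
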
- The MCP exposes tools that are intended to be called internally by the model. For troubleshooting, you can inspect memory counts and backups using get_memory_stats and create_backup.
- Windows paths in the example config should be adapted to your installed Python and project layout. Ensure the path to LongTermMemoryMCP.py is correct and accessible by LM Studio.
- If you encounter memory inconsistencies after model restarts, consider triggering create_backup to force a fresh, consistent snapshot before continuing work.
Related MCP Servers
persistent-ai-memory
A persistent local memory for AI, LLMs, or Copilot in VS Code.
heimdall
Your AI Coding Assistant's Long-Term Memory
knowledgegraph
MCP server for enabling persistent knowledge storage for Claude through a knowledge graph with multiple storage backends and fuzzy search
lmstudio-toolpack
An MCP stdio toolpack for local LLMs
sqlite-literature-management-fastmcp
A flexible system for managing various types of sources (papers, books, webpages, etc.) and integrating them with knowledge graphs.
vector
Vector MCP Server for AI Agents - Supports ChromaDB, Couchbase, MongoDB, Qdrant, and PGVector