
persistent-ai-memory

A persistent local memory for AI, LLMs, or Copilot in VS Code.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio savantskie-persistent-ai-memory python -m ai_memory_mcp_server

How to use

Persistent AI Memory provides a Python-based MCP server that exposes memory, search, and conversation tooling for AI assistants. It persists memories, conversations, and tool usage in local databases and exposes these capabilities through an MCP-compatible interface, so you can plug it into Claude, OpenAI-compatible agents, or custom runners. Run the server directly as an MCP server, then connect your assistant to the endpoint to store and retrieve memories, perform semantic searches, log tool calls, and inspect system health. The server supports multi-platform integration and is designed to work with the surrounding memory configuration and embedding providers for flexible deployments.

Once running, you can leverage the built-in operations such as storing memories, searching memories semantically, listing recent memories, logging tool calls, and reflecting on tool usage to improve future responses. It also offers conversation history tracking and health checks to ensure the databases and embeddings are functioning correctly. The server is designed to be easily integrated into existing AI assistants that follow the Model Context Protocol (MCP).

To use the MCP endpoints, start the server with: python -m ai_memory_mcp_server and connect your agent to the MCP interface. Typical workflows include: storing new memories after important chat events, performing semantic searches to retrieve relevant past interactions, and logging tool invocations for analysis and improvement of tool usage patterns.
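The workflows above all reduce to MCP tool calls, which are JSON-RPC 2.0 requests sent to the server over stdio. A minimal sketch of what a client emits for the store and search operations is below; the tool names `store_memory` and `search_memories` and their argument shapes are illustrative assumptions, not confirmed names — query the server's `tools/list` endpoint for the tools it actually exposes.

```python
import json

def tool_call(req_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, as MCP clients send over stdio."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool names and arguments -- check the server's `tools/list`
# response for the real ones before wiring up a client.
store = tool_call(1, "store_memory", {"content": "User prefers dark mode"})
search = tool_call(2, "search_memories", {"query": "user preferences", "limit": 5})
print(store)
print(search)
```

Most MCP client libraries and hosts (Claude Code included) construct these frames for you; the sketch only shows what travels over the wire.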

How to install

Prerequisites:

  • Python 3.8+ installed on the host
  • Git (for cloning or fetching the repo)
  • Optional: pip for installing from git URLs

Install from GitHub (recommended):

  1. Install directly from the repository: pip install git+https://github.com/savantskie/persistent-ai-memory.git

  2. (Alternative) Clone and install in editable mode:
     git clone https://github.com/savantskie/persistent-ai-memory.git
     cd persistent-ai-memory
     pip install -e .
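After installing, it is worth confirming the package resolves in the same interpreter you will use to launch the server. A small sketch using the standard library's `importlib.util.find_spec` (the module name comes from the run command above):

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` can be imported by this interpreter."""
    return importlib.util.find_spec(name) is not None

# Module name taken from the server's run command; adjust if yours differs.
if module_available("ai_memory_mcp_server"):
    print("ai_memory_mcp_server is importable in this environment")
else:
    print("not found: check `pip list` and which virtualenv is active")
```

If the check fails but `pip list` shows the package, you are almost certainly running a different Python environment than the one pip installed into.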

Configuration prerequisites:

  • Ensure the configuration files memory_config.json and embedding_config.json exist or are accessible as described in the project docs. Typical defaults are placed under ~/.ai_memory/ as shown in the documentation.
  • You may customize environment variables and paths to point to your preferred config locations.
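As a sketch of bootstrapping those files, the helper below creates both config files under a given directory if they are missing. The file names come from the prerequisites above, but the keys written here are placeholder assumptions, not the project's actual schema — consult the project's configuration docs for the real options.

```python
import json
import tempfile
from pathlib import Path

def write_default_configs(base_dir: Path) -> None:
    """Create placeholder config files under base_dir if they don't exist."""
    base_dir.mkdir(parents=True, exist_ok=True)
    # Illustrative placeholder keys only -- replace with the schema
    # described in the project's configuration documentation.
    defaults = {
        "memory_config.json": {"database_dir": str(base_dir)},
        "embedding_config.json": {"provider": "local"},
    }
    for name, payload in defaults.items():
        path = base_dir / name
        if not path.exists():
            path.write_text(json.dumps(payload, indent=2))

# Demo in a throwaway directory; in practice you would target
# Path.home() / ".ai_memory" (the documented default location).
demo_dir = Path(tempfile.mkdtemp()) / ".ai_memory"
write_default_configs(demo_dir)
print(sorted(p.name for p in demo_dir.iterdir()))
# → ['embedding_config.json', 'memory_config.json']
```

Because the helper skips files that already exist, it is safe to run on a host that has been configured previously.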

Run the MCP server:

  • Start the server: python -m ai_memory_mcp_server
  • If `python` resolves to a different interpreter on your system, use: python3 -m ai_memory_mcp_server

Optional usage notes:

  • If you want to run in a Docker environment, refer to the project Docker guidance and the configuration of the embedding/memory config files in the container.

Additional notes

Tips and common considerations:

  • The server relies on local SQLite databases for memory, conversations, tool calls, and related data. Ensure the host has sufficient disk space for growth.
  • Place embedding and memory configuration files at the locations described in the docs (default ~/.ai_memory/) or set environment variables to customize their paths.
  • For production deployments, consider mounting ~/.ai_memory/ to a persistent volume and enabling health checks to detect database issues early.
  • If you encounter import or module-not-found errors, verify that the package was installed correctly (pip list) and that you are running the server in the same Python environment where the package is installed.
  • The server supports multiple integration points; ensure your MCP client uses the correct command and endpoints per the MCP specification.
  • Environment variables can be used to override defaults, such as paths to config files or embedding providers. Refer to the project's CONFIGURATION.md for detailed options.
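Since the server's data lives in local SQLite files, a quick way to act on the health-check advice above is SQLite's built-in integrity check. A minimal sketch (demonstrated on a throwaway database; point `path` at the server's database files, by default under ~/.ai_memory/, to check them the same way):

```python
import os
import sqlite3
import tempfile

def db_healthy(path: str) -> bool:
    """Run SQLite's built-in PRAGMA integrity_check on a database file."""
    with sqlite3.connect(path) as conn:
        (result,) = conn.execute("PRAGMA integrity_check").fetchone()
    return result == "ok"

# Demo: create a throwaway database and verify it passes the check.
path = os.path.join(tempfile.mkdtemp(), "memory.db")
with sqlite3.connect(path) as conn:
    conn.execute("CREATE TABLE memories (id INTEGER PRIMARY KEY, content TEXT)")
print(db_healthy(path))  # → True
```

Running this periodically against the memory, conversation, and tool-call databases catches file corruption early, which matters most for the persistent-volume deployments mentioned above.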
