
mcp-memory

MCP server from PiGrieco/mcp-memory-server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio pigrieco-mcp-memory-server python main.py \
  --env LOG_LEVEL="INFO" \
  --env MODEL_PATH="path or identifier for the ML auto-trigger model" \
  --env MONGODB_URI="MongoDB connection string"

How to use

mcp-memory (SAM) is a Python-based MCP server that provides an intelligent memory-management layer for AI applications. A machine-learning auto-trigger model decides when to save, retrieve, or discard memories, adding context-aware memory to real-time conversations. The server integrates with common AI platforms via the MCP protocol and exposes an internal auto-trigger pipeline that combines a deterministic rule set with a hybrid decision engine to manage memory storage and retrieval. You can run the MCP server in a single Python process and connect clients (Cursor, Claude Desktop, Windsurf, or other MCP-compliant platforms) to enable automatic memory handling during interactions.
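The auto-trigger pipeline is internal to the server, but its hybrid shape (deterministic rules combined with a model score) can be sketched roughly as follows. Everything here is an illustrative assumption, not the project's actual API: the `should_save` name, the keyword rules, the toy scoring function, and the 0.7 threshold are all invented for explanation.

```python
# Illustrative sketch of a hybrid save/skip decision.
# Rule list, threshold, and function names are assumptions for
# explanation only -- they are not mcp-memory's real internals.

SAVE_KEYWORDS = ("remember", "my name is", "i prefer")  # deterministic rules

def rule_score(message: str) -> float:
    """Return 1.0 if a deterministic save rule fires, else 0.0."""
    text = message.lower()
    return 1.0 if any(k in text for k in SAVE_KEYWORDS) else 0.0

def model_score(message: str) -> float:
    """Stand-in for the ML auto-trigger model's save probability."""
    # A real model would run inference here; this toy scores by length.
    return min(len(message) / 200.0, 1.0)

def should_save(message: str, threshold: float = 0.7) -> bool:
    """Hybrid decision: a rule hit always saves; otherwise trust the model."""
    if rule_score(message) >= 1.0:
        return True
    return model_score(message) >= threshold

print(should_save("Remember that I deploy on Fridays"))  # rule fires -> saved
print(should_save("ok"))                                 # short, low score -> skipped
```

The point of the hybrid design is that cheap deterministic rules catch obvious cases immediately, while the model handles everything the rules miss.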

To use the server, start the Python MCP process (main.py) in your environment. The server expects a MongoDB backend (for memory storage and metadata) and a hosted ML model for auto-trigger decisions. Configure your platform to point to the MCP server, then interact as you normally would—the memory will be saved and searched automatically based on the model’s in-session context and the hybrid decision engine.
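Since configuration flows in through environment variables, the server's startup boils down to reading them before connecting. A minimal sketch, using the variable names documented on this page (the default values are assumptions, not the project's actual defaults):

```python
import logging
import os

# Read the configuration this page documents. The fallback values
# here are illustrative assumptions, not mcp-memory's real defaults.
MONGODB_URI = os.environ.get("MONGODB_URI", "mongodb://localhost:27017/mcp_memory")
MODEL_PATH = os.environ.get("MODEL_PATH", "")  # empty -> server-internal model
LOG_LEVEL = os.environ.get("LOG_LEVEL", "INFO")

logging.basicConfig(level=getattr(logging, LOG_LEVEL.upper(), logging.INFO))
logging.getLogger(__name__).info("Connecting to MongoDB at %s", MONGODB_URI)
```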

How to install

Prerequisites:

  • Python 3.8 or newer
  • Git
  • MongoDB (or MongoDB Atlas) available and reachable
  • Network access to download the ML auto-trigger model (e.g., from HuggingFace)
  1. Clone the repository
     • git clone https://github.com/PiGrieco/mcp-memory-server.git
  2. Create and activate a virtual environment
     • python3 -m venv venv
     • source venv/bin/activate # on Unix/macOS
     • .\venv\Scripts\activate # on Windows
  3. Install Python dependencies
     • pip install --upgrade pip
     • pip install -r requirements.txt
  4. Prepare the MongoDB connection
     • Ensure MongoDB is running and accessible, or set up MongoDB Atlas
     • Export the connection string as an environment variable (example):
       • export MONGODB_URI="mongodb+srv://<user>:<password>@cluster0.mongodb.net/<dbname>?retryWrites=true&w=majority"
  5. Download the ML auto-trigger model (if not included)
     • The first run downloads the ML model from HuggingFace (~63 MB); ensure network access is available.
  6. Run the MCP server
     • python main.py
  7. Optional: platform integration
     • Configure your MCP clients (Cursor, Claude Desktop, Windsurf, etc.) to point to the MCP server endpoint exposed by main.py.

Notes:

  • You can adjust configuration via environment variables (see additional notes). If you’re using Docker, Node, or other deploy methods, adapt commands accordingly.

Additional notes

Tips and considerations:

  • Environment variables:
    • MONGODB_URI: Connection string for MongoDB (required for memory storage)
    • MODEL_PATH: Path or identifier for the ML auto-trigger model (defaults to internal model if not set)
    • LOG_LEVEL: Logging verbosity (e.g., INFO, DEBUG)
  • If the model download fails, ensure outbound network access and that your HuggingFace credentials (if required) are configured.
  • In production, run the MCP server behind a reverse proxy and enable TLS where appropriate.
  • The server supports MCP protocol clients (Cursor, Claude Desktop, Windsurf, etc.). Ensure the client is configured to use the MCP endpoints exposed by main.py.
  • Monitor the memory store and model logs to diagnose memory retention behavior and prompt-based memory usage.
  • If you use Docker or alternative runtimes, ensure the required environment variables are passed to the container and that MongoDB accessibility is preserved.
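For the production note above about running behind a reverse proxy with TLS, a hedged nginx fragment follows. It assumes the server is exposed over HTTP/SSE on localhost port 8000; the hostname, port, and certificate paths are placeholders, and you should adjust them to however main.py is actually exposed in your deployment:

```nginx
# Illustrative TLS reverse-proxy fragment; all names and paths are
# placeholders, and the upstream port is an assumption.
server {
    listen 443 ssl;
    server_name memory.example.com;
    ssl_certificate     /etc/ssl/certs/memory.pem;
    ssl_certificate_key /etc/ssl/private/memory.key;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```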
