
memoire

Experimental Semantic Memory System for Large Language Models

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio thepartyacolyte-memoire python run.py \
  --env GOOGLE_API_KEY="your-google-ai-api-key"

How to use

Memoire is an experimental MCP server that provides a semantic memory system for LLM interactions. It stores, contextualizes, and retrieves memories across conversations, enabling persistent, project-scoped memory with task management capabilities. Through its CLI tools and MCP interface, you can ingest new information, recall relevant memories through natural language queries, create and manage projects, and handle tasks tied to those projects. The system uses advanced embedding and recall strategies (including Gemini-based models) to organize information into thematic contexts and to synthesize context-aware responses when recalled by an LLM or client application.

To use Memoire, start the MCP server (via Python) and connect a client (for example Claude Desktop or a custom MCP client) configured to talk to the memoire MCP endpoint. Within the server, you can ingest content with remember(project_id, content), create or list contexts and fragments, and perform recall queries with recall(project_id, query, ...). You can also manage tasks per project (create_task, list_tasks, update_task, delete_task) and organize memories by project for complete segregation. The CLI tooling exposed by Memoire provides commands like create_project, list_projects, remember, recall, delete_fragment, list_contexts, and more, enabling hands-on memory curation and management during development and experimentation.
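The tool calls above travel over MCP's JSON-RPC 2.0 protocol as `tools/call` requests. As a hedged sketch (the tool names and argument keys follow this README; the request ids and example values are arbitrary), here is roughly what a client sends for `remember` and `recall`:

```python
import json

def tool_call(req_id, name, arguments):
    """Build an MCP tools/call request envelope (JSON-RPC 2.0)."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Ingest a memory into a project, then query it back.
remember_req = tool_call(1, "remember", {
    "project_id": "my-project",
    "content": "Decided to use Qdrant for vector storage.",
})
recall_req = tool_call(2, "recall", {
    "project_id": "my-project",
    "query": "What vector store did we choose?",
})

print(json.dumps(remember_req, indent=2))
```

In practice an MCP client library (or Claude Desktop itself) builds these envelopes for you; the sketch only shows the shape of the payload reaching the server.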

Operationally, Memoire integrates with a storage layer (Qdrant + SQLite) and uses specialized embedding strategies for ingestion and retrieval. It supports structured output modes and configurable models, allowing you to tailor ingestion and recall behavior. The MCP interface also enables seamless integration with GUI clients or desktop applications that support MCP, giving you a unified memory experience across tools.
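The configurable models and storage layer are driven by a configuration file. As a hedged sketch only: the `storage.data_dir` key and the example shared path come from this README, but the exact schema is not documented here, so the `models` block and the model names are illustrative assumptions:

```json
{
  "storage": {
    "data_dir": "/mnt/c/Users/YourUser/Documents/memoire_data"
  },
  "models": {
    "embedding": "gemini-embedding-001",
    "synthesis": "gemini-2.0-flash"
  }
}
```

Check the project's own config.json (or its defaults) for the authoritative key names before editing.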

How to install

Prerequisites

  • Python 3.11+ (recommended)
  • A Google AI API key for the AI-powered ingestion and recall features (required by the default workflow)

Installation steps

  1. Clone the repository

    git clone https://github.com/ThePartyAcolyte/memoire
    cd memoire

  2. Create and activate a Python virtual environment

    python -m venv venv

    On Windows

    venv\Scripts\activate

    On macOS/Linux

    source venv/bin/activate

  3. Install dependencies

    pip install -r requirements.txt

  4. Prepare configuration

    • Obtain a Google AI API key; you can provide it later via an environment variable or the config file.
    • Ensure you have a writable data directory for storage (Qdrant/SQLite components).
  5. Run the MCP server (example)

    • Start the server so it is accessible to MCP clients.
    • Example command (adjust paths to your environment):

      python run.py
  6. Optional: configure an MCP client

    Example Claude Desktop configuration (claude_desktop_config.json):

    {
      "mcpServers": {
        "memoire": {
          "command": "/full/path/to/memoire/venv/bin/python",
          "args": ["/full/path/to/memoire/run.py"],
          "env": {
            "GOOGLE_API_KEY": "your-google-ai-api-key"
          }
        }
      }
    }

Notes

  • On Windows, adjust Python executable path to the virtual environment (e.g., venv\Scripts\python.exe).
  • If you plan to use WSL interoperability, consider configuring storage.data_dir to a shared path as described in the README.

Additional notes

Tips and considerations:

  • Environment variables: Set GOOGLE_API_KEY for Gemini/AI-powered ingestion and recall workflows.
  • Storage and performance: Memoire uses a combination of Qdrant and SQLite; ensure your storage directory is accessible and has adequate permissions.
  • Model configuration: The system supports replacing models for ingestion/synthesis; you can adjust config.json to swap Gemini models or embeddings as needed.
  • Data directory and cross-environment usage: If you operate across Windows/WSL, you may want to point storage.data_dir to a shared location (e.g., /mnt/c/Users/YourUser/Documents/memoire_data) to maintain data consistency.
  • CLI coverage: Familiarize yourself with commands like create_project, remember, recall, create_task, list_tasks, and the various context/fragment management operations to fully leverage semantic memory.
  • Security: Treat sensitive memory content with appropriate access controls; Memoire may store project data and tasks.
  • Debugging: Enable verbose logging if available in your local run to diagnose ingestion/recall pipelines and model selection issues.
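On the debugging tip: the README does not document a verbosity flag, so as an assumption, if Memoire logs through Python's standard logging module you can raise verbosity in your own launcher or test harness like this:

```python
import logging

# Assumption: Memoire uses Python's standard logging module. force=True
# reconfigures the root logger even if logging was already set up.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
    force=True,
)

# DEBUG-level records from a "memoire" logger would now be emitted.
logging.getLogger("memoire").debug("verbose logging enabled")
```

Scope the level to a specific logger name (rather than the root logger) if the output from third-party libraries becomes too noisy.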
