MemCP
Portable Agent Memory. Connect your AI conversations. Remember everything. Everywhere.
```shell
claude mcp add --transport stdio ardaaltinors-memcp -- docker run -i \
  --env MEMCP_API_KEY="your-api-key" \
  --env MEMCP_BROKER_URL="amqp://guest:guest@localhost:5672/" \
  --env MEMCP_QDRANT_HOST="localhost" \
  --env MEMCP_QDRANT_PORT="6333" \
  --env MEMCP_DATABASE_URL="postgresql://USER:PASS@HOST:PORT/DB" \
  --env MEMCP_SERVICE_PORT="4200" \
  ardaaltinors/memcp
```
How to use
MemCP is a portable memory management server that implements the Model Context Protocol (MCP) to store and retrieve long-term memories for AI assistants. It provides core tools to add, fetch, and relate memories while offering a visual dashboard for management and a semantic search capability across memories. To use MemCP, configure your MCP-enabled AI agents to point to the MemCP endpoint (cloud or local) and authenticate with the API key from your dashboard. The server exposes endpoints for remembering facts, recording and retrieving user context, and performing semantic memory lookups. With the memory graph and profile synthesis, you can build persistent user profiles that enrich AI responses across conversations and platforms.
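If your MCP client reads a JSON configuration file (as Claude Desktop does), the stdio transport can be expressed as a config entry instead of a CLI command. This is a sketch only: the server name `memcp` is arbitrary, and the image invocation mirrors the Docker command above; adapt the env values to your deployment.

```json
{
  "mcpServers": {
    "memcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm",
               "-e", "MEMCP_API_KEY",
               "ardaaltinors/memcp"],
      "env": {
        "MEMCP_API_KEY": "your-api-key"
      }
    }
  }
}
```

Passing `-e MEMCP_API_KEY` without a value forwards the variable from the `env` block into the container, which keeps the secret out of the `args` array.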
How to install
Prerequisites:
- Docker installed and running
- Access to the MemCP Docker image (ardaaltinors/memcp) or build your own image
- Optional: a PostgreSQL instance and a Qdrant vector store if you’re not using the included defaults
Install steps:
- Pull and run MemCP via Docker:

  ```shell
  docker pull ardaaltinors/memcp
  docker run -d --name memcp \
    -e MEMCP_API_KEY=your-api-key \
    -e MEMCP_DATABASE_URL=postgresql://USER:PASS@HOST:PORT/DB \
    -e MEMCP_QDRANT_HOST=localhost \
    -e MEMCP_QDRANT_PORT=6333 \
    -e MEMCP_BROKER_URL=amqp://guest:guest@localhost:5672/ \
    -p 4200:4200 \
    ardaaltinors/memcp
  ```

- If you prefer running locally without Docker, follow the project's RUNNING_LOCALLY.md for the Python/FastAPI setup steps (install dependencies, set up PostgreSQL and Qdrant, run the FastAPI server with uvicorn, and configure Celery/RabbitMQ).
- Obtain your API key from the MemCP dashboard and update the MEMCP_API_KEY environment variable accordingly.
- Point your MCP-enabled AI agents to the MemCP endpoint (cloud or local http://localhost:4200) and begin storing and retrieving memories.
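Before wiring up agents, it helps to confirm the server is actually reachable. A minimal reachability check in Python, assuming only that MemCP answers HTTP on port 4200 (the project's exact health-check path, if it has one, is not documented here):

```python
import urllib.request
import urllib.error

def check_memcp(base_url="http://localhost:4200", timeout=3):
    """Return True if the MemCP endpoint answers HTTP, False if unreachable."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, even if with an error status (e.g. 401/404):
        # it is reachable, which is all this check cares about.
        return True
    except (urllib.error.URLError, OSError):
        return False
```

`check_memcp()` distinguishes "server answered with an error status" (reachable) from "connection refused or timed out" (not reachable), so it can gate the agent-configuration step in a setup script.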
Additional notes
Tips and considerations:
- Ensure PostgreSQL and Qdrant services are reachable from the MemCP container if you’re not using the bundled defaults.
- Use the API key from the dashboard for authentication when connecting AI assistants.
- The three core MCP tools are remember_fact (store facts), record_and_get_context (process and fetch user context), and get_related_memory (semantic memory search).
- If you encounter connectivity issues, verify Docker networking and that the MEMCP_BROKER_URL (RabbitMQ) is accessible.
- For production deployments, consider securing environment variables, enabling TLS for API endpoints, and configuring proper scale for Celery workers.
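Under MCP, each of the core tools above is invoked through the protocol's standard `tools/call` JSON-RPC method. A sketch of the request shape, assuming a single textual argument named `fact` for illustration (the real parameter names come from the tool schema the server advertises):

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 request using MCP's tools/call method."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# "fact" is a hypothetical argument name; consult the schema MemCP exposes.
request = make_tool_call("remember_fact", {"fact": "User prefers concise answers"})
```

In practice your MCP client library builds these messages for you; the sketch just shows what travels over the stdio or HTTP transport when an agent stores or looks up a memory.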