
mem0-mcp-selfhosted

Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio elvismdev-mem0-mcp-selfhosted uvx --from git+https://github.com/elvismdev/mem0-mcp-selfhosted.git mem0-mcp-selfhosted \
  --env MEM0_USER_ID="your-user-id" \
  --env MEM0_PROVIDER="ollama" \
  --env MEM0_LLM_MODEL="qwen3:14b" \
  --env MEM0_ANTHROPIC_TOKEN="optional-override-token-if-needed"

How to use

mem0-mcp-selfhosted is a self-hosted MCP server for Claude Code that provides a complete memory management suite powered by mem0. It works with either a cloud LLM (Anthropic/Claude, the default) or a fully local Ollama setup, and it integrates a Qdrant vector store, Ollama embeddings, and an optional Neo4j graph for knowledge relationships. The server exposes 11 MCP tools for memory persistence, search, and organization across sessions, enabling memory-driven interactions in development environments. To run it, install uvx, load the server from GitHub, and connect via Claude Code’s MCP integration. For a fully local setup, set MEM0_PROVIDER=ollama and adjust the embedding/LLM settings as needed. The default configuration targets local services: Qdrant at localhost:6333 and Ollama embeddings at localhost:11434 with the bge-m3 model.
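The defaulting behavior described above can be sketched as a small pre-flight shell snippet. MEM0_PROVIDER, MEM0_EMBED_URL, and MEM0_LLM_MODEL appear on this page; the repository documentation is authoritative for the full MEM0_* variable list:

```shell
# Apply the documented defaults only when the MEM0_* variables are unset,
# so an explicit export in your shell or MCP config always wins.
: "${MEM0_PROVIDER:=ollama}"
: "${MEM0_EMBED_URL:=http://localhost:11434}"
: "${MEM0_LLM_MODEL:=qwen3:14b}"
echo "provider=$MEM0_PROVIDER embed=$MEM0_EMBED_URL model=$MEM0_LLM_MODEL"
```

Because `: "${VAR:=default}"` only assigns when the variable is empty or unset, the same snippet works for both the fully local default and an overridden cloud setup.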

How to install

Prerequisites:

  • Python 3.10 or newer
  • uvx (the tool runner that ships with uv, used here to launch the server) installed on your system
  • Access to the required services: Qdrant, Ollama, and optionally Neo4j and Google API (for graph providers)

Installation steps:

  1. Install Python 3.10+ from the official Python website or your system package manager.
  2. Install uv, which provides the uvx command. Example (adjust if your environment uses a different install method):
    • python3 -m pip install --user uv
    • or install via the standalone installer per the uv documentation
  3. Install the mem0 MCP self-hosted server by pulling it via uvx from GitHub, using the claude mcp add command shown in the Installation section above.
  4. Start and configure the server with environment variables as needed; the local/local-ollama default sets MEM0_PROVIDER=ollama.
  5. Verify connectivity from Claude Code or your MCP client and ensure Qdrant, Ollama, and optional Neo4j services are reachable at their configured endpoints.
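Step 5 can be sketched as a quick reachability check against the default ports listed on this page (adjust hosts and ports to your deployment; Qdrant's /collections and Ollama's /api/tags are each service's standard HTTP endpoints):

```shell
# Report whether each backing service answers HTTP at its expected endpoint.
check() {
  if curl -sf "$1" >/dev/null 2>&1; then
    echo "$2: reachable"
  else
    echo "$2: not reachable"
  fi
}
check http://localhost:6333/collections Qdrant
check http://localhost:11434/api/tags   Ollama
check http://localhost:7474             Neo4j   # only relevant if the graph store is enabled
```

A "not reachable" line points at the service to start (or the endpoint variable to override) before retrying the MCP connection.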

Notes:

  • You can customize endpoints and models through MEM0_* environment variables as described in the repository documentation.
  • If you are using Claude Code with OAuth tokens, the server can use Claude Code session tokens automatically when MEM0_PROVIDER is not set to ollama.

Additional notes

Tips and common questions:

  • By default the server assumes local services: Qdrant at localhost:6333 and Ollama embeddings at localhost:11434 with the bge-m3 model. Override via MEM0_EMBED_URL or MEM0_LLM_URL if needed.
  • For fully local operation, set MEM0_PROVIDER=ollama; if you want Claude Code tokens, ensure MEM0_ANTHROPIC_TOKEN is set or rely on Claude Code’s OAuth token via ~/.claude/.credentials.json.
  • If you encounter connectivity issues, verify that Qdrant, Ollama, and Neo4j (if used) are running and reachable at the expected ports.
  • The server exposes 11 MCP tools for memory management; consult the project's CLAUDE.md integration notes to wire these tools into Claude Code workflows effectively.
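To confirm the wiring from the Claude Code side, the built-in MCP management commands can be used (a sketch; the server name must match the one passed to claude mcp add during installation):

```shell
# List registered MCP servers and inspect this one's configuration.
claude mcp list
claude mcp get elvismdev-mem0-mcp-selfhosted

# Remove it if you need to re-add with different environment variables.
claude mcp remove elvismdev-mem0-mcp-selfhosted
```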
