better-mem0
Self-hosted MCP Server for AI memory with PostgreSQL (pgvector).
```shell
claude mcp add --transport stdio n24q02m-better-mem0-mcp uvx better-mem0-mcp@latest \
  --env API_KEYS="GOOGLE_API_KEY:your-api-key-here" \
  --env DATABASE_URL="postgresql://user:pass@your-neon-or-supabase-host/neondb?sslmode=require"
```
How to use
better-mem0 is a self-hosted MCP server that provides a persistent AI memory system backed by PostgreSQL with the pgvector extension. It supports a multi-provider workflow: you can configure a chain of LLM providers for generation and retrieval, and the server exposes tools to add, search, list, and delete memory entries, so you can build interactive experiences that remember user preferences and context between sessions. To get started, install the uvx runtime, supply a PostgreSQL connection string, and provide API keys for the LLM providers you want to use. You can then start the server and interact with memory via the built-in tools and memory API.
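As an MCP server, better-mem0 receives tool calls as JSON-RPC 2.0 messages over stdio. The sketch below builds such messages by hand to show the wire shape; the tool names (`add_memory`, `search_memory`) and argument fields are illustrative assumptions, not confirmed names from this project — discover the real ones with a `tools/list` request or your client's tool listing.

```python
import json
from itertools import count

# Monotonically increasing JSON-RPC request ids.
_ids = count(1)

def tool_call(name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# Hypothetical memory operations; replace names/arguments with the
# server's actual tool schema.
add_msg = tool_call("add_memory", {"content": "User prefers dark mode"})
search_msg = tool_call("search_memory", {"query": "UI preferences", "limit": 5})
print(add_msg)
```

In practice an MCP client library (or Claude itself) constructs and transports these messages for you; the point here is only what a memory tool call looks like on the wire.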
How to install
Prerequisites:
- A PostgreSQL database with the pgvector extension (Neon or Supabase free tiers supported)
- API keys for your preferred LLM providers (e.g., Google, OpenAI, Anthropic, Groq)
- Python 3.x and the uvx runtime (via the instructions below)

Installation steps (uvx route):
1) Install uvx if you haven't already: follow the uvx installation guide for your platform.
2) Prepare a PostgreSQL database URL and API keys:
   - DATABASE_URL=postgresql://user:pass@host:port/dbname?sslmode=require
   - API_KEYS=GOOGLE_API_KEY:your-key,OPENAI_API_KEY:your-key
3) Start the server using uvx:
   uvx better-mem0-mcp@latest
4) Verify the server is running and accessible at the configured host and port.

Alternative (Docker) route if you prefer Docker: provide the Docker-based run command and environment variables in your mcp.json, as shown in the legacy example.

Notes:
- The project is discontinued as of February 2026; the replacement is being developed at mnemo-mcp. Use this project for reference or for migration to the newer solution.
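If your client reads an mcp.json file instead of using the `claude mcp add` command, the same uvx-based configuration can be written there directly. This is a sketch assembled from the command shown above; the server name is arbitrary, and the placeholder credentials must be replaced with your own:

```json
{
  "mcpServers": {
    "better-mem0": {
      "command": "uvx",
      "args": ["better-mem0-mcp@latest"],
      "env": {
        "API_KEYS": "GOOGLE_API_KEY:your-api-key-here",
        "DATABASE_URL": "postgresql://user:pass@your-neon-or-supabase-host/neondb?sslmode=require"
      }
    }
  }
}
```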
Additional notes
Tips and caveats:
- Environment variables: DATABASE_URL and API_KEYS are required. API_KEYS takes comma-separated ENV_VAR:key pairs.
- The memory graph is stored in PostgreSQL with pgvector support for embedding vectors, enabling semantic search over memories.
- If you switch providers or update keys, ensure the LLM_MODELS and EMBEDDER_MODELS values (if used) are compatible with your provider keys.
- This repository is archived/discontinued; consider migrating to mnemo-mcp for ongoing support.
- If you experience connectivity issues, verify that the database endpoint is reachable from your host and that SSL requirements are satisfied.
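The API_KEYS convention described above (comma-separated ENV_VAR:key pairs) can be sanity-checked with a small helper before launching the server. This is an illustrative sketch of the documented format, not code from the project:

```python
def parse_api_keys(raw: str) -> dict[str, str]:
    """Parse the documented API_KEYS format: comma-separated ENV_VAR:key pairs.

    Example input: "GOOGLE_API_KEY:abc123,OPENAI_API_KEY:def456"
    """
    keys: dict[str, str] = {}
    for pair in raw.split(","):
        pair = pair.strip()
        if not pair:
            continue
        # Split on the first ':' only, since key values may themselves contain ':'.
        name, _, value = pair.partition(":")
        if not name or not value:
            raise ValueError(f"Malformed API_KEYS entry: {pair!r}")
        keys[name] = value
    return keys

print(parse_api_keys("GOOGLE_API_KEY:abc123, OPENAI_API_KEY:def456"))
# → {'GOOGLE_API_KEY': 'abc123', 'OPENAI_API_KEY': 'def456'}
```

Running a check like this locally makes a malformed pair fail fast with a clear error instead of surfacing as an opaque provider authentication failure at runtime.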
Related MCP Servers
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
mcp_server_filesystem
MCP File System Server: A secure Model Context Protocol server that provides file operations for AI assistants. Enables Claude and other assistants to safely read, write, and list files in a designated project directory with robust path validation and security controls.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
mcp-searxng-enhanced
Enhanced MCP server for SearXNG: category-aware web-search, web-scraping, and date/time retrieval.
Python-Runtime-Interpreter
PRIMS is a lightweight, open-source Model Context Protocol (MCP) server that lets LLM agents safely execute arbitrary Python code in a secure, throw-away sandbox.
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients