hippocampus
Open-source MCP memory server. Universal AI memory across all platforms.
claude mcp add --transport stdio karrolcia-hippocampus npx -y hippocampus
How to use
Hippocampus is a self-hosted MCP memory server that acts as a centralized memory store for multiple AI platforms. Connect Claude, ChatGPT, Gemini, Cursor, Perplexity, and any other MCP-capable client to a single memory graph that stores entities, observations, and relationships, with semantic search powered by local embeddings. The server exposes MCP tools such as remember, recall, and forget over Streamable HTTP, enabling memory sharing across tools and sessions. Once connected, you can store a memory from one AI and recall it from another, giving your projects and decisions cross-platform continuity.
To get started, install and run Hippocampus, then connect your first AI client to the MCP endpoint. The default MCP endpoint is http://your-host:3000/mcp, and you can check server health with a simple HTTP GET to /health. After connecting, verify persistence by asking one AI to remember a detail, then asking another to recall it across platforms. The database is encrypted at rest with SQLCipher (AES-256), including embeddings, so sensitive data remains protected. The README's examples walk through connecting several clients, configuring their MCP endpoints, and performing a basic memory round-trip across tools.
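As an illustration, a memory round-trip can be exercised directly against the Streamable HTTP endpoint with curl. This is a sketch only: the JSON-RPC envelope follows the MCP specification, but the exact argument shapes of the remember and recall tools are assumptions here, so check the schemas your server advertises via tools/list.

```shell
# Sketch only: tool argument names ("content", "query") are assumptions,
# not confirmed from the Hippocampus docs.
MCP=http://localhost:3000/mcp

# Store a memory via the "remember" tool (MCP tools/call over Streamable HTTP).
curl -s -X POST "$MCP" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call",
       "params":{"name":"remember","arguments":{"content":"Project Atlas uses PostgreSQL 16"}}}'

# Recall it from any other connected tool or session.
curl -s -X POST "$MCP" \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call",
       "params":{"name":"recall","arguments":{"query":"What database does Project Atlas use?"}}}'
```

In a real exchange the client first sends an initialize request and maintains a session per the MCP spec; AI clients and MCP client libraries handle that handshake for you.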
How to install
Prerequisites:
- Node.js 18+ installed on your host
- Git
- A server or local development environment with network access
Step 1: Install
- Clone the repository and install dependencies
git clone https://github.com/karrolcia/hippocampus.git
cd hippocampus
npm install
Step 2: Configure environment
- Create the environment file and set a passphrase to encrypt the database
cp .env.example .env
Edit .env and set HIPPO_PASSPHRASE to a strong secret. Other values have sensible defaults.
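A minimal .env might look like the following. Only HIPPO_PASSPHRASE is named in this guide; the other keys shown here are illustrative assumptions, so check .env.example for the actual variable names and defaults.

```shell
# .env — HIPPO_PASSPHRASE is required; the other keys are illustrative guesses,
# see .env.example for the real names and defaults.
HIPPO_PASSPHRASE=use-a-long-random-secret-here   # encrypts the SQLCipher database
PORT=3000                                        # assumed: HTTP listen port
HOST=0.0.0.0                                     # assumed: bind address
```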
Step 3: Start the server (development)
npm run dev
You should see output indicating Hippocampus is starting and the MCP endpoint is available, for example:
Hippocampus starting on http://0.0.0.0:3000
MCP endpoint: http://0.0.0.0:3000/mcp
Step 4: Verify health
curl http://localhost:3000/health
Expected: {"status":"ok","version":"0.3.1"}
Step 5: Connect clients
- Use your preferred MCP client to add Hippocampus as a memory backend, pointing to http://localhost:3000/mcp (adjust host/port as needed).
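For clients that take a JSON MCP configuration, the entry might look like the sketch below. The exact schema differs per client: some accept a url key for remote servers directly, while others wrap HTTP servers in a stdio bridge such as mcp-remote, so consult your client's MCP documentation.

```json
{
  "mcpServers": {
    "hippocampus": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```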
Notes:
- The first run will download the embedding model (~80MB); this may take about a minute.
- If you plan to expose the server publicly, consider TLS termination (e.g., via Caddy) and secure authentication.
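If you do expose the server publicly, a Caddyfile like the one below terminates TLS and proxies traffic to Hippocampus; the domain is a placeholder, and Caddy provisions certificates automatically.

```text
# Caddyfile — memory.example.com is a placeholder for your domain
memory.example.com {
    reverse_proxy localhost:3000
}
```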
Additional notes
Tips and common issues:
- Ensure HIPPO_PASSPHRASE is kept secret; losing it means you cannot decrypt the memory.
- If you change domains or ports, update the MCP URLs in each client configuration accordingly.
- The embedding model is downloaded automatically on first run; ensure the host has network access for this step.
- For production, use a reverse proxy with TLS and consider enabling authentication on the MCP endpoint if your clients support it.
- If you run into connection issues, verify firewall rules allow traffic on the MCP port and that DNS resolves correctly for public deployments.
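These checks can be scripted. The hostname below is a placeholder for your deployment:

```shell
# Basic connectivity triage for a public deployment (replace the hostname).
HOST=memory.example.com

dig +short "$HOST"                 # does DNS resolve?
nc -vz "$HOST" 443                 # is the port reachable through the firewall?
curl -fsS "https://$HOST/health"   # does Hippocampus answer end to end?
```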