
agent-mem

High-performance, pure Go memory middleware for AI Agents (Claude/Cursor). Features active file watching, auto-distillation via LLM, and 'Single Source of Truth' versioning. Native MCP support.

Installation
Run this command in your terminal to add the MCP server to Claude Code (the `--` separates the server's command and flags from the `claude` CLI's own options):

claude mcp add --transport stdio junknet-agent-mem -- docker run -i junknet-agent-mem

How to use

Agent Memory (Project Cortex) exposes a Go-based MCP server that acts as lightweight, high-performance knowledge-base middleware. It wires together ingestion, semantic search, and arbitration logic behind a standard MCP interface, so clients such as Claude Desktop, Cursor, and Gemini CLI can interact via MCP calls as well as HTTP endpoints. The key MCP tools are:

  • mem.ingest_memory: add or update memory
  • mem.search: semantic retrieval
  • mem.get: fetch full memory content
  • mem.timeline: query memories over time
  • mem.list_projects: obtain project summaries

The server can operate over HTTP or via stdio (MCP client mode), and can emit MCP streams over SSE for real-time updates. To start using it, run the binary with the desired transport (http or stdio) and point clients at the server endpoints or SSE stream as described in the README.
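Over stdio, an MCP client invokes these tools with standard JSON-RPC tools/call requests. A minimal request for mem.search might look like the following; the argument names (query, limit) are illustrative assumptions, not the server's confirmed schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "mem.search",
    "arguments": { "query": "deployment notes", "limit": 5 }
  }
}
```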

Once running, you can use the HTTP API to interact with memories and projects, for example POST /ingest/memory to write memory, GET /memories/search for semantic search, GET /memories to retrieve full content, GET /memories/timeline for time-based queries, and GET /projects for project summaries. If you enable a token, requests must include Authorization: Bearer <token> or X-Agent-Mem-Token: <token>, or pass the token as a query parameter. The MCP interface also supports streamable HTTP via /mcp and SSE via /sse for continuous updates.

For clients, integration examples are provided for Claude Desktop, Codex CLI, Gemini CLI, and Cursor, demonstrating how to configure MCP servers and, when needed, how to supply tokens for secure access. The system also includes a “conflict detection” workflow that uses vector similarity thresholds and LLM arbitration to decide between REPLACE, KEEP_BOTH, or SKIP when ingesting new content.

How to install

Prerequisites:

  • Docker and Docker Compose installed
  • Go 1.25+ if you want to build the Go binary locally
  • PostgreSQL with pgvector (for local testing, via docker-compose)

Recommended setup (local development):

  1. Start the database (with pgvector) using Docker Compose:
docker-compose up -d
  2. Build the Agent Memory MCP binary (Go):
# From repository root
cd mcp-go && go build -o ../agent-mem ./cmd/agent-mem-mcp && cd ..
  3. Prepare environment configuration (example):
cp .env.example .env
# edit .env to set keys like DASHSCOPE_API_KEY, DATABASE_URL, AGENT_MEM_HTTP_TOKEN, AGENT_MEM_OWNER_ID
  4. Run the server in HTTP or stdio mode:
# HTTP mode (default)
./agent-mem --transport http --host 127.0.0.1 --port 8787

# STDIO mode (MCP client)
./agent-mem --transport stdio
  5. Optionally install the binary on your PATH (to run it from anywhere):
./bin/agent-mem-install
  6. Verify installation by invoking MCP tools, for example:
# In HTTP mode, test ingestion
curl -X POST http://127.0.0.1:8787/ingest/memory -H 'Content-Type: application/json' -d '{"id":"1","content":"example memory"}'

Prerequisites recap:

  • Docker for the database and optional containerized deployment
  • Go for building the MCP binary (optional if you use a prebuilt binary or container image)
  • A configured PostgreSQL database with pgvector support
  • Configuration of environment variables as described in the README (API keys, tokens, DB URL, etc.)

Additional notes

Tips and common considerations:

  • If you upgrade, the migration logic runs on startup and may add new fields (e.g., avg_embedding, summary, tags). Old data without those fields may be excluded from arbitration until you backfill.
  • Use --reset-db --reset-only to rebuild the database when upgrading or when starting fresh with small datasets.
  • If you enable HTTP tokens, requests can be secured using the Authorization header or token query parameters; ensure your clients are updated accordingly.
  • The MCP tools (mem.ingest_memory, mem.search, mem.get, mem.timeline, mem.list_projects) are the primary programmatic interfaces; leverage them for integration tests and automation.
  • The system emphasizes a “single truth” approach with conflict arbitration based on vector similarity and LLM judgment; tune similarity thresholds and model configurations in config/settings.yaml if needed.
  • When running via Docker, ensure the container has network access to PostgreSQL (host or docker-compose network) and proper environment variables for DB connection and tokens.
  • For client integration, you can configure Claude Desktop, Codex CLI, Gemini CLI, and Cursor with the agent-mem MCP server URL (and tokens if enabled).
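As one concrete example of the client integration mentioned above, a Claude Desktop entry in the standard mcpServers format might look like this for a locally built binary running in stdio mode. The binary path and connection string are placeholders to adapt to your setup:

```json
{
  "mcpServers": {
    "agent-mem": {
      "command": "/path/to/agent-mem",
      "args": ["--transport", "stdio"],
      "env": {
        "DATABASE_URL": "postgres://user:pass@localhost:5432/agentmem"
      }
    }
  }
}
```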
