
memex

Retains Claude's sessions

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio patravishek-memex npx -y @patravishek/memex \
  --env GROQ_API_KEY="your-groq-api-key" \
  --env GEMINI_API_KEY="your-gemini-api-key" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env LITELLM_API_KEY="your-litellm-api-key" \
  --env OLLAMA_BASE_URL="http://localhost:11434" \
  --env LITELLM_BASE_URL="your-litellm-base-url" \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key"

How to use

Memex is a memory layer for AI coding tools. It keeps a project-scoped, persistent memory across sessions and tools, so agents like Claude, Cursor, Copilot, or any local Ollama model can remember decisions, context, and outcomes from prior sessions.

You can start a new memory session for a project with memex start claude and later resume it with memex resume claude, enabling a seamless handoff between days or between different agents. If you prefer other agents or IDE integrations, Memex can connect via MCP to tools like Cursor or Copilot, so the memory stays with your project rather than with a single tool.

The memory is stored locally in .memex/memex.db, and sessions are recorded under .memex/sessions/, enabling quick resumption without repeating past context.
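The session workflow above can be sketched as a short shell session. The memex start claude and memex resume claude commands come from the description; the comments describe the expected effect rather than literal output:

```shell
# Start a new project-scoped memory session for the Claude agent.
# This initializes .memex/memex.db in the project and records the
# session transcript under .memex/sessions/.
memex start claude

# ...work with Claude as usual; at the end of the session Memex
# compresses the transcript into structured memory...

# Later (a new day, or after switching machines with the project),
# resume with the accumulated context instead of re-explaining it:
memex resume claude
```

Because the memory lives in the project directory rather than in any one tool, the same resume step works when a different agent picks up the project.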

How to install

Prerequisites:

  • macOS or Linux (or Windows with a compatible shell)
  • Node.js 18+ and npm
  • An API key from Anthropic, OpenAI, or a LiteLLM enterprise proxy
  • Claude CLI installed for Claude-based workflows (optional if using other agents)

Install Memex globally via npm:

# npm
npm install -g @patravishek/memex

# or pnpm/yarn (global install)
pnpm add -g @patravishek/memex

Verify the installation and start a session:

memex --version
memex start claude

If you plan to connect via MCP to other tools (Cursor, Copilot, etc.), you can configure MCP as described in the mcp_config section and start the Memex server accordingly.
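As a sketch, an MCP server entry for a tool like Cursor might look like the following. The file location (.cursor/mcp.json) and the mcpServers key follow Cursor's MCP convention; the server name is arbitrary, and you should only include the env keys for the provider you actually use:

```json
{
  "mcpServers": {
    "memex": {
      "command": "npx",
      "args": ["-y", "@patravishek/memex"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}
```

Consult your tool's MCP documentation for the exact config path, since each client stores its MCP server list differently.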

Additional notes

Tips and notes:

  • Memex operates by compressing transcripts into a structured memory at the end of each session. Ensure your shell environment variables for your chosen provider are correctly set (for example, OPENAI_API_KEY or ANTHROPIC_API_KEY).
  • Memory is project-scoped. Each project has its own database at .memex/memex.db; do not share databases unless you intend to.
  • When switching tools within a project, Memex will preserve the context so the new tool can pick up where the previous one left off.
  • If you encounter MCP connectivity issues, ensure your environment variables (for providers or LiteLLM proxies) are exported and accessible to the process running Memex.
  • For local/offline use, you can configure Ollama with a local model and set OLLAMA_BASE_URL accordingly.
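For the local/offline setup mentioned in the last tip, a minimal environment sketch, assuming a default Ollama install listening on port 11434 (the model name here is only an example):

```shell
# Point Memex at a locally running Ollama server (default port).
export OLLAMA_BASE_URL="http://localhost:11434"

# Make sure a local model is available first (example model name):
ollama pull llama3

# Then start a Memex session as usual:
memex start claude
```

Because the variable is read from the environment, export it in the same shell (or shell profile) that launches Memex or the MCP server process.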
