memex
Retains Claude's sessions
claude mcp add --transport stdio patravishek-memex npx -y @patravishek/memex \
  --env GROQ_API_KEY="your-groq-api-key" \
  --env GEMINI_API_KEY="your-gemini-api-key" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key" \
  --env LITELLM_API_KEY="your-litellm-api-key" \
  --env LITELLM_BASE_URL="your-litellm-base-url" \
  --env OLLAMA_BASE_URL="http://localhost:11434"
How to use
Memex is a memory layer for AI coding tools. It keeps a project-scoped, persistent memory across sessions and tools, so agents like Claude, Cursor, Copilot, or any local Ollama model can remember decisions, context, and outcomes from prior sessions.

You can start a new memory session for a project with memex start claude and later resume it with memex resume claude, enabling a seamless handoff between days or between different agents. If you prefer other agents or IDE integrations, Memex can connect via MCP to tools like Cursor or Copilot, so the memory stays with your project rather than with a single tool.

The memory is stored locally in .memex/memex.db, and sessions are recorded under .memex/sessions/, enabling quick resumption without repeating past context.
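The start/resume flow described above looks like this in practice (a sketch; the exact console output depends on your setup and provider):

```shell
# Day 1: start a new memory-backed session for this project
memex start claude

# ...work with the agent; at the end of the session, Memex
# compresses the transcript into .memex/memex.db...

# Day 2, or from a different agent: resume with the saved context
memex resume claude
```

Because the memory lives in the project's .memex/ directory rather than in any one tool, the resumed session carries prior decisions and context regardless of which agent you used last.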
How to install
Prerequisites:
- macOS or Linux (or Windows with a compatible shell)
- Node.js 18+ and npm
- An API key for your chosen provider (Anthropic, OpenAI, Groq, Gemini, or a LiteLLM proxy); not required when using a local Ollama model
- Claude CLI installed for Claude-based workflows (optional if using other agents)
Install Memex globally via npm:
# npm
npm install -g @patravishek/memex
# or pnpm/yarn (global install)
pnpm add -g @patravishek/memex
Verify the installation and start a first session:
memex --version
memex start claude
If you plan to connect via MCP to other tools (Cursor, Copilot, etc.), you can configure MCP as described in the mcp_config section and start the Memex server accordingly.
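For tools that read an mcpServers configuration file (such as Cursor's .cursor/mcp.json), the stdio setup from the install command above can be expressed roughly as follows. This is a sketch under the assumption that your tool uses the common mcpServers JSON shape; the env values are placeholders, and you only need the keys for the provider you actually use:

```json
{
  "mcpServers": {
    "patravishek-memex": {
      "command": "npx",
      "args": ["-y", "@patravishek/memex"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "ANTHROPIC_API_KEY": "your-anthropic-api-key"
      }
    }
  }
}
```

Consult your tool's MCP documentation for the exact file location and any tool-specific fields.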
Additional notes
- Memex operates by compressing transcripts into a structured memory at the end of each session. Ensure your shell environment variables for your chosen provider are correctly set (for example, OPENAI_API_KEY or ANTHROPIC_API_KEY).
- Memory is project-scoped. Each project has its own database at .memex/memex.db; do not share databases unless you intend to.
- When switching tools within a project, Memex will preserve the context so the new tool can pick up where the previous one left off.
- If you encounter MCP connectivity issues, ensure your environment variables (for providers or LiteLLM proxies) are exported and accessible to the process running Memex.
- For local/offline use, you can configure Ollama with a local model and set OLLAMA_BASE_URL accordingly.
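As a concrete example of the local/offline tip above (hypothetical values; adjust the URL to wherever your Ollama server is listening):

```shell
# Point Memex at a locally running Ollama server
export OLLAMA_BASE_URL="http://localhost:11434"

# No cloud provider keys are needed for local-only use;
# start your Memex session as usual
memex start claude
```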