claude_codex_gemini_mem
Cross-window / cross-session AI memory system for Claude Code, Gemini, and Codex
```shell
claude mcp add --transport stdio changgely-claude_codex_gemini_mem npx tsx src/servers/mcp-server.ts \
  --env PORT="37777" \
  --env LOG_LEVEL="info"
```
How to use
Claude Codex Gemini Mem provides a persistent semantic memory layer that bridges Claude Code, Gemini, and Codex workflows. The MCP server hosts a memory/search service that can be queried to retrieve past tool outputs, decisions, and session context across windows and sessions. Once running, you can use the included memory search interface to perform natural-language queries like “show me decisions from last week about feature X” or “summarize the latest tool outputs for project Y.” It also enables cross-platform memory injection, allowing you to retrieve and reuse relevant context from one tool or session in another.
To interact with the server, run it using the recommended startup command (via npx tsx) and then access the memory web dashboard at http://localhost:37777 to monitor and manage memory nodes, recalls, and privacy tags. The server supports a privacy feature to wrap sensitive data in <private> tags so it won’t be stored in memory. This MCP server is designed to work with Claude Code, Gemini CLI, and Codex CLI integrations, enabling unified context preservation and cross-session continuity across these platforms.
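For example, a prompt that includes a secret can be wrapped in the privacy tags described above so the value is excluded from memory storage (the token here is a placeholder, not a real credential):

```
Deploy with token <private>EXAMPLE_TOKEN_VALUE</private> and report the result.
```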
How to install
Prerequisites:
- Node.js v18 or newer
- Bun (recommended for worker execution performance)
- Access to the project repository or a clone of claude_codex_gemini_mem
Steps:
1. Install dependencies (from the project root):
   - If using npm: `npm install`
   - If using Bun (recommended): `bun install`
2. Start the Memory Engine worker (optional but recommended): `bun plugin/scripts/worker-service.cjs`. This starts the central worker that the MCP server will use.
3. Run the MCP server: `npx tsx src/servers/mcp-server.ts`
4. Verify the server is running by visiting the web dashboard: http://localhost:37777
5. If needed, configure environment variables (see Additional notes) and re-run the server.
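The dashboard verification step above can also be scripted. A minimal sketch, assuming `curl` is available and the server is on the default port:

```shell
#!/usr/bin/env bash
# Check whether the memory web dashboard responds on the default port.
# The URL matches the default PORT (37777) documented in this guide.
URL="http://localhost:37777"
if curl -fsS --max-time 2 "$URL" >/dev/null 2>&1; then
  echo "dashboard reachable at $URL"
else
  echo "dashboard not reachable at $URL; is the MCP server running?"
fi
```

Either branch exits cleanly, so the check is safe to run before or after starting the server.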
Notes:
- The server is designed to work in conjunction with Claude Code, Gemini CLI, and Codex CLI integrations.
- If you prefer Docker or another runtime, you can adapt the mcp_config accordingly (see JSON spec in this document).
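As a rough sketch, an equivalent stdio entry in an MCP client's JSON configuration might look like the following (field names follow the common `mcpServers` convention; adapt the command, paths, and runtime to your setup):

```json
{
  "mcpServers": {
    "changgely-claude_codex_gemini_mem": {
      "command": "npx",
      "args": ["tsx", "src/servers/mcp-server.ts"],
      "env": {
        "PORT": "37777",
        "LOG_LEVEL": "info"
      }
    }
  }
}
```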
Additional notes
Tips and common issues:
- Dashboard: Use the Web UI at http://localhost:37777 to inspect memory nodes, recalls, and privacy tags.
- Privacy: Wrap sensitive content in <private>...</private> to exclude it from memory storage.
- If the server starts but the dashboard is unavailable, ensure your port (default 37777) isn’t used by another process.
- Environment variables you may configure:
  - PORT: port for the web dashboard (default 37777)
  - LOG_LEVEL: logging verbosity (e.g., debug, info, warn, error)
- For debugging, run the MCP server with the environment variable LOG_LEVEL=debug to get more verbose output.
- This MCP server is designed to function across Claude Code, Gemini CLI, and Codex CLI ecosystems, enabling cross-session continuity and unified context management.
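The port-conflict tip above can be checked quickly from a bash shell. This sketch uses bash's `/dev/tcp` feature (a bashism, so it won't work in plain `sh`):

```shell
#!/usr/bin/env bash
# Report whether the default dashboard port (37777) is already in use locally.
# A successful /dev/tcp connection means some process is listening on it.
PORT=37777
if (exec 3<>"/dev/tcp/127.0.0.1/$PORT") 2>/dev/null; then
  echo "port $PORT is already in use; set a different PORT or stop the other process"
else
  echo "port $PORT appears free"
fi
```

If the port is taken, start the server with a different PORT value and open the dashboard at that port instead.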
Related MCP Servers
systemprompt-code-orchestrator
MCP server for orchestrating AI coding agents (Claude Code CLI & Gemini CLI). Features task management, process execution, Git integration, and dynamic resource discovery. Full TypeScript implementation with Docker support and Cloudflare Tunnel integration.
mcp-memory-keeper
MCP server for persistent context management in AI coding assistants
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
cco
Real-time audit and approval system for Claude Code tool calls.
docmole
Dig through any documentation with AI - MCP server for Claude, Cursor, and other AI assistants
mcp-install-instructions-generator
Generate MCP Server Installation Instructions for Cursor, Visual Studio Code, Claude Code, Claude Desktop, Windsurf, ChatGPT, Gemini CLI and more