hive-memory
Cross-project memory for AI coding agents. MCP server that maintains context, decisions, and knowledge across workspaces. Fully local.
claude mcp add --transport stdio moonx010-hive-memory hive-memory \
  --env CORTEX_DATA_DIR="~/.cortex" \
  --env CORTEX_LOCAL_SYNC="true" \
  --env CORTEX_LOCAL_FILENAME=".cortex.md"
Environment variables: CORTEX_DATA_DIR sets the data storage directory (default ~/.cortex); CORTEX_LOCAL_SYNC defaults to true, set it to "false" to disable writing .cortex.md into project directories; CORTEX_LOCAL_FILENAME sets the local context filename (default .cortex.md).
How to use
Hive Memory is an MCP server that provides a cross-project memory layer for AI coding agents. It stores decisions, learnings, session progress, and project context in a local knowledge base under ~/.cortex/, enabling agents to resume work across different workspaces and projects. The server acts as a meta-layer above individual tools such as Claude Code, Codex, and other coding assistants, aggregating memories and context so agents can recall prior sessions and apply previous insights across projects.
The memory is organized in a local JSON/Markdown store, with separate subdirectories per project for summaries, memories, and session logs. Tools exposed by Hive Memory include project management (registering, listing, updating, searching, and onboarding projects), memory storage and recall, session progress saving, and group/guide management for shared context. It also supports semantic search over memories with multiple backends (native Rust, JavaScript transformers, or a keyword-only fallback) to help locate relevant memories quickly.
To use Hive Memory, install the npm package globally (npm install -g hive-memory) and configure your MCP clients to point to the hive-memory command. Typical usage involves registering projects with hive-memory, saving memories or decisions as you work, and periodically saving sessions to capture progress. You can access tools such as memory_store, memory_recall, project_register, project_list, and session_save through the MCP interface. If you enable semantic search, you can rely on the integrated embedding backends to find relevant memories across projects, improving recall and continuity for your AI agents.
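To make the tool interface concrete: an MCP client talks to hive-memory over stdio using JSON-RPC 2.0 `tools/call` messages. The sketch below composes such a request for `memory_store` and validates it locally; the argument names (`project`, `content`) are illustrative assumptions, not the package's documented schema.

```shell
#!/bin/sh
# Compose a JSON-RPC 2.0 "tools/call" request such as an MCP client would
# send to hive-memory over stdio. The argument names ("project", "content")
# are illustrative assumptions, not the documented schema.
REQUEST='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory_store",
    "arguments": {
      "project": "my-app",
      "content": "Chose SQLite over Postgres for local-first storage."
    }
  }
}'

# Validate the payload locally; a real client would write it to the
# server's stdin instead (e.g. printf '%s\n' "$REQUEST" | hive-memory).
printf '%s\n' "$REQUEST" | python3 -m json.tool > /dev/null && echo "payload OK"
```

In practice your MCP client handles this framing for you; the sketch is only to show what a tool invocation looks like on the wire.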
How to install
Prerequisites
- Node.js and npm installed on your system
- Access to the command line / shell
Installation steps
- Install Hive Memory globally
npm install -g hive-memory
- Start the Hive Memory server (example)
hive-memory
- Verify installation
hive-memory --version
- Optional: configure environment variables in your MCP config to customize data directory, sync behavior, and filename
Example MCP config snippet:
{
  "mcpServers": {
    "hive-memory": {
      "command": "hive-memory",
      "env": {
        "CORTEX_DATA_DIR": "/custom/path/.cortex",
        "CORTEX_LOCAL_SYNC": "false",
        "CORTEX_LOCAL_FILENAME": ".cortex.md"
      }
    }
  }
}
- Integrate with your MCP client by referencing the hive-memory command as shown in your client’s configuration.
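One low-friction way to apply the config snippet above is to write it to a file and sanity-check it before pointing a client at it. A minimal sketch, assuming only that your client reads a JSON config (the temp-file destination here is illustrative; each MCP client documents its own config location):

```shell
#!/bin/sh
# Write an MCP config registering hive-memory and verify it is valid JSON
# before handing it to a client. The destination path is illustrative.
CONFIG_FILE="$(mktemp)"
cat > "$CONFIG_FILE" <<'EOF'
{
  "mcpServers": {
    "hive-memory": {
      "command": "hive-memory",
      "env": {
        "CORTEX_DATA_DIR": "/custom/path/.cortex",
        "CORTEX_LOCAL_SYNC": "false"
      }
    }
  }
}
EOF

python3 -m json.tool "$CONFIG_FILE" > /dev/null && echo "config OK"
```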
Additional notes
Tips and common issues:
- Data location: The default storage is ~/.cortex. If you move it, update CORTEX_DATA_DIR in your MCP config to point to the new location.
- Local sync: By default, Hive Memory writes a .cortex.md file into each registered project directory. Disable with CORTEX_LOCAL_SYNC=false if you don’t want per-project context files.
- Permissions: If you encounter permission errors writing to the data directory, ensure your user has read/write access to the target folder.
- Semantic search: For best performance, consider enabling the Native (Rust) backend by running the recommended build steps (cd native && npm install && npm run build). If that’s not feasible, you can fall back to the JS transformer backend or the keyword-only option.
- Data format: Memories are stored as plain JSON and Markdown in ~/.cortex/. Ensure you back up this directory if you rely on memory continuity.
- CLI usage: Explore tools like project_register, memory_store, memory_recall, and session_save to manage projects and memories programmatically via MCP clients.
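Since all state lives under one directory, the backup tip above amounts to archiving ~/.cortex (or whatever CORTEX_DATA_DIR points to). A minimal sketch; the function name and backup path are examples, not part of the package:

```shell
#!/bin/sh
# Back up the hive-memory data directory (default ~/.cortex) to a
# timestamped tarball. CORTEX_DATA_DIR is honored if set.
backup_cortex() {
  data_dir="${CORTEX_DATA_DIR:-$HOME/.cortex}"
  backup="${1:-$HOME/cortex-backup-$(date +%Y%m%d).tar.gz}"
  [ -d "$data_dir" ] || { echo "no data directory at $data_dir" >&2; return 1; }
  tar -czf "$backup" -C "$(dirname "$data_dir")" "$(basename "$data_dir")"
  echo "backed up $data_dir to $backup"
}
```

Restoring is the reverse: extract the tarball into the data directory's parent with tar -xzf.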
Related MCP Servers
omega-memory
Persistent memory for AI coding agents
ThinkMem
AI Memory Management MCP System for LLMs: helps LLMs make good use of thinking and memory
zotero-lite
Zotero MCP Lite: Fast, Customizable & Light Zotero MCP server for AI research assistants
cc-session-search
MCP server for searching and analyzing Claude Code conversation history
mcpman
The package manager for MCP servers — install, manage & monitor across Claude Desktop, Cursor, VS Code, Windsurf
simple-notify
MCP server from pintar-team/simple-notify-mcp