brain
Your AI has amnesia. Persistent memory and cognitive context for AI. 25 MCP tools. 12ms recall.
claude mcp add --transport stdio mordechaipotash-brain-mcp pipx run brain-mcp
How to use
brain-mcp is a Python-based MCP server that searches your on-device conversation history and reconstructs your cognitive state from it. Storage is fully local: DuckDB handles full-text search and LanceDB stores vector embeddings. The server exposes 25 tools for analyzing, summarizing, and synthesizing insights across your chats. After installing, run the brain-mcp CLI to initialize, ingest conversations, and set up integrations with your AI assistants (e.g., Claude Desktop/Code). Once configured, you can simply tell your AI (Claude, Cursor, Windsurf) to use brain, and it will search your local conversations and context in real time to answer questions, surface open threads, or reconstruct decisions and open questions. The tools are grouped into five categories, including Cognitive Prosthetic, Search, and Synthesis, each optimized for fast local retrieval and reasoning with no cloud dependencies.
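Under the hood, MCP clients talk to a stdio server like brain-mcp using JSON-RPC 2.0. A minimal sketch of the first two messages a client sends after launching the server process ("initialize" and "tools/list" are standard MCP method names; the clientInfo values here are illustrative, not brain-mcp specifics):

```python
import json

def jsonrpc(method, params, msg_id):
    """Frame a JSON-RPC 2.0 request as a single line of JSON."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": msg_id,
        "method": method,
        "params": params,
    })

# First message: negotiate protocol version and capabilities.
init = jsonrpc("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
}, msg_id=1)

# Second message: ask the server to enumerate its tools
# (for brain-mcp, this is where the 25 tools are advertised).
list_tools = jsonrpc("tools/list", {}, msg_id=2)

print(init)
print(list_tools)
```

Each request is written to the server's stdin as one line; the client reads JSON-RPC responses from its stdout. This is what `claude mcp add --transport stdio` wires up for you.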
How to install
Prerequisites:
- Python 3.11 or later
- pipx (or pip) installed and on your PATH
- Optional: a local embedding model (nomic-v1.5) for semantic search
Installation steps (preferred, isolated env):
- Install brain-mcp via pipx (creates an isolated environment and puts the CLI on PATH):
pipx install brain-mcp # recommended (isolated env, on your PATH)
- Initialize and ingest your conversations:
brain-mcp init # discover your conversations
brain-mcp ingest # import them (fast, no GPU)
- Optional: configure Claude integration (Desktop/Code) and enable embedding (semantic search):
brain-mcp setup claude # auto-detect: configures both Desktop + Code
brain-mcp setup claude-desktop # Claude Desktop only
brain-mcp setup claude-code # Claude Code only
# Optional: semantic search (requires embedding model)
pipx inject brain-mcp sentence-transformers einops
brain-mcp embed
If you prefer a direct pip installation (note that, unlike pipx, this does not give brain-mcp its own isolated environment):
pip install brain-mcp
brain-mcp init && brain-mcp ingest
brain-mcp setup claude
Note: If you install in a virtual environment, ensure brain-mcp is on your PATH. pipx handles this automatically.
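After `brain-mcp ingest`, conversations are indexed for keyword recall via DuckDB full-text search. The idea is easy to illustrate with SQLite's FTS5 as a stdlib stand-in (the real schema is internal to brain-mcp; the `conversations` table and sample rows below are hypothetical):

```python
import sqlite3

# Build an in-memory full-text index over a few fake conversation rows.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE conversations USING fts5(title, body)")
con.executemany(
    "INSERT INTO conversations VALUES (?, ?)",
    [
        ("auth design", "We decided to use JWT refresh tokens."),
        ("db choice", "Open question: DuckDB vs SQLite for the index."),
        ("standup", "Next step: wire up the embedding pipeline."),
    ],
)

# BM25-ranked keyword recall, roughly what happens when you ask
# "what did we decide about tokens?"
rows = con.execute(
    "SELECT title FROM conversations WHERE conversations MATCH ? "
    "ORDER BY rank",
    ("tokens",),
).fetchall()
print(rows)  # [('auth design',)]
```

This keyword layer is what you get with the base install; semantic (embedding-based) search is the optional extra described below.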
Additional notes
Tips and gotchas:
- Semantic search is optional and downloads a ~1.5 GB embedding model on first use. You can skip it and rely on keyword search.
- Data stays on your machine. Embeddings and indices are stored locally (DuckDB + LanceDB).
- If you run in a virtualenv, ensure brain-mcp is on PATH so Claude Desktop/Code can invoke the CLI.
- You can tailor setup per client via brain-mcp setup commands (claude, claude-desktop, claude-code, cursor, windsurf).
- The MCP aims to reconstruct cognitive state: open questions, decisions, next steps, and the cost of changing focus, not just surface facts.
- With pipx, the CLI is available as brain-mcp after installation; if you are developing, you can also run it from a git checkout or local source.
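The difference between keyword and semantic recall comes down to ranking by embedding similarity instead of exact term matches. A toy cosine-similarity ranking over hand-made 3-d vectors (purely illustrative; brain-mcp's real vectors are nomic-v1.5 embeddings stored in LanceDB):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical document embeddings (real ones are high-dimensional).
docs = {
    "we chose JWT refresh tokens": [0.9, 0.1, 0.0],
    "lunch options near the office": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # embedding of "what auth decision did we make?"

# Rank documents by similarity to the query vector.
best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)  # we chose JWT refresh tokens
```

Note that the winning document shares no keywords with the query; that overlap in meaning, not wording, is what the optional embedding model buys you.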
Related MCP Servers
mcp-memory-libsql
🧠 High-performance persistent memory system for Model Context Protocol (MCP) powered by libSQL. Features vector search, semantic knowledge storage, and efficient relationship management - perfect for AI agents and knowledge graph applications.
python-notebook
Lightweight Python Notebook MCP - Enable AI assistants to create, edit, and view Jupyter notebooks via Model Context Protocol
mie
Persistent memory graph for AI agents. Facts, decisions, entities, and relationships that survive across sessions, tools, and providers. MCP server — works with Claude, Cursor, ChatGPT, and any MCP client.
cadre-ai
Your AI agent squad for Claude Code. 17 specialized agents, persistent memory, desktop automation, and a common sense engine.
gmail
A robust Model Context Protocol server for Gmail integration with intelligent authentication and comprehensive email operations
mcp-tidy
CLI tool to visualize and manage MCP server configurations in Claude Code. List servers, analyze usage statistics, and clean up unused servers