Mira
Local MCP server that gives Claude Code persistent context, code intelligence, and background analysis. Runs on your machine; all data is stored locally in SQLite.
claude mcp add --transport stdio conarylabs-mira npx -y conarylabs/mira \
  --env MIRA_DB_DIR="Directory for Mira SQLite databases (default: ~/.mira)" \
  --env MIRA_CONFIG_DIR="Path to Mira config directory (default: ~/.mira)"
How to use
Mira is a local MCP server that runs alongside Claude Code and Gemini CLI to provide persistent context and code intelligence. It tracks sessions, indexes your codebase with tree-sitter, and injects relevant context into prompts via lifecycle hooks. With Mira, your decisions, preferences, and code structure persist across sessions in SQLite databases stored locally, and you can access tools like session recap, semantic code search, and goal tracking directly from Claude Code or Gemini. The server talks to Claude Code over stdio and can optionally leverage OpenAI embeddings for enhanced semantic search if you enable them.
Once Mira is installed, you can invoke its tools from within Claude Code or via Gemini CLI to interact with your context: recap a session, search code with semantic queries, manage cross-session goals, view diffs, and surface insights from background analysis. The tooling works offline for local-first workflows, with optional cloud-backed embeddings if you supply an API key. No cloud account is required; all data stays on your machine unless you choose to export or share it.
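After registering the server, you can confirm that Claude Code sees it; a minimal sketch using the Claude Code CLI (the server name conarylabs-mira comes from the install command above, and the output format depends on your Claude Code version):

```shell
# List the MCP servers registered with Claude Code.
# Mira should appear under the name used at registration time.
claude mcp list
```

If Mira does not appear in the list, re-run the claude mcp add command with the correct command path and environment variables.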
How to install
Prerequisites:
- A working Claude Code installation or Gemini CLI access to run MCP servers over stdio.
- A supported environment for running the binaries (Linux or macOS; on Windows, via WSL).
Install methods:
- Using the prebuilt binary via the MCP wrapper (recommended):
  - Ensure Node.js and npm are installed.
  - Register Mira with Claude Code using the claude mcp add command shown above; npx -y conarylabs/mira fetches and runs the server on demand. Alternatively, install the CLI globally: npm install -g mira-cli
  - Run setup and start Mira: mira plugin install mira, then mira setup
- Build from source (if you prefer to compile locally):
  - Prerequisites: the Rust toolchain (cargo) and Git.
  - Clone the repository: git clone https://github.com/ConaryLabs/Mira.git
  - Build the release binary: cd Mira && cargo build --release
  - Run the binary locally (adjust paths if needed): ./target/release/mira
- Alternative methods mentioned in the project docs:
  - cargo install mira
  - Download a binary manually from the releases page and place it in your PATH
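Before choosing an install method, it can help to see which prerequisites are already present; a minimal sketch (the tool list is an assumption based on the methods above: git and cargo for building from source, node and npm for the npx wrapper):

```shell
# Report which prerequisites for the install methods above are available.
for tool in git cargo node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done
```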
Configuration guidance:
- Mira stores data in ~/.mira by default. You can override this with MIRA_DB_DIR and related environment variables. Ensure the directories exist and are writable by your user.
- If you enable OpenAI embeddings for semantic search, provide your OpenAI API key via the standard OPENAI_API_KEY env var or your preferred provider configuration as outlined in the docs.
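The variables above can be set in your shell profile before launching Claude Code; a minimal sketch (only the variable names come from the docs; using the default ~/.mira location for both is an assumption):

```shell
# Choose where Mira keeps its SQLite databases and config (defaults: ~/.mira).
export MIRA_DB_DIR="$HOME/.mira"
export MIRA_CONFIG_DIR="$HOME/.mira"

# Make sure the directory exists and is writable before the first run.
mkdir -p "$MIRA_DB_DIR"
test -w "$MIRA_DB_DIR" && echo "Mira data dir is writable"

# Optional: enable OpenAI embeddings for semantic search.
# export OPENAI_API_KEY="your-key-here"
```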
Additional notes
Tips and common issues:
- Mira relies on two SQLite databases: one for sessions/goals/memories and one for the code index. Ensure there is sufficient disk space for indexing large codebases.
- If you see permission errors, verify that the Mira data directory is writable and not mounted read-only by your environment.
- For semantic search without API keys, Mira falls back to keyword-based search. Enabling embeddings requires an OpenAI API key or equivalent provider configuration.
- When integrating with Claude Code or Gemini, ensure the MCP client can launch Mira over stdio. A misconfigured command path or environment variable can prevent the MCP handshake from completing.
- You can customize environment variables for profiling, logging, or directory locations as described in the documentation under Configuration.
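The permission and disk-space issues above can be checked quickly; a hedged sketch (it assumes the default ~/.mira location unless MIRA_DB_DIR is set):

```shell
# Resolve the Mira data directory (default: ~/.mira).
MIRA_DIR="${MIRA_DB_DIR:-$HOME/.mira}"

# Permission check: the data directory must exist and be writable.
mkdir -p "$MIRA_DIR"
if [ -w "$MIRA_DIR" ]; then
  echo "ok: $MIRA_DIR is writable"
else
  echo "error: $MIRA_DIR is not writable" >&2
fi

# Disk-space check: indexing large codebases needs headroom.
df -h "$MIRA_DIR"
```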
Related MCP Servers
mem0-selfhosted
Self-hosted mem0 MCP server for Claude Code. Run a complete memory server against self-hosted Qdrant + Neo4j + Ollama while using Claude as the main LLM.
spec-kit
MCP server enabling AI assistants to use GitHub's spec-kit methodology
claude-vigil
🏺 An MCP server for checkpointing and file recovery in Claude Code
create-kit
Scaffold a production-ready Model Context Protocol (MCP) server in seconds.
CogniLayer
Persistent memory for Claude Code & Codex CLI — save ~100K tokens/session. 13 MCP tools, hybrid search, TUI dashboard, crash recovery. Your AI finally remembers.
bindly-claude-code
Knowledge completion layer for Claude Code - finish your thoughts and make them reusable across sessions and agents