
myBrAIn

myBrAIn is an MCP (Model Context Protocol) server designed to provide persistent, contextual memory to language models (such as Google Antigravity). It acts as a "second brain" for your development environment, allowing the AI to remember project rules, architectural decisions, and technical insights across different chat sessions.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio lilium360-mybrain python /ABSOLUTE/PATH/TO/mybrain/server.py \
  --env MYBRAIN_DATA_DIR="Path to the data directory used by myBrAIn (absolute path)"

How to use

myBrAIn is a Python-based MCP server that serves as a persistent, contextual memory layer (a "second brain") for language models and IDE integrations. It stores architectural rules, project-specific patterns, and insights gathered from your codebase, enabling consistent recall across sessions and chat interactions. The Admin UI (Streamlit) provides a visual interface to explore memories, manage exports, and inspect knowledge graphs, while the Observer runs in the background to detect drift in your codebase. Use the integration prompts and the recall/remember workflow to ensure the AI leverages stored project rules instead of relying solely on short-term context.

To connect your environment, configure the mybrain MCP server in your mcpServers array (e.g., via Docker or Python) and point your client (IDE or Antigravity) at the server endpoint. Available tools include recall_context, store_insight, and audit_codebase, which actively maintain and enrich the brain. The Admin UI offers memory export/import, knowledge graphs, and a Silent Observer dashboard for monitoring changes in the codebase. Integrating myBrAIn with your project onboarding workflow helps the AI capture architectural DNA and remember project-specific conventions across sessions.

How to install

Prerequisites:

  • Python 3.10+ installed on your system
  • Git available to clone the repository (optional but recommended)
  • Optional: Docker and Docker Compose for containerized deployment

Step-by-step installation (local Python):

  1. Clone the repository and enter it:
     git clone https://github.com/lilium360/myBrAIn.git
     cd myBrAIn
  2. Create and activate a virtual environment:

    Windows

    python -m venv venv && .\venv\Scripts\activate

    Unix

    python3 -m venv venv && source venv/bin/activate
  3. Install dependencies: pip install -r requirements.txt
  4. Prepare environment (optional but recommended):
    • Copy the example env and adjust settings as needed: cp .env.example .env
  5. Run the server locally (Python): python server.py
  6. Optional: Docker deployment
    • Ensure Docker and Docker Compose are installed
    • Use docker-compose up -d to start the services as described in the Docker deployment guide in the README
  7. Access the Admin UI (if using Docker or Streamlit): Streamlit prints its local URL in the console on startup.

Note: If you run the Python server directly, provide the absolute path to server.py in your mcp_config.
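For reference, a stdio-based mcp_config entry might look like the following sketch. The exact schema depends on your MCP client, and every path and env value here is a placeholder you must replace with your own absolute paths:

```json
{
  "mcpServers": {
    "lilium360-mybrain": {
      "command": "python",
      "args": ["/ABSOLUTE/PATH/TO/mybrain/server.py"],
      "env": {
        "MYBRAIN_DATA_DIR": "/ABSOLUTE/PATH/TO/mybrain-data"
      }
    }
  }
}
```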

Additional notes

Tips and common issues:

  • Ensure you use absolute paths in the mcp_config args (the guidance in the README emphasizes absolute paths).
  • If using Docker, set MYBRAIN_DATA_DIR or similar data paths as needed by the container to persist memories.
  • The Admin UI requires Streamlit; ensure dependencies are installed if you run the admin dashboard locally.
  • After onboarding a project, remember to use recall_context to fetch project rules before reasoning, as described in the Integration Protocol.
  • Regularly export memory dumps for backup and cross-project reuse using the Admin Dashboard features.
  • If drift is detected, use audit_codebase to inspect and resolve inconsistencies between stored rules and the current codebase.
