MoltBrain
Long-term memory layer for OpenClaw & MoltBook agents that learns and recalls your project context automatically.
claude mcp add --transport stdio nhevers-moltbrain npx -y moltbrain \
  --env MOLTBRAIN_WORKER_PORT="37777" \
  --env MOLTBRAIN_CONTEXT_OBSERVATIONS="50"
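For setups that register MCP servers through a JSON config file rather than the CLI, the same registration can be sketched roughly as below. This is an assumption based on the common Claude Code `.mcp.json` layout (`mcpServers` / `command` / `args` / `env` keys); verify the exact schema against your client's documentation.

```json
{
  "mcpServers": {
    "nhevers-moltbrain": {
      "command": "npx",
      "args": ["-y", "moltbrain"],
      "env": {
        "MOLTBRAIN_WORKER_PORT": "37777",
        "MOLTBRAIN_CONTEXT_OBSERVATIONS": "50"
      }
    }
  }
}
```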
How to use
MoltBrain is a long-term memory layer designed to work with OpenClaw, MoltBook and Claude Code, enabling automatic context capture, memory recall, and semantic search across your projects. Once started, MoltBrain runs a worker service locally and exposes a web-based viewer at localhost:37777 where you can browse observations, run semantic searches, view and manage tags, and access the export and theme tooling. The system captures discoveries, decisions, and code interactions, then indexes them for fast retrieval. Use the web viewer to inspect timeline history, search context, and view results from the embedded vector store.
In practice, MoltBrain provides several capabilities: auto-capture of relevant session data and tool outputs, semantic search across your stored observations, and a web UI for navigating history, filtering with tags, and bookmarking important notes. You can export data in JSON, CSV, or Markdown with templates, and customize the experience with themes and keyboard shortcuts for rapid navigation. This makes it easier to recall prior configurations, code snippets, and integration decisions when working with Claude Code, OpenClaw, or MoltBook.
How to install
Prerequisites:
- Node.js (LTS, e.g., 18.x or newer)
- npm or yarn
- Internet access to fetch the MoltBrain package
Installation steps:
- Ensure Node.js and npm are installed:
- Debian/Ubuntu: curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash - && sudo apt-get install -y nodejs
- macOS: install via Homebrew (brew install node) or use the installer from https://nodejs.org/
- Windows: install Node.js from https://nodejs.org/
- Install MoltBrain via npx (no global install required):
- Run: npx -y moltbrain
- This pulls the moltbrain package and starts the worker service automatically.
- (Optional) If you prefer a persistent global install during development:
- npm install -g moltbrain
- Then start with: moltbrain
- Configure runtime port and settings (recommended):
- The worker listens on port 37777 by default. Override this with the MOLTBRAIN_WORKER_PORT environment variable, as in the claude mcp add command above.
- Verify the web UI:
- Open http://localhost:37777 in your browser to access the MoltBrain web viewer.
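The verification step can also be scripted. The sketch below assumes the worker was already started and that curl is available; it probes the documented default port 37777, respecting a MOLTBRAIN_WORKER_PORT override if one is set.

```shell
#!/bin/sh
# Probe the MoltBrain web viewer. Assumes the worker is already running;
# 37777 is the documented default port for the viewer.
PORT="${MOLTBRAIN_WORKER_PORT:-37777}"
if curl -fsS --max-time 2 "http://localhost:$PORT/" >/dev/null 2>&1; then
  echo "MoltBrain viewer is responding on port $PORT"
else
  echo "No response on port $PORT; is the worker running?"
fi
```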
Additional notes
Tips and common issues:
- Port conflicts: If 37777 is in use, set MOLTBRAIN_WORKER_PORT to another free port in the environment before starting.
- Environment defaults: MOLTBRAIN_CONTEXT_OBSERVATIONS controls how many observations are kept in memory for quick access; adjust as needed for your workspace size.
- Data persistence: MoltBrain stores observations and summaries locally; ensure writable storage in your environment. Consider backing up the storage if you rely on long-term context.
- Compatibility: MoltBrain is designed to work with OpenClaw, MoltBook, and Claude Code. Ensure your tool integrations send contextual data to MoltBrain so it can be captured and recalled.
- Updates: running via npx fetches the package on demand; if you maintain a global install for development, update it with npm update -g moltbrain.
- Security: If exposing the web viewer beyond localhost, secure it appropriately (TLS, auth, and access controls).
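The port-conflict case above can be handled mechanically by probing for a free port before starting the worker. This sketch uses bash's /dev/tcp to test whether anything is already listening; MOLTBRAIN_WORKER_PORT is the documented override, while the incrementing scan is just one arbitrary strategy.

```shell
#!/bin/bash
# Find a free port starting from the default 37777 and export it so the
# MoltBrain worker binds there (MOLTBRAIN_WORKER_PORT is the documented override).
PORT=37777
# A successful /dev/tcp connect means something is already listening on PORT.
while (exec 3<>"/dev/tcp/127.0.0.1/$PORT") 2>/dev/null; do
  PORT=$((PORT + 1))
done
export MOLTBRAIN_WORKER_PORT="$PORT"
echo "worker port: $MOLTBRAIN_WORKER_PORT"
```

Run this in the same shell session (or source it) before launching the worker so the exported variable is visible to the moltbrain process.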
Related MCP Servers
cui
A web UI for Claude Code agents
penpot
Penpot's official MCP Server
recall
Persistent cross-session memory for Claude & AI agents. Self-host on Redis/Valkey, or use the managed SaaS at recallmcp.com.
mobile
A Model Context Protocol (MCP) server that provides mobile automation capabilities.
vrchat
This project is a Model Context Protocol (MCP) server for interacting with the VRChat API.
architect
A powerful, self-extending MCP server for dynamic AI tool orchestration. Features sandboxed JS execution, capability-based security, automated rate limiting, marketplace integration, and a built-in monitoring dashboard. Built for the Model Context Protocol (MCP).