nocturne_memory
A lightweight, rollback-capable, visual **external MCP memory store for AI**, built on URIs rather than RAG. It gives your AI persistent, structured memory that carries across models, sessions, and tools.
claude mcp add --transport stdio dataojitori-nocturne_memory python backend/mcp_server.py \
  --env DATABASE_URL="sqlite+aiosqlite:///C:/path/to/nocturne_memory/demo.db" \
  --env VALID_DOMAINS="core,writer,game,notes" \
  --env CORE_MEMORY_URIS="core://agent,core://my_user,core://agent/my_user"
How to use
Nocturne Memory is an MCP-based long-term memory server for AI agents. It provides a persistent, hierarchical memory store with an API and a dashboard that let agents read, create, update, and organize memories through a URI-driven topology.

Core concepts include an identity layer (system boot memories), an accessible memory graph (core://, writer://, game://, etc.), and a snapshot-based audit trail for human supervision. Agents interact with the server through the MCP interface exposed by the Python backend, enabling structured memory management beyond simple vector stores.

Available tools cover reads, writes (create/update/delete), and path-based navigation, plus a management dashboard for auditing changes and rolling back snapshots when needed. The system supports automatic boot-loading of core memories on startup and conditionally-disclosed memory routing, which injects memories based on context.

To use it, start the backend MCP server, connect your agent client (a Claude/Cursor/OpenAI-style MCP client) with the nocturne_memory server configuration, then begin by invoking system://boot to awaken the agent with its core memories.
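The URI topology and the boot flow described above can be sketched with a small in-memory model. Everything here is illustrative — the `memories` dict, its contents, and the helper names are not the server's real schema or API:

```python
from urllib.parse import urlsplit

# Illustrative memory graph keyed by URI (hypothetical data).
# Each URI follows the domain://path pattern, e.g. core://agent.
memories = {
    "core://agent": "I am Nocturne, a persistent assistant.",
    "core://my_user": "My user prefers concise answers.",
    "writer://drafts/chapter_1": "Opening scene outline...",
}

def domain_of(uri: str) -> str:
    """Return the domain (URI scheme) of a memory URI, e.g. 'core'."""
    return urlsplit(uri).scheme

def boot(core_uris: list[str]) -> list[str]:
    """Mimic system://boot: collect the core memories injected at startup."""
    return [memories[u] for u in core_uris if u in memories]

print(boot(["core://agent", "core://my_user"]))
```

The point of the sketch is that domains (`core`, `writer`, …) partition the graph into namespaces, and booting is just a bulk read of a configured URI list.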
How to install
Prerequisites:
- Python 3.10+ installed on your machine
- Git installed
- Optional: a database path accessible by your user (absolute path recommended)
Step-by-step:
- Clone the repository:
  `git clone https://github.com/Dataojitori/nocturne_memory.git`
  `cd nocturne_memory`
- Install backend dependencies:
  `pip install -r backend/requirements.txt`
- Prepare the environment configuration:
  `cp .env.example .env`
  Edit .env as needed, or pass the same variables explicitly in your MCP config. Example for Windows/Linux (the DATABASE_URL path must be absolute):
  `DATABASE_URL=sqlite+aiosqlite:///C:/path/to/nocturne_memory/demo.db`
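Getting the absolute sqlite+aiosqlite URL right is easy to trip over, so here is a small helper (our own, not part of the project) that derives one from any filesystem path:

```python
from pathlib import Path

def sqlite_url(db_path: str) -> str:
    """Build an absolute sqlite+aiosqlite URL from a filesystem path.

    SQLAlchemy-style sqlite URLs put three slashes before the path, so
    /data/demo.db becomes sqlite+aiosqlite:////data/demo.db on Linux,
    and C:/path/demo.db becomes sqlite+aiosqlite:///C:/path/demo.db
    on Windows.
    """
    return "sqlite+aiosqlite:///" + Path(db_path).resolve().as_posix()

# A relative path is resolved against the current working directory,
# which guards against the server and frontend each creating their own DB.
print(sqlite_url("demo.db"))
```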
- Start the server:
  `uvicorn backend.main:app --reload --port 8000` (the FastAPI app serving the HTTP API)
  For the MCP server itself, run the Python entrypoint (`python backend/mcp_server.py`) as configured in your environment.
- Connect an MCP client. Give the client a configuration that points at the nocturne_memory server, e.g.:
    {
      "mcpServers": {
        "nocturne_memory": {
          "command": "python",
          "args": ["backend/mcp_server.py"]
        }
      }
    }
- Initialize core memories. Send a request to read system://boot, then write or adjust the core://agent and core://my_user memories as needed.
Notes:
- Ensure the DATABASE_URL path is absolute to avoid multiple databases being created for the server and the web frontend.
- If you plan to run in a virtual environment, point the MCP config to the virtualenv’s Python executable and module path.
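For the virtualenv case, the MCP config simply swaps the interpreter for the virtualenv's own. The `.venv` location and `/path/to` placeholders below are assumptions about your layout:

```json
{
  "mcpServers": {
    "nocturne_memory": {
      "command": "/path/to/nocturne_memory/.venv/bin/python",
      "args": ["/path/to/nocturne_memory/backend/mcp_server.py"]
    }
  }
}
```

Using absolute paths for both the interpreter and the script also sidesteps working-directory surprises when the MCP client launches the server.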
Additional notes
Tips and common issues:
- Use absolute database paths in DATABASE_URL to avoid data fragmentation between the MCP server and the web backend.
- The VALID_DOMAINS setting controls the namespaces in which the agent can create memories; add domains like work or research if needed.
- CORE_MEMORY_URIS defines which memories are loaded on startup via system://boot. Update them to seed the agent with identity and relationships.
- If you see permission or path errors, verify that the user running the MCP server has read/write access to the database file and the path in DATABASE_URL.
- The system uses a two-layer architecture (backend API + MCP server). You can run the frontend dashboard separately to visualize and audit memories.
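The two comma-separated settings above can be exercised with a short sketch. The parsing mirrors their documented format; `is_allowed` is our own illustrative helper, not a server API:

```python
import os

# README defaults; in production these come from the MCP config's --env flags.
os.environ.setdefault("VALID_DOMAINS", "core,writer,game,notes")
os.environ.setdefault("CORE_MEMORY_URIS", "core://agent,core://my_user")

valid_domains = {d.strip() for d in os.environ["VALID_DOMAINS"].split(",") if d.strip()}
core_uris = [u.strip() for u in os.environ["CORE_MEMORY_URIS"].split(",") if u.strip()]

def is_allowed(uri: str) -> bool:
    """Return True if the URI's domain is in VALID_DOMAINS (illustrative)."""
    scheme, sep, _rest = uri.partition("://")
    return bool(sep) and scheme in valid_domains

print(is_allowed("core://agent"))    # 'core' is a whitelisted domain
print(is_allowed("secret://notes"))  # unknown domain is rejected
```

Adding `work` or `research` to VALID_DOMAINS is then just an edit to one environment variable; no code change is needed.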