
Cognio

Persistent semantic memory server for MCP - Give your AI long-term memory that survives across conversations. Lightweight Python server with SQLite storage and semantic search.

Installation
Run this command in your terminal to add the MCP server to Claude Code. The values shown are the defaults noted in this guide; set AUTOTAG_ENABLED and SUMMARIZATION_ENABLED to true or false as needed.
Command
claude mcp add --transport stdio 0xrelogic-cognio node mcp-server/index.js \
  --env DB_PATH=./data/memory.db \
  --env API_HOST=0.0.0.0 \
  --env API_PORT=8080 \
  --env EMBED_MODEL=all-MiniLM-L6-v2 \
  --env EMBED_DEVICE=cpu \
  --env AUTOTAG_ENABLED=false \
  --env SUMMARIZATION_ENABLED=false

How to use

Cognio is an MCP server that provides persistent semantic memory for AI assistants. It stores memories in a SQLite database and offers semantic search across conversations, memories, and projects. With features such as LEANN vector search, multilingual support, auto-tagging, and export options, Cognio aims to be a robust backend for memory-rich AI workflows. The server exposes a REST API with endpoints for saving, searching, listing, exporting, and summarizing memories, plus a dashboard UI for interactive browsing. MCP tooling is exposed through 11 specialized operations, including save_memory, search_memory, list_memories, and project-scoped operations such as set_active_project and get_active_project, which let you organize and isolate contexts across multiple projects. On startup, Cognio can auto-configure MCP clients and generate usage documentation in cognio.md for your workspace. The web UI is available at /ui and the interactive API docs at /docs. Use these 11 tools to manage memories, run searches, and switch project contexts, enabling semantic retrieval and efficient memory management for AI assistants.
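As a sketch of what a client round-trip against the REST API might look like, the helpers below build a save payload and a search URL. The endpoint paths (/memory, /memory/search) and field names here are illustrative assumptions based on the description above, not the exact Cognio schema; consult /docs on a running server for the real contract.

```python
import json
from urllib.parse import urlencode

BASE_URL = "http://localhost:8080"  # default API host/port from this guide


def build_save_request(content, project, tags=None):
    """Build a (url, json_body) pair for saving a memory.

    NOTE: endpoint path and field names are assumptions for
    illustration, not the documented Cognio API.
    """
    payload = {"content": content, "project": project, "tags": tags or []}
    return f"{BASE_URL}/memory", json.dumps(payload)


def build_search_request(query, project=None, limit=5):
    """Build a semantic-search URL with query parameters (illustrative)."""
    params = {"q": query, "limit": limit}
    if project:
        params["project"] = project  # scope results to one project
    return f"{BASE_URL}/memory/search?{urlencode(params)}"


url, body = build_save_request("Prefer pytest over unittest", "cognio-demo", ["testing"])
print(url)
print(build_search_request("test framework", project="cognio-demo"))
```

Sending these with any HTTP client (curl, requests, fetch) against a running instance would exercise the save/search flow described above.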

How to install

Prerequisites:

  • Git installed on your system
  • Node.js (latest LTS recommended) if you run the MCP server directly
  • Docker and Docker Compose if you prefer container-based deployment (used by the Quick Start)

Option A - Docker Compose (recommended to start quickly):

  1. Clone the repository and navigate to Cognio: git clone https://github.com/0xReLogic/Cognio.git && cd Cognio

  2. Start the server with Docker Compose (as per Quick Start): docker-compose up -d

  3. Open the UI at http://localhost:8080/ui and API docs at http://localhost:8080/docs

Option B - Run Node MCP server directly:

  1. Install dependencies for the MCP server: cd Cognio/mcp-server && npm install

  2. Start the MCP server (adjust environment variables as needed): NODE_ENV=production API_HOST=0.0.0.0 API_PORT=8080 DB_PATH=./data/memory.db EMBED_MODEL=all-MiniLM-L6-v2 npm start

  3. The MCP server will start and expose endpoints at http://0.0.0.0:8080. Use http://localhost:8080/docs for API reference and http://localhost:8080/ui for the dashboard.

In short: Node.js for a direct run, or Docker/Docker Compose for the quickest setup. Ensure you have network access to download dependencies and, if you use auto-tagging or external embeddings, supply the required API keys via environment variables as described in .env.example.
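The environment variables referenced throughout this guide can be collected in a .env file. A minimal sketch mirroring the defaults mentioned here is shown below; copy .env.example from the repository and adjust it rather than treating these values as authoritative.

```shell
# Sketch of a .env file; mirror .env.example from the repository.
DB_PATH=./data/memory.db          # SQLite database location
API_HOST=0.0.0.0                  # bind address for the REST API
API_PORT=8080                     # API port
EMBED_MODEL=all-MiniLM-L6-v2      # embedding model name
EMBED_DEVICE=cpu                  # cpu or gpu
AUTOTAG_ENABLED=false             # enable automatic tagging
SUMMARIZATION_ENABLED=false       # enable summarization features
```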

Additional notes

Tips and common considerations:

  • Environment variables mirror those in .env.example; they control embedding, search behavior, auto-tagging, and summarization. Copy .env.example to .env and customize as needed.
  • If you enable LEANN or advanced summarization, ensure sufficient memory and compute, as these features can be resource-intensive.
  • The Active Project workflow helps to isolate memories per project. Always set an active project when working across multiple workspaces to avoid cross-contamination.
  • The MCP setup script (in mcp-server/scripts/setup-clients.js) can auto-configure support for multiple clients. Running npm run setup from the mcp-server directory will generate MCP configs for supported clients.
  • If you encounter API timeouts or startup errors, check the Docker logs or server logs at the API host/port you configured. Validate that the database path exists and is writable.
  • The REST API supports exporting memories to JSON or Markdown; use /memory/export for data portability or backup.
  • The Cognio UI will auto-detect the API server; it will adapt to localhost, Docker, or remote deployments. Ensure you expose the API host/port correctly in your environment.
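Following the troubleshooting tip about database paths, a quick check like the one below (plain Python, independent of Cognio itself) can confirm that DB_PATH points somewhere SQLite will be able to create and write its file before you start the server.

```python
import os


def db_path_writable(db_path):
    """Return True if the directory for db_path exists (creating it
    if needed) and is writable, so SQLite can open the file there."""
    directory = os.path.dirname(db_path) or "."
    os.makedirs(directory, exist_ok=True)  # mirror the server creating ./data
    return os.access(directory, os.W_OK)


print(db_path_writable("./data/memory.db"))
```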
