
CogniLayer

Persistent memory for Claude Code & Codex CLI — save ~100K tokens/session. 13 MCP tools, hybrid search, TUI dashboard, crash recovery. Your AI finally remembers.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio lakyfx-cognilayer \
  --env API_KEY="your-api-key-if-needed" \
  --env LOG_LEVEL="INFO" \
  --env DATABASE_URL="postgresql://user:password@host:5432/dbname" \
  --env MEMORY_DB_URL="redis://localhost:6379/0" \
  -- python -m cognilayer

Note: the --env flags must come before the -- separator; everything after -- is treated as the server command and its arguments.

How to use

CogniLayer is a Python-based MCP server that provides persistent memory, code intelligence, and subagent context tooling for your AI workflows. Once running, it gives agents memory-backed knowledge that survives across sessions, semantic search over stored facts, and code-context insights into dependencies, call graphs, and the likely impact of a refactor. These capabilities are exposed through the MCP toolset: agents can search memory, analyze code structure via tree-sitter-based parsing, and reuse subagent results stored in a central database for faster, targeted queries. Use CogniLayer to reduce token burn during long sessions and to accelerate debugging with precise, context-aware answers.
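The "hybrid search" advertised above typically means merging a keyword score with a vector-similarity score. As a rough, self-contained illustration of that idea — not CogniLayer's actual implementation; every name below is hypothetical — a minimal score-fusion sketch in Python:

```python
import math

def keyword_score(query, doc):
    """Fraction of query terms that appear in the document text."""
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().split())
    return len(q_terms & d_terms) / len(q_terms) if q_terms else 0.0

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_rank(query, query_vec, docs, alpha=0.5):
    """Blend keyword and vector scores; higher alpha favors keywords.
    docs is a list of (text, embedding) pairs."""
    scored = []
    for text, vec in docs:
        score = (alpha * keyword_score(query, text)
                 + (1 - alpha) * cosine(query_vec, vec))
        scored.append((score, text))
    return [text for _, text in sorted(scored, reverse=True)]

docs = [
    ("refactor the auth module", [0.9, 0.1]),
    ("grocery list for the week", [0.1, 0.9]),
]
print(hybrid_rank("auth refactor impact", [0.8, 0.2], docs))
```

Real systems use proper embeddings and BM25 rather than toy vectors and term overlap, but the fusion step — a weighted blend of the two signals, then a sort — is the same shape.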

How to install

Prerequisites:

  • Python 3.11 or newer
  • Git
  • Access to a PostgreSQL-compatible database (or adjust to your preferred METADATA store) and a Redis instance for memory caching

Installation steps:

  1. Clone the repository:

     git clone https://github.com/lakyfx-cognilayer/cognilayer.git
     cd cognilayer

  2. Create and activate a virtual environment (optional but recommended):

     python -m venv venv
     source venv/bin/activate   # On Windows: venv\Scripts\activate

  3. Install dependencies:

     pip install -r requirements.txt

  4. Configure environment variables (example; adapt to your setup):

     export DATABASE_URL="postgresql://user:password@host:5432/dbname"
     export MEMORY_DB_URL="redis://localhost:6379/0"
     export LOG_LEVEL="INFO"

  5. Run the server:

     python -m cognilayer

  6. Verify the server starts without errors. With the stdio transport there is no network port to probe; check the startup logs instead (raise LOG_LEVEL to DEBUG if nothing appears). If your deployment differs, customize the module name and entrypoint to suit.
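The environment variables exported in step 4 are how the server is configured at startup. As a hedged sketch of that pattern — the variable names come from this guide, but the helper below is illustrative, not CogniLayer's actual code — settings might be collected like this:

```python
import os

def load_settings():
    """Collect CogniLayer-style settings from the environment,
    falling back to local-development defaults."""
    return {
        "database_url": os.environ.get(
            "DATABASE_URL", "postgresql://localhost:5432/cognilayer"),
        "memory_db_url": os.environ.get(
            "MEMORY_DB_URL", "redis://localhost:6379/0"),
        "log_level": os.environ.get("LOG_LEVEL", "INFO").upper(),
    }

os.environ["LOG_LEVEL"] = "debug"  # simulate step 4's export
settings = load_settings()
print(settings["log_level"])
```

Because every setting has a default, a bare `python -m cognilayer` can still start against local services; production deployments override all three variables.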

Optional: For production, consider containerization (Docker) or a process manager (systemd, PM2-equivalent for Python) and ensure proper security hardening for database and memory store connections.
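As a starting point for the Docker route, a minimal sketch (the paths and entrypoint are assumptions based on the steps above, not an official image):

```dockerfile
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# DATABASE_URL, MEMORY_DB_URL, and LOG_LEVEL are injected at run time,
# e.g. via `docker run --env-file` or your orchestrator's secret store.
CMD ["python", "-m", "cognilayer"]
```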

Additional notes

Tips and common considerations:

  • Ensure your database and Redis instances are accessible to the CogniLayer process; network ACLs and firewall rules can block startup.
  • Start with LOG_LEVEL=DEBUG while troubleshooting, then drop back to INFO or WARNING in production.
  • If you already host memory and code graphs elsewhere, you can adjust MEMORY_DB_URL and related configs to point to your existing stores.
  • The MCP config example uses a single CogniLayer server; you can extend the mcpServers section to run multiple named instances if needed.
  • When upgrading Python or dependencies, re-run the install step and check for compatibility issues with tree-sitter-based parsing components.
  • Store sensitive keys (API keys, database credentials) in your infrastructure’s secret management rather than hard-coding them in env files.
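Building on the multi-instance tip above, an mcpServers fragment running two named CogniLayer instances might look like this (a sketch; the server names, databases, and Redis DB numbers are placeholders):

```json
{
  "mcpServers": {
    "cognilayer-main": {
      "command": "python",
      "args": ["-m", "cognilayer"],
      "env": {
        "DATABASE_URL": "postgresql://user:password@host:5432/main_db",
        "MEMORY_DB_URL": "redis://localhost:6379/0"
      }
    },
    "cognilayer-scratch": {
      "command": "python",
      "args": ["-m", "cognilayer"],
      "env": {
        "DATABASE_URL": "postgresql://user:password@host:5432/scratch_db",
        "MEMORY_DB_URL": "redis://localhost:6379/1"
      }
    }
  }
}
```

Pointing each instance at its own database and Redis DB keeps their memory stores isolated.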
