
cogmemai

28 MCP tools that give AI coding assistants persistent memory across sessions. Works with Claude Code, Cursor, Windsurf, Cline, and Continue. Cloud memory, well suited to teams.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio hifriendbot-cogmemai-mcp \
  --env COGMEMAI_API_KEY="cm_your_api_key_here" \
  -- npx -y cogmemai-mcp

How to use

CogmemAi is an MCP memory server that provides a cloud-backed memory layer for your AI coding assistant workflows. It enables persistent, structured memories across sessions and integrates with Claude Code, Cursor, Windsurf, Cline, Continue, and any MCP-compatible tool. Setup via the MCP command installs and configures automatic context recovery, memory ingestion from documents, and tools for semantic search, deduplication, and privacy controls. After setup, launch the server through the MCP interface and connect your editor or toolchain to start saving and recalling memories as you code. The server acts as a thin HTTP client that coordinates with CogmemAi's cloud memory, so there are no local databases to manage.
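Under the hood, MCP clients talk to a stdio server like this one using JSON-RPC 2.0. As a sketch, a `tools/call` request to a memory tool might look like the following (the tool name `save_memory` and its arguments are illustrative assumptions, not taken from CogmemAi's actual tool list):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "save_memory",
    "arguments": {
      "content": "Project uses pnpm workspaces; run tests with pnpm test",
      "tags": ["build", "conventions"]
    }
  }
}
```

Your editor constructs messages like this for you; you normally never write them by hand.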

Typical workflows include running the setup wizard to authenticate with your CogmemAi API key, adding your MCP server to your editor configuration (e.g., Cursor, Windsurf, Cline, or Continue), and then using the provided commands to verify connectivity, inspect usage, or update paths. Memory ingestion can occur automatically from READMEs and project documents, and you can rely on features like semantic search, session replay, and task tracking to keep context aligned across sessions.
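As a concrete example of the editor-configuration step, a per-project Cursor setup might look like the sketch below, saved as `.cursor/mcp.json` in the project root (the file location follows Cursor's MCP documentation; the key value is a placeholder to replace with your own):

```json
{
  "mcpServers": {
    "cogmemai": {
      "command": "npx",
      "args": ["-y", "cogmemai-mcp"],
      "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
    }
  }
}
```

Windsurf, Cline, and Continue use the same `command`/`args`/`env` shape in their respective MCP config files.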

How to install

Prerequisites:

  • Node.js and npm (npx is provided with npm)
  • Internet access to install the MCP package and fetch dependencies
  • CogmemAi API key (for memory cloud access)

Installation steps:

  1. Ensure Node.js and npm are installed

    • All platforms: node -v && npm -v
  2. Use npx to run the CogmemAi MCP setup without a global install

    • Run the interactive setup:

    npx cogmemai-mcp setup

  3. If you prefer per-project configuration, you can also add the MCP server manually. Example (in your project root, or in ~/.mcp.json):

    {
      "mcpServers": {
        "cogmemai": {
          "command": "npx",
          "args": ["-y", "cogmemai-mcp"],
          "env": { "COGMEMAI_API_KEY": "cm_your_api_key_here" }
        }
      }
    }

  4. Verify installation: npx cogmemai-mcp verify

  5. Optional: add the server to your editor-specific MCP config (examples in the README under Works With).
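The prerequisite checks in step 1 can be scripted. The following POSIX-sh sketch only verifies that the commands exist on PATH; it does not enforce a version, since this guide does not state a minimum Node.js version:

```shell
# Verify prerequisites before running `npx cogmemai-mcp setup`
missing=""
for cmd in node npm npx; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done
if [ -n "$missing" ]; then
  echo "Missing prerequisites:$missing"
else
  echo "All prerequisites found"
fi
```

If anything is reported missing, install Node.js (npm and npx ship with it) before continuing.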

Additional notes

Tips and notes:

  • You’ll typically need a CogmemAi API key to use memory features; supply it via the COGMEMAI_API_KEY environment variable, or set it in a per-project/global config.
  • The MCP setup configures a cloud-backed memory system; there are no local databases required.
  • If you switch editors or models, your memories persist and are available through the same MCP server configuration.
  • Common issues often relate to API key misconfiguration, network access blocks, or mismatched command paths in editor config files. Double-check the command and arguments (command npx with args -y cogmemai-mcp) and ensure env vars are correctly loaded by your editor.
  • The server supports multiple tools and can be extended to other MCP-compatible integrations beyond the listed editors.
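When debugging key misconfiguration, it helps to check the environment your editor actually launches the server with. A minimal sketch (COGMEMAI_API_KEY is the variable named in the install command; the cm_ prefix check is an assumption based on the placeholder shown in this guide):

```shell
# Report whether the API key is visible in the current environment
if [ -z "$COGMEMAI_API_KEY" ]; then
  key_status="missing"
else
  case "$COGMEMAI_API_KEY" in
    cm_*) key_status="set" ;;
    *)    key_status="set, but does not start with cm_" ;;
  esac
fi
echo "COGMEMAI_API_KEY: $key_status"
```

Note that editors often spawn MCP servers with a minimal environment, so a key exported in your shell profile may still be invisible to the server; in that case, set it in the `env` block of the MCP config instead.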
