
RLM-Memory

A Model Context Protocol (MCP) server that provides AI agents with persistent memory and semantic file discovery.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio jumpino27-rlm-memory-mcp-server node dist/index.js \
  --env UI_PORT="3848" \
  --env GEMINI_API_KEY="your-gemini-api-key"

How to use

RLM Memory MCP Server provides AI agents with persistent memory and a semantic file discovery service. It exposes a set of tools that let agents index, search, and retrieve context from tracked projects and memories, enabling richer interactions without direct file-system access. Core capabilities include initializing memory for a codebase, querying files and past memories, indexing existing codebases, and creating enriched memory entries with metadata to improve semantic search and recall.

The server also exposes a UI for viewing memories and testing tools, and it can be integrated with external agents (Claude, Codex, Gemini, etc.) via the provided MCP configurations. Use the primary bi-directional query tool to ask the MCP server about relevant files and memories, then rely on the memory tooling to refine results and maintain an up-to-date semantic map of your projects.
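As a sketch of what a query looks like on the wire, an MCP client calls a tool with a standard JSON-RPC 2.0 tools/call request. The tool name rlm_query and its arguments below are assumptions for illustration; only rlm_verify_index is named elsewhere on this page, so check the server's tool listing for the actual names:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "rlm_query",
    "arguments": {
      "query": "Where is the authentication middleware defined?",
      "project": "my-app"
    }
  }
}
```

The server replies with a JSON-RPC result containing matching files and memory entries, which the agent can use to refine follow-up queries.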

How to install

Prerequisites:

  • Node.js (recommended latest LTS) and npm installed
  • Git (optional, for cloning)
  • A Gemini API key for AI features (see the .env guidance below)

Install and run:

  1. Clone or download the MCP server repository
  2. Navigate to the project directory
  3. Install dependencies: npm install
  4. Build the project (if required by this setup): npm run build
  5. Create configuration and environment variables
    • Create a .env file containing GEMINI_API_KEY=your-gemini-api-key and, optionally, UI_PORT=3848
  6. Start the UI/server: npm start
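Taken together, the steps above amount to roughly the following shell session. The repository URL is a placeholder, since this page does not name one:

```shell
# Clone the server (replace the placeholder with the actual repository URL)
git clone <repository-url> rlm-memory-mcp-server
cd rlm-memory-mcp-server

# Install dependencies and build the dist/ entry point
npm install
npm run build

# Provide the Gemini API key (and, optionally, a UI port) via .env
cat > .env <<'EOF'
GEMINI_API_KEY=your-gemini-api-key
UI_PORT=3848
EOF

# Start the UI/server
npm start
```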

Notes:

  • The MCP server expects to be run with the node entry point (dist/index.js after build) or an equivalent built artifact as configured in your environment.
  • Ensure the environment variable GEMINI_API_KEY is present for AI features that rely on Gemini.
  • If you modify the server path or deploy type (e.g., Docker), adjust the mcp_config accordingly.
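If you maintain the configuration by hand rather than via claude mcp add, a stdio server entry along these lines is typical. The server key matches the name used in the install command above; the surrounding file name and layout depend on your MCP client:

```json
{
  "mcpServers": {
    "jumpino27-rlm-memory-mcp-server": {
      "command": "node",
      "args": ["dist/index.js"],
      "env": {
        "UI_PORT": "3848",
        "GEMINI_API_KEY": "your-gemini-api-key"
      }
    }
  }
}
```

For a Docker deployment, command would instead invoke docker run with the built image, keeping the same env entries.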

Additional notes

Tips and considerations:

  • Environment variables: Keep GEMINI_API_KEY secure. Do not commit to VCS. Use a .env file or secret management in your deployment environment.
  • If you encounter port conflicts, set UI_PORT to an available port (e.g., 3850).
  • When integrating with external CLIs (Claude, Codex, Gemini), ensure their mcp configuration points to the correct node entry (dist/index.js) after building the server.
  • Regularly run rlm_verify_index to audit indexing status and ensure memory accuracy.
  • For debugging, run npm run dev to start the UI with auto-reload during development.
  • If building for production, consider packaging with your preferred method (Docker, uv, etc.) and adjust mcp_config accordingly.
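If you choose Docker for production packaging, a minimal sketch might look like this; the base image and build layout are assumptions based on the npm scripts above, not part of this server's documented setup:

```dockerfile
FROM node:20-alpine

WORKDIR /app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm ci

# Build dist/index.js, then run it as the container entry point
COPY . .
RUN npm run build

ENV UI_PORT=3848
EXPOSE 3848

CMD ["node", "dist/index.js"]
```

Pass GEMINI_API_KEY at run time (docker run -e GEMINI_API_KEY=...) rather than baking it into the image, and update your mcp_config to launch the container instead of node directly.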
