MARM-Systems

Turn AI into a persistent, memory-powered collaborator. Universal MCP Server (supports HTTP, STDIO, and WebSocket) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built on the MARM protocol for structured reasoning that evolves with your work.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio lyellr88-marm-systems python marm-mcp-server \
  --env PYTHONUNBUFFERED="1"

How to use

MARM MCP Server provides a persistent memory layer that sits underneath AI agents, enabling long-term recall, session continuity, and structured memory across multiple tools. The server can operate over HTTP (for external clients) or via STDIO (for local pipelines and agent processes). With MARM MCP set up, you can connect various AI clients (e.g., Claude, Gemini, Qwen) through an MCP transport and use universal memory, notebooks, and semantic search to recall decisions, configurations, code, and rationale across sessions and tools. The Quick Start demonstrates both Docker and Python-based HTTP/STDIO usage, including the commands that attach the memory layer to an AI client for cross-agent memory sharing and seamless recall. The server exposes 18 MCP tools, covering common memory and session-management tasks and providing unified memory across different AI clients.

To use it, install the server via Python (pip) or Docker, start the server, and then add an MCP transport (for example, with claude mcp add --transport http marm-memory http://localhost:8001/mcp). This establishes a memory-backed context that your agents can query and update, ensuring persistent context even across restarts. When using HTTP, point clients to http://localhost:8001/mcp to leverage semantic search, structured sessions, and cross-agent recall; via STDIO, you integrate memory directly into local AI workflows using the stdio transport and the provided Python path to the server.
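For a sense of what the HTTP transport looks like on the wire, the sketch below lists the server's tools using a plain JSON-RPC 2.0 request (the protocol MCP is built on). It assumes the default http://localhost:8001/mcp endpoint from the install steps; the exact response shape may vary by server version.

```python
# Minimal sketch: list the server's MCP tools over HTTP via JSON-RPC 2.0.
# Endpoint path follows the Quick Start; adjust host/port to your deployment.
import json
import urllib.request

MCP_URL = "http://localhost:8001/mcp"

def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body for an MCP method call."""
    return {"jsonrpc": "2.0", "id": req_id,
            "method": method, "params": params or {}}

def list_tools(url=MCP_URL):
    """POST a tools/list request and return the parsed JSON response."""
    body = json.dumps(jsonrpc_request("tools/list")).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With the server running, list_tools() should return the catalog of 18 tools; most users will never send raw JSON-RPC, since clients such as Claude Code handle this after claude mcp add.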

Overall, MARM MCP acts as a universal, persistent memory backbone for diverse AI clients, enabling cross-tool recall, unified session logs, and reusable notebooks so agents can remember, reference, and build on prior work together.

How to install

Prerequisites:

  • Python 3.10+ installed on your system
  • pip available in your PATH
  • Optional: Docker if you prefer containerized deployment

Install (HTTP/Local Python) with pip:

  1. Install the server package: pip install marm-mcp-server==2.2.6
  2. Install dependencies: pip install -r marm-mcp-server/requirements.txt
  3. Run the MCP server: python marm-mcp-server
  4. Connect an MCP transport (example with Claude): claude mcp add --transport http marm-memory http://localhost:8001/mcp
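Before connecting clients, it can help to confirm the server is actually listening. The readiness poll below is a generic sketch (not part of marm-mcp-server); the port comes from the install steps above, and any HTTP response at all is treated as "listening".

```python
# Sketch: poll a URL until it accepts connections, or give up.
import time
import urllib.error
import urllib.request

def wait_for_ready(url, attempts=10, delay=1.0):
    """Return True once the URL answers at all, False after all attempts."""
    for _ in range(attempts):
        try:
            urllib.request.urlopen(url, timeout=2)
            return True
        except urllib.error.HTTPError:
            return True  # server responded, just not with 2xx
        except (urllib.error.URLError, OSError):
            time.sleep(delay)  # not up yet; retry
    return False

# e.g. run wait_for_ready("http://localhost:8001/mcp") before `claude mcp add`
```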

Install (STDIO) if you plan to use the stdio transport:

  1. Install the server package: pip install marm-mcp-server==2.2.6
  2. Install stdio dependencies: pip install -r marm-mcp-server/requirements_stdio.txt
  3. Run the stdio server (example; replace with your platform and script path): <platform> mcp add --transport stdio marm-memory-stdio python "your/file/path/to/marm-mcp-server/server_stdio.py"

Docker (alternative):

  1. Pull the image: docker pull lyellr88/marm-mcp-server:latest
  2. Run the container (port mapping and volume as needed): docker run -d --name marm-mcp-server -p 8001:8001 -v ~/.marm:/home/marm/.marm lyellr88/marm-mcp-server:latest
  3. Connect clients to the HTTP endpoint: claude mcp add --transport http marm-memory http://localhost:8001/mcp

Additional notes

Tips and common considerations:

  • Version: This setup references marm-mcp-server==2.2.6; ensure client integrations match compatible MCP protocol versions.
  • HTTP vs STDIO: Use HTTP for cross-process or networked clients; use STDIO for local pipelines and embedded workflows.
  • Persistence: Memory data persists across sessions and restarts. Ensure your data directory (e.g., ~/.marm) is writable and backed up as needed.
  • Docker usage: When using Docker, ensure proper volume mapping for memory data and health/readiness checks to keep the MCP server stable.
  • Environment variables: PYTHONUNBUFFERED=1 helps with real-time logs; add other vars if you have specific hosting or security requirements.
  • Health checks: Consider adding readinessProbe/health endpoints in front of the MCP server when deploying in orchestrated environments.
  • MCP Tools: The server exposes 18 MCP Tools covering memory creation, recall, classification, and cross-session organization; consult MARM-HANDBOOK for a detailed tool reference.
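Once a tool name is known, invoking it over HTTP follows the standard MCP tools/call shape. In the sketch below, the tool name "memory_recall" and its arguments are hypothetical placeholders; consult MARM-HANDBOOK for the real names and argument schemas of the 18 tools.

```python
# Sketch: invoke an MCP tool over HTTP via a JSON-RPC 2.0 tools/call request.
# "memory_recall" and its arguments are HYPOTHETICAL; see MARM-HANDBOOK for
# the actual tool names and schemas exposed by marm-mcp-server.
import json
import urllib.request

def build_tool_call(name, arguments, req_id=1):
    """JSON-RPC 2.0 body for an MCP tools/call invocation."""
    return {"jsonrpc": "2.0", "id": req_id, "method": "tools/call",
            "params": {"name": name, "arguments": arguments}}

def call_tool(name, arguments, url="http://localhost:8001/mcp"):
    """Send the tools/call request and return the parsed JSON reply."""
    body = json.dumps(build_tool_call(name, arguments)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (hypothetical tool name and arguments):
# call_tool("memory_recall", {"query": "auth refactor decisions"})
```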
