MARM-Systems
Turn AI into a persistent, memory-powered collaborator. Universal MCP Server (supports HTTP, STDIO, and WebSocket) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built with MARM protocol for structured reasoning that evolves with your work.
claude mcp add --transport stdio lyellr88-marm-systems python marm-mcp-server --env PYTHONUNBUFFERED="1"
How to use
MARM MCP Server provides a persistent memory layer that sits beneath AI agents, enabling long-term recall, session continuity, and structured memory across multiple tools. The server can operate over HTTP (for external clients) or via STDIO (for local pipelines and agent processes). With MARM MCP set up, you can connect various AI clients (e.g., Claude, Gemini, Qwen) through an MCP transport and use universal memory, notebooks, and semantic search to recall decisions, configurations, code, and rationale across sessions and tools. The Quick Start demonstrates both Docker and Python-based HTTP/STDIO usage, including the commands to connect the memory layer to an AI client for cross-agent memory sharing and seamless recall across agents. The server exposes 18 MCP tools covering common memory and session-management tasks, providing unified memory across different AI clients.
To use it, install the server via Python (pip) or Docker, start the server, and then add an MCP transport (for example, with claude mcp add --transport http marm-memory http://localhost:8001/mcp). This establishes a memory-backed context that your agents can query and update, persisting even across restarts. Over HTTP, point clients at http://localhost:8001/mcp to leverage semantic search, structured sessions, and cross-agent recall; over STDIO, integrate memory directly into local AI workflows using the stdio transport and the provided Python path to the server.
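Under the hood, MCP clients speak JSON-RPC 2.0 to the HTTP endpoint. As a rough sketch of what a request looks like on the wire (the payload shape follows the general MCP specification, not a MARM-specific API; the protocolVersion and clientInfo values are illustrative):

```shell
# Minimal MCP initialize request (JSON-RPC 2.0, per the MCP spec).
PAYLOAD='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"example-client","version":"0.1"}}}'
echo "$PAYLOAD"

# With the server running, one way to exercise the endpoint directly:
# curl -s -X POST http://localhost:8001/mcp \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```

In practice your client (e.g., Claude via claude mcp add) performs this handshake for you; the curl form is only useful for debugging connectivity.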
Overall, MARM MCP acts as a universal, persistent memory backbone for diverse AI clients, enabling cross-tool recall, unified session logs, and reusable notebooks so agents can remember, reference, and build on prior work together.
How to install
Prerequisites:
- Python 3.10+ installed on your system
- pip available in your PATH
- Optional: Docker if you prefer containerized deployment
Install (HTTP/Local Python) with pip:
- Install the server package: pip install marm-mcp-server==2.2.6
- Install dependencies: pip install -r marm-mcp-server/requirements.txt
- Run the MCP server: python marm-mcp-server
- Connect an MCP transport (example with Claude): claude mcp add --transport http marm-memory http://localhost:8001/mcp
Install (STDIO) if you plan to use the stdio transport:
- Install the server package: pip install marm-mcp-server==2.2.6
- Install stdio dependencies: pip install -r marm-mcp-server/requirements_stdio.txt
- Run the stdio server (example; replace with your platform and script path): <platform> mcp add --transport stdio marm-memory-stdio python "your/file/path/to/marm-mcp-server/server_stdio.py"
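As a concrete instance of the placeholder above, registering the stdio server with Claude Code might look like the following; the script path is illustrative, so substitute your own install location:

```shell
# Register the stdio variant with Claude Code; the path is hypothetical.
claude mcp add --transport stdio marm-memory-stdio \
  python "$HOME/tools/marm-mcp-server/server_stdio.py"

# Verify the registration was recorded:
claude mcp list
```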
Docker (alternative):
- Pull the image: docker pull lyellr88/marm-mcp-server:latest
- Run the container (port mapping and volume as needed): docker run -d --name marm-mcp-server -p 8001:8001 -v ~/.marm:/home/marm/.marm lyellr88/marm-mcp-server:latest
- Connect clients to the HTTP endpoint: claude mcp add --transport http marm-memory http://localhost:8001/mcp
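After the container starts, a quick sanity check that the HTTP endpoint is reachable can look like this (it only checks connectivity; the exact response depends on the server):

```shell
# Confirm the container is running and the port is mapped.
docker ps --filter name=marm-mcp-server

# Check that something answers on the MCP endpoint: -o /dev/null discards
# the body, -w prints only the HTTP status code.
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8001/mcp
```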
Additional notes
Tips and common considerations:
- Version: This setup references marm-mcp-server==2.2.6; ensure client integrations match compatible MCP protocol versions.
- HTTP vs STDIO: Use HTTP for cross-process or networked clients; use STDIO for local pipelines and embedded workflows.
- Persistence: Memory data persists across sessions and restarts. Ensure your data directory (e.g., ~/.marm) is writable and backed up as needed.
- Docker usage: When using Docker, ensure proper volume mapping for memory data and health/readiness checks to keep the MCP server stable.
- Environment variables: PYTHONUNBUFFERED=1 helps with real-time logs; add other vars if you have specific hosting or security requirements.
- Health checks: Consider adding readiness probes or health endpoints in front of the MCP server when deploying in orchestrated environments.
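In plain Docker, one way to wire such a check is a container-level health command. The /health path below is an assumption, not a documented MARM endpoint; substitute whatever path the server actually exposes:

```shell
# Run with a Docker health check; the /health path is hypothetical.
docker run -d --name marm-mcp-server \
  -p 8001:8001 -v ~/.marm:/home/marm/.marm \
  --health-cmd 'curl -sf http://localhost:8001/health || exit 1' \
  --health-interval 30s --health-retries 3 \
  lyellr88/marm-mcp-server:latest
```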
- MCP Tools: The server exposes 18 MCP Tools covering memory creation, recall, classification, and cross-session organization; consult MARM-HANDBOOK for a detailed tool reference.