
allman

High-performance MCP agent mail server — lock-free message routing, NRT search, Git audit trail. Built in Rust.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio copyleftdev-allman -- docker run -i --rm -p 8000:8000 -p 8001:8001 --env INDEX_PATH=allman_index --env REPO_ROOT=allman_repo --env RUST_LOG=allman=debug --env VLLM_PORT=8001 copyleftdev/allman

How to use

Allman exposes an MCP-compatible API at http://localhost:8000/mcp. It acts as a high-performance agent mail hub: it coordinates agents, routes messages between them, and backs every inbox with full-text search. You can register agents, send messages from one agent to another, fetch unread inbox items, and run full-text searches across all stored messages. The hot path for create_agent, send_message, and get_inbox is lock-free, keeping these common operations at microsecond latency. If you’re using the Docker setup, make sure the container is running and that port 8000 is reachable from your client.

The MCP tools available are:

  • create_agent: Register an agent with project_key, name_hint, program, and model.
  • send_message: Deliver a message from one agent to another, including subject and body, plus project context.
  • get_inbox: Retrieve and drain unread messages for a given agent.
  • search_messages: Perform a full-text search across all indexed messages using Tantivy-backed search.

Clients can connect to the /mcp endpoint via JSON-RPC or a compatible MCP client library to invoke these tools, receive structured responses, and handle errors accordingly.
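As an illustration, a raw JSON-RPC 2.0 invocation of one of these tools can be sketched as below. The tools/call method and params shape follow the MCP convention; the argument names (project_key, name_hint, and so on) are taken from the tool descriptions above and should be checked against the schemas the server actually advertises:

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request body for an MCP tools/call invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical argument values, inferred from the tool list above.
body = mcp_tool_call(1, "create_agent", {
    "project_key": "demo",
    "name_hint": "researcher",
    "program": "claude-code",
    "model": "claude-sonnet",
})
print(body)
```

The resulting body would then be POSTed to http://localhost:8000/mcp with a Content-Type: application/json header, using any HTTP client.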

How to install

Prerequisites:

  • Docker (recommended) or Rust toolchain for building from source
  • Optional: NVIDIA GPU with nvidia-container-toolkit if you plan to run vLLM inference

Option A - Install with Docker (recommended):

  1. Pull the Allman Docker image (adjust the image name if necessary):

    docker pull copyleftdev/allman

    If you are using a docker-compose.yml configured for Allman, start it with:

    docker compose up -d

    Or run a single container with port mappings:

    docker run -d --name allman -p 8000:8000 -p 8001:8001 -e INDEX_PATH=allman_index -e REPO_ROOT=allman_repo -e RUST_LOG=allman=debug copyleftdev/allman
  2. Verify the service is listening on port 8000 (and 8001 for vLLM if used).
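Step 2 can be checked with a quick TCP probe (a minimal sketch; it confirms only that something is listening on the port, not that the MCP API itself is healthy):

```python
import socket

def port_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the MCP API port and the optional vLLM port.
for port in (8000, 8001):
    print(port, "open" if port_listening("localhost", port) else "closed")
```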

Option B - Build from source (Rust):

  1. Prerequisites:
    • Rust 1.75+ and Cargo
    • Optional: Docker for containerized deployment
  2. Clone and build:

    git clone https://github.com/copyleftdev/allman.git
    cd allman
    cargo build --release

  3. Run locally:

    cargo run --release

    The server binds to 0.0.0.0:8000 by default; set environment variables as needed.

  4. Configuration: Set environment variables as needed (INDEX_PATH, REPO_ROOT, RUST_LOG). You can also configure via a .env file matching the README’s guidance.

Additional notes

Environment variables:

  • INDEX_PATH: Tantivy index directory (default: allman_index)
  • REPO_ROOT: Git audit trail directory (default: allman_repo)
  • RUST_LOG: log level filter (default: allman=debug)

Networking:

  • Exposed ports: 8000 for the MCP API, 8001 for vLLM inference (when used). If running behind a reverse proxy, configure routing and TLS termination as needed.

Persistence:

  • Allman uses a batched indexing pipeline with a separate OS thread for Git commits. The hot path is lock-free for low-latency operations, but the index and repository directories must be writable by the process.

Common issues:

  • If the /mcp endpoint is unreachable, verify that the container is running and the ports are mapped correctly.
  • When using GPU inference, ensure the NVIDIA toolkit is installed and port 8001 is accessible to your vLLM instance.
  • If the index or repository directories are missing, create them or point INDEX_PATH/REPO_ROOT at valid writable paths.
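The last point can be turned into a small preflight check (a sketch; the defaults mirror the values documented above):

```python
import os
from pathlib import Path

def ensure_writable_dir(env_var: str, default: str) -> Path:
    """Resolve a directory from the environment (falling back to a default),
    create it if missing, and verify the process can write to it."""
    path = Path(os.environ.get(env_var, default))
    path.mkdir(parents=True, exist_ok=True)
    if not os.access(path, os.W_OK):
        raise PermissionError(f"{env_var}={path} is not writable")
    return path

index_dir = ensure_writable_dir("INDEX_PATH", "allman_index")
repo_dir = ensure_writable_dir("REPO_ROOT", "allman_repo")
print("index:", index_dir, "repo:", repo_dir)
```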
