Letta
MCP server to manage a Letta server and communicate with its agents
claude mcp add --transport stdio oculairmedia-letta-mcp-server -- \
  docker run -i \
    --env PORT="6507" \
    --env RUST_LOG="info" \
    --env TRANSPORT="stdio" \
    --env LETTA_BASE_URL="http://localhost:8283" \
    --env LETTA_PASSWORD="your-letta-password" \
    --env RUST_BACKTRACE="0" \
    ghcr.io/oculairmedia/letta-mcp-server-rust:rust-latest
How to use
The Letta MCP Server is a high-performance, Rust-based MCP server built to work with Letta AI. It exposes a set of consolidated tools that manage agents, memories, sources, jobs, files, and the MCP server lifecycle, all optimized for fast responses and reduced payload sizes. The server supports both a stdio transport suited to local desktop use (e.g., Claude Desktop) and an HTTP transport for production deployments. To connect a client such as Claude Desktop, Windsurf, or OpenCode, point the MCP client at the server and supply the required endpoint and authentication settings via environment variables. The tooling comprises seven consolidated tools with a total of 103 operations, covering lifecycle management, memory operations, data sources, and tool management, enabling complex agent workflows directly through MCP.
When running in Docker, you’ll typically map port 6507 (or your chosen port) and provide LETTA_BASE_URL and LETTA_PASSWORD so the server can authenticate with Letta. The available tools (letta_agent_advanced, letta_memory_unified, letta_tool_manager, letta_source_manager, letta_job_monitor, letta_file_folder_ops, letta_mcp_ops) each expose a suite of operations, such as creating and updating agents, managing memory blocks, attaching sources, and managing MCP servers. Tool usage generally follows the familiar MCP operation patterns (e.g., list, get, create, update, delete) with pagination and summary versus full detail modes for efficient data transfer.
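To illustrate those operation patterns, here is a sketch of a standard MCP `tools/call` JSON-RPC request targeting the consolidated agent tool. The tool name comes from the list above; the `operation`, `page`, and `detail` argument names are illustrative assumptions, not the server's documented schema.

```python
import json

# Build an MCP tools/call request against the consolidated agent tool.
# Argument names below are assumptions for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "letta_agent_advanced",
        "arguments": {
            "operation": "list",   # assumed operation selector (list/get/create/...)
            "page": 1,             # pagination, per the patterns described above
            "detail": "summary",   # summary vs. full detail mode
        },
    },
}
print(json.dumps(request, indent=2))
```

The same envelope shape applies to the other six tools; only `params.name` and the arguments change.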
How to install
Prerequisites:
- Docker (recommended for quick start) or a Rust toolchain if building from source
- Optional: Node.js environment if you prefer the npm-distributed binary (see npm package below)
Option A — Run with Docker (recommended):
- Pull and run the prebuilt image (as shown in the README examples):
docker pull ghcr.io/oculairmedia/letta-mcp-server-rust:rust-latest
# Run with necessary environment variables
docker run -d \
-p 6507:6507 \
-e LETTA_BASE_URL=http://your-letta-instance:8283 \
-e LETTA_PASSWORD=your-password \
-e TRANSPORT=http \
--name letta-mcp \
ghcr.io/oculairmedia/letta-mcp-server-rust:rust-latest
- Verify the server is healthy by checking port 6507 or the configured port and testing a simple MCP client command.
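For the port check in that verification step, a small TCP probe is enough; this helper is a generic sketch, not part of the server's own tooling.

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe the default HTTP-transport port on the Docker host.
# port_open("localhost", 6507)
```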
Option B — Build from Source (Rust):
- Prerequisites:
  - Rust nightly (edition 2024)
  - Cargo
- Clone and build:
  git clone https://github.com/oculairmedia/Letta-MCP-server.git
  cd Letta-MCP-server
  cargo build --release
- Run locally (example):
LETTA_BASE_URL=http://your-letta:8283 \
LETTA_PASSWORD=your-password \
./target/release/letta-server
Option C — npm package (binary distribution):
- Install globally via npm:
npm install -g letta-mcp-server
- Run the prebuilt binary appropriate for your platform (the installer selects the correct binary automatically). Ensure LETTA_BASE_URL and LETTA_PASSWORD are set in your environment before starting.
Note: If you choose Docker, you can adapt the docker run command to your environment, and you can also use docker-compose as shown in the README to configure environment variables and health checks.
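A minimal docker-compose sketch along those lines; the service name and `restart` policy here are illustrative, so check the README for the canonical file:

```yaml
services:
  letta-mcp:
    image: ghcr.io/oculairmedia/letta-mcp-server-rust:rust-latest
    ports:
      - "6507:6507"
    environment:
      LETTA_BASE_URL: http://your-letta-instance:8283
      LETTA_PASSWORD: your-password
      TRANSPORT: http
      PORT: "6507"
      RUST_LOG: info
    restart: unless-stopped
```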
Additional notes
Environment variables:
- LETTA_BASE_URL: required. The Letta API URL (e.g. http://localhost:8283).
- LETTA_PASSWORD: required. Letta API password.
- TRANSPORT: optional. Default is stdio; set to http to enable HTTP transport.
- PORT: optional. HTTP port when TRANSPORT=http (default 6507).
- RUST_LOG: optional. Log level (debug, info, warn, error).
- RUST_BACKTRACE: optional. Enable backtraces (0 or 1).
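The contract above can be summarized in code. The variable names and defaults below match the list; the loader function itself is just an illustrative sketch, not part of the server.

```python
import os

def load_config() -> dict:
    """Read the documented environment variables, applying documented defaults."""
    return {
        "base_url": os.environ["LETTA_BASE_URL"],           # required
        "password": os.environ["LETTA_PASSWORD"],           # required
        "transport": os.environ.get("TRANSPORT", "stdio"),  # stdio | http
        "port": int(os.environ.get("PORT", "6507")),        # used when http
        "rust_log": os.environ.get("RUST_LOG", "info"),
        "backtrace": os.environ.get("RUST_BACKTRACE", "0"),
    }
```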
Common issues:
- If the MCP server cannot reach Letta, verify LETTA_BASE_URL and LETTA_PASSWORD are correct and that network access to Letta is allowed.
- When using HTTP transport, ensure the port is open and not blocked by a firewall.
- For Docker, ensure port mappings align with your deployment (e.g., -p 6507:6507).
- If you see JSON payloads that seem large or slow, enable RUST_LOG=debug temporarily to diagnose bottlenecks, and review response size optimization notes in the README.
Configuration tips:
- Use the provided MCP client configurations to connect Claude Desktop, Cursor/Windsurf, or OpenCode to the letta MCP server.
- Leverage the seven tools to manage agents, memories, sources, and MCP server lifecycle efficiently; utilize pagination and get vs list patterns to optimize data transfer.
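For the client side of those tips, a stdio connection from Claude Desktop can be configured with its standard `mcpServers` JSON layout; the `"letta"` key is an arbitrary label, and the endpoint and password values are placeholders:

```json
{
  "mcpServers": {
    "letta": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "LETTA_BASE_URL=http://localhost:8283",
        "-e", "LETTA_PASSWORD=your-password",
        "-e", "TRANSPORT=stdio",
        "ghcr.io/oculairmedia/letta-mcp-server-rust:rust-latest"
      ]
    }
  }
}
```

Note that the `-e` flags are passed to `docker run` directly so the variables reach the container, rather than relying on the client's `env` map, which only applies to the spawned `docker` process.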
Related MCP Servers
goose
an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
cunzhi
Say goodbye to the frustration of AI stopping prematurely; helps AI stay the course longer
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key