litmus
Official MCP server for configuring Litmus instances.
```shell
claude mcp add --transport stdio litmusautomation-litmus-mcp-server -- \
  docker run -i --rm --name litmus-mcp-server -p 8000:8000 \
  -e ANTHROPIC_API_KEY="<required for Claude/Anthropic integrations>" \
  ghcr.io/litmusautomation/litmus-mcp-server:latest
```
How to use
Litmus MCP Server exposes an HTTP SSE endpoint for MCP clients and includes a built-in web UI for interacting with Litmus Edge. The server acts as a bridge between large language models and Litmus Edge, enabling device configuration, monitoring, and management via standardized MCP requests.

After starting the Docker container with ports 8000 and 9000 exposed, you can reach the Web UI at http://localhost:9000 and connect external MCP clients to the SSE endpoint at http://<host>:8000/sse.

Tools and integrations described in the README, including Claude Code CLI, Cursor IDE, VS Code / Copilot, and Windsurf, allow you to configure MCP servers, manage connections to Litmus Edge, and send/receive MCP messages using the provided headers and authentication details. Configuration is persisted via a .env file if you mount a volume, enabling reuse across restarts.
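As a concrete sketch, an SSE client configuration could be written like this. The file name mcp.json and the mcpServers/url key layout are assumptions based on common MCP client conventions, not taken from the Litmus README; only the SSE URL comes from this page, so check your client's documentation for the exact schema:

```shell
# Hypothetical client config: file name and JSON layout are assumed.
# Point the "url" at the SSE endpoint exposed by the container.
cat > mcp.json <<'EOF'
{
  "mcpServers": {
    "litmus": {
      "url": "http://localhost:8000/sse"
    }
  }
}
EOF
```

For a remote server, replace localhost with the host running the container.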
How to install
Prerequisites:
- Docker installed on your host (Linux, macOS, or Windows with WSL)
- Optional: a hosted container registry alias or network access to ghcr.io
Step-by-step:

- Ensure Docker is running on your machine.

- Create a directory for persistent configuration (optional):

  ```shell
  mkdir -p /opt/litmus-mcp
  ```

- Run the Litmus MCP Server Docker image:

  ```shell
  docker run -d --name litmus-mcp-server -p 8000:8000 ghcr.io/litmusautomation/litmus-mcp-server:latest
  ```
Notes:

- This command publishes only the MCP SSE endpoint on port 8000; the Web UI listens on port 9000 inside the container and is not exposed here.
- If you need the Web UI, also publish port 9000:

  ```shell
  docker run -d --name litmus-mcp-server -p 8000:8000 -p 9000:9000 ghcr.io/litmusautomation/litmus-mcp-server:latest
  ```
- If you want to preserve UI/configuration between restarts, mount a host file as described in the README:

  ```shell
  mkdir -p /opt/litmus-mcp
  touch /opt/litmus-mcp/.env
  docker run -d --name litmus-mcp-server -p 8000:8000 -p 9000:9000 \
    -v /opt/litmus-mcp/.env:/app/.env \
    ghcr.io/litmusautomation/litmus-mcp-server:latest
  ```
- (Optional) If you are using a different image tag such as main, or need to run the AMD64 image on an ARM64 host, use the appropriate tag and add --platform linux/amd64 as needed.
- Configure clients (Claude Code CLI, Cursor, VS Code, Windsurf) using the example mcp.json snippets in the README, pointing them at http://localhost:8000/sse or the appropriate host URL.
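The persistence step above can be sketched end-to-end by seeding the host-side .env before mounting it. The variable names below mirror the headers listed under Additional notes (EDGE_URL, EDGE_API_CLIENT_ID, EDGE_API_CLIENT_SECRET) and are assumptions; confirm the exact keys the server reads against the README. The values are placeholders:

```shell
# Sketch: seed the mounted .env before the first run. A local directory is
# used here for illustration; substitute /opt/litmus-mcp (or your chosen path).
mkdir -p ./litmus-mcp
cat > ./litmus-mcp/.env <<'EOF'
EDGE_URL=https://your-litmus-edge-host
EDGE_API_CLIENT_ID=your-client-id
EDGE_API_CLIENT_SECRET=your-client-secret
EOF
chmod 600 ./litmus-mcp/.env   # keep credentials readable only by the owner
```

Then pass `-v "$(pwd)/litmus-mcp/.env:/app/.env"` to docker run so the settings survive restarts.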
Additional notes
Tips and gotchas:
- The Litmus MCP Server image is built for linux/amd64; on ARM64 hosts or other architectures, pass --platform linux/amd64 when pulling or running the image if needed.
- If you rely on UI-stored configuration, remember to mount /app/.env from the host to persist settings across container restarts.
- When using Claude Code CLI or other MCP clients, ensure you set the correct MCP_SSE_URL and provide the required headers (EDGE_URL, EDGE_API_CLIENT_ID, EDGE_API_CLIENT_SECRET, NATS credentials, InfluxDB credentials, etc.) as shown in the README examples.
- The ANTHROPIC_API_KEY is required if you plan to use Claude or Claude Desktop integrations; keep this value secure and do not commit it to public repositories.
- If you run the server and UI on separate hosts, configure MCP_SSE_URL on the client side to point to the server’s SSE endpoint, e.g., http://<mcp-server-host>:8000/sse.
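For the split-host case, a small helper like this assembles the client-side value. Exporting MCP_SSE_URL as an environment variable is an assumption about how your client consumes it, and the hostname is a placeholder; adapt both to your client's configuration format:

```shell
# Build the SSE endpoint URL for a remote MCP server host.
# The hostname is a placeholder; port 8000 matches the docker run mapping above.
MCP_SERVER_HOST="${MCP_SERVER_HOST:-mcp-server.example.internal}"
export MCP_SSE_URL="http://${MCP_SERVER_HOST}:8000/sse"
echo "MCP_SSE_URL=${MCP_SSE_URL}"
```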
Related MCP Servers
web-eval-agent
An MCP server that autonomously evaluates web applications.
mcp-neo4j
Neo4j Labs Model Context Protocol servers
Gitingest
mcp server for gitingest
zotero
Model Context Protocol (MCP) server for the Zotero API, in Python
fhir
FHIR MCP Server – helping you expose any FHIR server or API as an MCP server.
unitree-go2
The Unitree Go2 MCP Server is an MCP server that enables users to control the Unitree Go2 robot using natural language commands interpreted by an LLM.