
go-llm

Large Language Model API interface

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio mutablelogic-go-llm -- docker run -i --rm -v go-llm:/data -p 8085:8085 -e GEMINI_API_KEY=your-key ghcr.io/mutablelogic/llm run

Note that for a stdio transport the container must run attached (-i), not detached (-d), so the MCP client can talk to it over stdin/stdout.

How to use

go-llm is a multi-provider LLM server that exposes a REST API, a CLI, and an MCP (Model Context Protocol) interface. It supports providers such as Google Gemini, Anthropic Claude, and Mistral, and includes tools for Home Assistant, NewsAPI, and WeatherAPI. The MCP server component lets registered tools and prompts be discovered and invoked via JSON-RPC 2.0 over stdio or other compatible MCP transports.

To use the MCP interface in your workflow, run the server (via Docker, as shown in the installation steps below) and connect an MCP client to the server's address and port. The server also offers an HTTP API for models, sessions, tools, chat, ask, and embeddings, with optional streaming via SSE. Through either the MCP layer or the HTTP API you can enumerate tools, fetch model details, manage sessions, generate embeddings, and run multi-turn conversations.
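As a sketch of the "discover and invoke tools" flow over the stdio transport, the snippet below frames JSON-RPC 2.0 messages the way an MCP client would. The `tools/list` and `tools/call` method names come from the MCP specification; the tool name and arguments in the second call are purely illustrative and depend on which tools your go-llm instance has registered.

```python
import json


def jsonrpc_request(method, params=None, req_id=1):
    """Frame a JSON-RPC 2.0 request for an MCP stdio transport.

    The stdio transport exchanges one JSON object per line, so the
    caller would write this string (plus a newline) to the server's
    stdin and read the matching response from its stdout.
    """
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)


# Discover the tools the server has registered.
print(jsonrpc_request("tools/list"))

# Invoke a registered tool (name and arguments are illustrative only).
print(jsonrpc_request(
    "tools/call",
    {"name": "weather", "arguments": {"location": "Berlin"}},
    req_id=2,
))
```

An orchestrator would pair each request `id` with the response carrying the same `id`, which is how multi-turn tool-calling loops are driven over a single stdio pipe.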

How to install

Prerequisites:

  • Docker installed and running
  • Access to a Gemini API key (or other provider keys as needed)

Step-by-step:

  1. Install Docker on your system and start the daemon.
  2. Create a data volume for persistent storage (optional but recommended): docker volume create go-llm
  3. Run the server in detached mode, passing the required API key(s) for Gemini (or other providers) as environment variables:
     docker run -d --name go-llm \
       -v go-llm:/data -p 8085:8085 \
       -e GEMINI_API_KEY="your-key" \
       ghcr.io/mutablelogic/llm run
  4. Verify the server is reachable at http://localhost:8085/api (adjust port if you changed it).
  5. (Optional) If you want to use the CLI client, install the client and point it to the running server: export LLM_ADDR="localhost:8085"
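Once the container from the steps above is running, you can exercise the HTTP API from a script. The sketch below builds (but does not send) a chat request; the /chat endpoint path and the payload shape are assumptions based on the API summary in "How to use", and the model name is illustrative, so adjust them to match your deployment.

```python
import json
import urllib.request

# Base address of the running go-llm server (matches the Docker port mapping above).
LLM_ADDR = "http://localhost:8085/api"


def chat_request(prompt, model="gemini-1.5-flash"):
    """Build a POST request for the chat endpoint (shape is illustrative)."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return urllib.request.Request(
        f"{LLM_ADDR}/chat",
        data=payload.encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = chat_request("Hello!")
print(req.full_url)  # → http://localhost:8085/api/chat
# To actually send it (server must be running):
#   with urllib.request.urlopen(req) as resp:
#       print(resp.read().decode())
```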

Additional notes

Tips and common issues:

  • Ensure your GEMINI_API_KEY (and other provider keys) are correctly set in the environment when starting the container.
  • The MCP server exposes registered tools and prompts; ensure tools are properly configured and accessible by the server.
  • If you rely on TLS, configure the TLS-related flags/environment variables as described in the HTTP/TLS section of the README and map ports accordingly.
  • For persistent sessions, the server stores data in the configured data volume; back up the volume if needed.
  • When using the MCP interface, you can drive multi-turn conversations and tool calls via the stdio JSON-RPC 2.0 protocol, enabling integration with external orchestrators or automation pipelines.
