koog
Koog is the official Kotlin framework for building predictable, fault-tolerant, and enterprise-ready AI agents across all platforms: from JVM backend services to Android, iOS, and even in-browser environments. Koog is based on our expertise building AI products and provides proven solutions to complex LLM and AI problems.
```shell
claude mcp add --transport stdio jetbrains-koog -- docker run -i \
  --env GOOGLE_API_KEY="your-google-api-key" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env WEAVE_ENDPOINT="https://your-weave-endpoint.example.com" \
  --env LANGFUSE_API_KEY="your-langfuse-api-key" \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key" \
  --env OPENROUTER_API_KEY="your-openrouter-api-key" \
  jetbrains/koog:latest
```
How to use
Koog is a Kotlin-based framework for building and running AI agents entirely within Kotlin. This MCP server exposes Koog as a service you can run in your MCP manager, enabling you to deploy, monitor, and orchestrate Koog-powered agents that interact with tools, manage workflows, and communicate with users. The server leverages Koog's modular architecture, allowing agents to persist state, handle retries, and switch between multiple language models or providers on the fly. By wiring Koog into an MCP workflow, you can chain agent-powered tasks with external tools and memory stores, all inside your existing MCP runtime.
To use the Koog MCP server, you’ll typically run a Docker image that provides a Koog-enabled JVM runtime along with the necessary LLM integrations. After starting the server, configure your clients to send agent tasks and tool invocations to Koog via the MCP protocol. Koog supports a broad set of capabilities: multi-platform agent execution, distributed tool usage, streaming responses, memory and history management, and integration with popular JVM frameworks such as Spring Boot and Ktor. You can supply an OpenAI, Anthropic, Google, OpenRouter, Ollama, Bedrock, or similar LLM provider, and Koog will manage the interaction flow, retries, and state restoration for long-running conversations.
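Under the hood, MCP clients drive the server over JSON-RPC 2.0 on the container's stdin and stdout. As a rough sketch of that exchange (the message shape follows the generic MCP `initialize` handshake; the protocol version string and `clientInfo` values are illustrative, and in practice your MCP client sends this for you):

```shell
# Build a minimal MCP initialize request (JSON-RPC 2.0 over stdio).
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"manual-check","version":"0.1"}}}'

# Pipe it into the server container; the reply (the server's advertised
# capabilities) arrives on stdout. Requires Docker and a provider key.
if command -v docker >/dev/null 2>&1; then
  printf '%s\n' "$INIT" |
    docker run -i --rm -e OPENAI_API_KEY="your-openai-api-key" jetbrains/koog:latest \
    || echo "handshake failed (check keys and image name)"
else
  echo "Docker not available; skipping live handshake"
fi
```

This is only a manual smoke test; a real MCP client follows the handshake with `tools/list` and `tools/call` messages on the same stream.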
How to install
Prerequisites:
- Docker is installed and running on your host.
- An MCP-compatible environment to host the server (preferred: Docker).
- Optional: API keys for your preferred LLM providers (OpenAI, Anthropic, Google, OpenRouter, etc.).
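A small preflight script can catch the most common setup problems before you start: an unreachable Docker daemon and unset provider keys. The key names here mirror the environment variables used throughout this guide:

```shell
# Preflight: verify Docker is reachable and report missing provider keys.
if docker info >/dev/null 2>&1; then
  echo "Docker: OK"
else
  echo "Docker: not reachable (is the daemon installed and running?)"
fi

for key in OPENAI_API_KEY ANTHROPIC_API_KEY GOOGLE_API_KEY OPENROUTER_API_KEY; do
  if [ -z "$(printenv "$key")" ]; then
    echo "note: $key is not set"
  fi
done
```

Only the keys for providers you actually intend to use need to be set.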
Step-by-step:
- Pull and run the Koog MCP server image via Docker:

```shell
docker pull jetbrains/koog:latest
docker run -it --rm \
  -e OPENAI_API_KEY="your-openai-api-key" \
  -e ANTHROPIC_API_KEY="your-anthropic-api-key" \
  -e GOOGLE_API_KEY="your-google-api-key" \
  -e OPENROUTER_API_KEY="your-openrouter-api-key" \
  -e LANGFUSE_API_KEY="your-langfuse-api-key" \
  -e WEAVE_ENDPOINT="https://your-weave-endpoint.example.com" \
  jetbrains/koog:latest
```
- If you prefer to run Koog as a container inside an MCP environment, ensure Docker is accessible to the MCP runtime and configure the mcp_config accordingly (see the mcp_config section). You can also build and run Koog directly in a JVM environment for tighter integration with a Kotlin project, but that requires a Java-based run approach rather than the prebuilt MCP image.
- For local development, clone Koog's repository, build with Gradle or Maven as described in the project docs, and run the JVM artifacts directly. Target JDK 17 or later and align coroutine and serialization library versions as noted in Koog's requirements.
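The mcp_config entry referenced above might look like the following. This is a sketch using the `mcpServers`/`command`/`args`/`env` fields common across MCP clients; check your client's documentation for its exact schema, and list only the provider keys you need:

```json
{
  "mcpServers": {
    "jetbrains-koog": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "jetbrains/koog:latest"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "ANTHROPIC_API_KEY": "your-anthropic-api-key"
      }
    }
  }
}
```

Note that `-it` from the interactive example becomes plain `-i` here: MCP clients drive the container over stdio and do not allocate a TTY.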
Additional notes
Tips and caveats:
- Ensure your chosen LLM provider keys are available in the environment where the Koog MCP server runs. Koog will use these keys to initialize the appropriate executors.
- If you enable streaming responses, verify network policies allow streaming data from the LLM providers.
- Koog supports persistence and retries; configure your state stores and retry policies according to your workload characteristics.
- When integrating with MCP, you may want to expose Koog’s tooling endpoints via a managed container to simplify scaling and observability.
- Common issues include missing environment variables, incorrect API keys, or network restrictions preventing access to LLM or memory services. Check logs for provider authentication errors and adjust env vars accordingly.
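As a quick way to act on the last tip, you can grep the container's logs for authentication failures. The container name `koog-mcp` is a placeholder for whatever name your running container actually has (see `docker ps`):

```shell
# Scan the server logs for signs of provider authentication problems.
docker logs koog-mcp 2>&1 | grep -i -E 'auth|401|403|api[ _-]?key' \
  || echo "no auth-related messages found"
```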
Related MCP Servers
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
voltagent
AI Agent Engineering Platform built on an Open Source TypeScript AI Agent Framework
sdk-python
A model-driven approach to building AI agents in just a few lines of code.
ag2
AG2 (formerly AutoGen): The Open-Source AgentOS. Join us at: https://discord.gg/sNGSwQME3x
tools
A set of tools that gives agents powerful capabilities.
samples
Agent samples built using the Strands Agents SDK.