obot
Complete MCP Platform -- Hosting, Registry, Gateway, and Chat Client
docker run -d --name obot -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e OPENAI_API_KEY=<API KEY> \
  -e ANTHROPIC_API_KEY=<ANTHROPIC_API_KEY> \
  ghcr.io/obot-platform/obot:latest
How to use
Obot is an open-source platform that lets you host MCP servers, publish them to a central registry, and access them through a unified MCP gateway and chat client. With Obot running, you can deploy MCP servers locally or in a managed environment via Docker, Kubernetes, or other supported runtimes, and then manage access, authentication, and auditing from the built-in admin UI. The Obot Chat component provides a standardized chat interface that supports leading model providers (e.g., OpenAI, Anthropic) and can leverage domain-specific information through RAG, project-wide memory, and reusable configurations. Organizations can discover and use MCP servers via the registry, while IT admins control deployment rights, catalog publishing, and sharing policies.
To start using Obot, open the UI at http://localhost:8080 once the container is running. From there you can register MCP servers (Node.js, Python, or container-based), publish them to the catalog, and configure access rules. The MCP gateway provides a single entry point to MCP servers with request filtering, logging, and usage visibility, helping you monitor which servers are being used and by whom. The platform also supports integrating with workflow tools and agents (e.g., n8n, LangGraph) to automate interactions with MCP servers, and it enables clients like ChatGPT or Claude Desktop to connect to your hosted MCP ecosystem through the gateway.
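Clients speak to MCP servers behind the gateway using JSON-RPC 2.0 messages, as defined by the Model Context Protocol. As a rough sketch of what a connecting client sends first, here is an `initialize` request built with Python's standard library; the client name and protocol version string are illustrative placeholders, and the actual endpoint URL depends on your Obot deployment:

```python
import json

# MCP messages are JSON-RPC 2.0. A client's first request is "initialize",
# declaring the protocol revision it speaks and identifying itself.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an MCP spec revision date
        "capabilities": {},               # no client capabilities claimed here
        "clientInfo": {"name": "example-client", "version": "0.1.0"},  # placeholder
    },
}

# Serialized, this is the body a client would POST to a streamable-HTTP
# MCP endpoint behind the gateway.
body = json.dumps(initialize_request)
print(body)
```

The gateway sits between this message and the target MCP server, which is where its request filtering and logging apply.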
How to install
Prerequisites:
- Docker installed on your host machine
- Access to the internet to pull the Obot image
- API keys for model providers (e.g., OpenAI, Anthropic) if you plan to use external models
- Install Docker
- Follow the official Docker installation guide for your OS: https://docs.docker.com/get-docker/
- Run the Obot container
- Example (adjust keys as needed):
docker run -d --name obot -p 8080:8080 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e OPENAI_API_KEY=<API KEY> \
  ghcr.io/obot-platform/obot:latest
- Access the Obot UI
- Open your browser and navigate to http://localhost:8080
- Configure keys and providers
- In the Obot admin UI, set your OpenAI or Anthropic keys and configure model providers as needed.
- Optionally set ANTHROPIC_API_KEY or other provider credentials as required by your deployment.
- Optional: advanced deployment
- For Kubernetes or other orchestrators, follow the Installation Guide linked in the Obot documentation for alternative deployment methods.
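After `docker run` returns, the container may need a moment before the UI answers on port 8080. If you script against it, a small readiness poll helps; the sketch below assumes the default port mapping and, so that it runs standalone, demonstrates the poll against a throwaway local HTTP server rather than a real Obot instance:

```python
import http.server
import threading
import time
import urllib.error
import urllib.request

def wait_for_ready(url: str, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll `url` until it answers an HTTP request, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status < 500:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # not up yet; retry after a short sleep
        time.sleep(interval)
    return False

# Stand-in for the Obot UI: a throwaway local server on an ephemeral port.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ready = wait_for_ready(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(ready)
```

Against a real deployment you would call `wait_for_ready("http://localhost:8080")` right after starting the container.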
Additional notes
Tips and common considerations:
- If you use a private registry, ensure authentication is configured in the Obot admin UI and that the registry supports MCP server metadata standards.
- Remember to expose port 8080 only to trusted networks or behind a reverse proxy with TLS termination for production deployments.
- Review the OAuth 2.1 and token-handling configuration to maintain secure access control across users and groups.
- When using multiple model providers, verify rate limits and pricing to avoid unexpected costs.
- The MCP gateway provides logging and request inspection; enable appropriate logging verbosity in production to aid debugging and auditing.
- Ensure the host has sufficient CPU/memory resources to run MCP servers and the Obot gateway concurrently, especially if you deploy multi-user HTTP servers.
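Since the gateway logs requests, its logs can answer "which servers are being used and by whom." The log schema below is invented purely for illustration (the real Obot log format may differ); the sketch shows the kind of per-server and per-user tally you might build from exported gateway logs:

```python
import json
from collections import Counter

# Hypothetical gateway access-log lines; this schema is made up for the example.
log_lines = [
    '{"user": "alice", "server": "github-mcp", "method": "tools/call"}',
    '{"user": "bob", "server": "github-mcp", "method": "tools/list"}',
    '{"user": "alice", "server": "postgres-mcp", "method": "tools/call"}',
    '{"user": "alice", "server": "github-mcp", "method": "tools/call"}',
]

# Tally requests per MCP server and per user for a quick usage overview.
by_server = Counter(json.loads(line)["server"] for line in log_lines)
by_user = Counter(json.loads(line)["user"] for line in log_lines)

print(by_server.most_common())  # github-mcp appears 3 times, postgres-mcp once
print(by_user.most_common())
```

In production you would read the same fields from whatever log sink the gateway writes to, rather than inline strings.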
Related MCP Servers
kagent
Cloud Native Agentic AI | Discord: https://bit.ly/kagentdiscord
mcp-language
mcp-language-server gives MCP-enabled clients access to semantic tools such as get-definition, find-references, rename, and diagnostics.
kodit
👩‍💻 MCP server to index external repositories
mcp-web-ui
MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.
mcp-auth-proxy
MCP Auth Proxy is a secure OAuth 2.1 authentication proxy for Model Context Protocol (MCP) servers
go
A Go implementation of the Model Context Protocol (MCP) - an open protocol that enables seamless integration between LLM applications and external data sources and tools.