mcp-web-ui
MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.
To register MCP Web UI as a stdio server in Claude Code:

claude mcp add --transport stdio megagrindstone-mcp-web-ui go run ./cmd/server/main.go \
  --env OPENAI_API_KEY="OpenAI API key" \
  --env ANTHROPIC_API_KEY="Anthropic API key" \
  --env OPENROUTER_API_KEY="OpenRouter API key"
How to use
MCP Web UI acts as a Host within the Model Context Protocol (MCP) ecosystem, offering a web-based interface to interact with multiple large language model (LLM) providers. It provides real-time, streaming chat experiences via Server-Sent Events (SSE) and supports robust context aggregation and coordination between clients and MCP servers. Users can configure multiple LLM providers (Anthropic, OpenAI, Ollama, and OpenRouter) and manage model selection, prompts, and token behavior through a centralized UI. The UI also includes persistent chat history storage (BoltDB) and dynamic configuration management to tailor LLM behavior and context handling for complex multi-client workloads.
To use it, run the MCP Web UI server and open the web interface in your browser. In the UI, you can select the provider (Anthropic, OpenAI, Ollama, OpenRouter), choose a model, and adjust generation parameters such as temperature, max tokens, and stop sequences. You can also configure the global system prompt and a title generator prompt to customize chat titles. The interface exposes an option to stream responses in real time, enabling a responsive chat experience, and provides tools to inspect and manage active MCP servers and their SSE streams.
How to install
Prerequisites
- Go 1.23+ installed on your machine
- Git
- Optional: Docker if you prefer containerized deployment
- API keys for desired LLM providers (Anthropic, OpenAI, OpenRouter) and corresponding environment variables
Clone the repository
- git clone https://github.com/MegaGrindStone/mcp-web-ui.git
- cd mcp-web-ui
Install dependencies and build/run locally
- Set up environment variables as needed (example shown below)
- Export example keys (replace with your actual keys):
- export ANTHROPIC_API_KEY=your_anthropic_key
- export OPENAI_API_KEY=your_openai_key
- export OPENROUTER_API_KEY=your_openrouter_key
Run the server locally (development)
- go mod download
- go run ./cmd/server/main.go
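As a quick preflight before launching, the sketch below checks that at least one of the provider API keys exported above is actually set, using only POSIX shell. The script is an illustration, not part of the repository:

```shell
#!/bin/sh
# Preflight sketch: confirm at least one provider API key is exported
# before starting MCP Web UI (variable names match the exports above).
check_keys() {
  for key in ANTHROPIC_API_KEY OPENAI_API_KEY OPENROUTER_API_KEY; do
    eval "val=\${$key:-}"   # indirect lookup of the variable named in $key
    if [ -n "$val" ]; then
      echo "$key is set"
      return 0
    fi
  done
  echo "warning: no provider API key is set" >&2
  return 1
}

check_keys || true   # warn but do not abort; a local Ollama needs no API key
```

A warning rather than a hard failure is deliberate here, since Ollama can run without any cloud API key.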
Optional: Docker deployment
- Build image: docker build -t mcp-web-ui .
- Run container:
  docker run -p 8080:8080 \
    -v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
    -e ANTHROPIC_API_KEY \
    -e OPENAI_API_KEY \
    -e OPENROUTER_API_KEY \
    mcp-web-ui
Configure the application
- Create and edit config.yaml in your config directory (e.g., ~/.config/mcpwebui/config.yaml) to customize server port, logging, LLM providers, and MCP server configurations as described in the README.
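As a hedged illustration, the snippet below writes a minimal config.yaml. Every key name in it (port, logLevel, llm, provider, model, maxTokens, temperature) is an assumption inferred from the options this page mentions; verify them against the repository README before relying on them:

```shell
#!/bin/sh
# Sketch only: writes an illustrative config.yaml. All key names below are
# assumptions inferred from options mentioned on this page (server port,
# logging, llm section, maxTokens, temperature); check the README schema.
CONFIG_DIR="${CONFIG_DIR:-$HOME/.config/mcpwebui}"
mkdir -p "$CONFIG_DIR"
cat > "$CONFIG_DIR/config.yaml" <<'EOF'
port: 8080
logLevel: info
llm:
  provider: anthropic            # anthropic | openai | ollama | openrouter
  model: claude-3-5-sonnet-latest
  maxTokens: 4096
  temperature: 0.7
EOF
echo "wrote $CONFIG_DIR/config.yaml"
```

Setting CONFIG_DIR lets you stage the file somewhere else first and copy it into place once you have confirmed the schema.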
Additional notes
- Ensure your API keys are set as environment variables or configured in config.yaml per the provider you use.
- The MCP Web UI supports multiple LLM providers; configure each provider under the llm section in config.yaml and tailor provider-specific options (e.g., maxTokens, temperature).
- If running via Docker, mount your config.yaml to /app/config.yaml so the server loads your settings.
- BoltDB is used for persistent chat history; ensure the path has write permissions when running on systems with restricted directories.
- For local Ollama usage, ensure the Ollama server is running and the host/port are reachable as configured under the Ollama provider settings.
- When using OpenAI or OpenRouter, consider network latency and rate limits; the server supports streaming responses via SSE for a responsive UI experience.
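For the Ollama tip above, a small sketch of a reachability check: build the base URL with shell defaults (11434 is Ollama's stock port) and probe it with curl. The OLLAMA_HOST/OLLAMA_PORT variable names are illustrative helpers here, not MCP Web UI settings:

```shell
#!/bin/sh
# Build the Ollama base URL from optional overrides; 11434 is Ollama's
# default port. OLLAMA_HOST/OLLAMA_PORT are illustrative names only.
ollama_url() {
  echo "http://${OLLAMA_HOST:-localhost}:${OLLAMA_PORT:-11434}"
}

# Manual reachability probe (/api/tags lists locally installed models):
#   curl -sf "$(ollama_url)/api/tags" >/dev/null && echo "Ollama reachable"
ollama_url
```

If the probe fails, point the Ollama provider settings in config.yaml at whatever host/port your Ollama server actually listens on.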
Related MCP Servers
adk-go
An open-source, code-first Go toolkit for building, evaluating, and deploying sophisticated AI agents with flexibility and control.
trpc-agent-go
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
moling
MoLing is a computer-use and browser-use based MCP server. It is a locally deployed, dependency-free office AI assistant.
go-utcp
Official Go implementation of the UTCP (Universal Tool Calling Protocol).
mcp-shell
Give hands to AI. MCP server to run shell commands securely, auditably, and on demand.