
mcp-web-ui

MCP Web UI is a web-based user interface that serves as a Host within the Model Context Protocol (MCP) architecture. It provides a powerful and user-friendly interface for interacting with Large Language Models (LLMs) while managing context aggregation and coordination between clients and servers.

Installation
Run this command in your terminal to add the MCP server to Claude Code. Note the `--` separator, which keeps the server command and its arguments distinct from the claude flags:

claude mcp add --transport stdio megagrindstone-mcp-web-ui \
  --env OPENAI_API_KEY="OpenAI API key" \
  --env ANTHROPIC_API_KEY="Anthropic API key" \
  --env OPENROUTER_API_KEY="OpenRouter API key" \
  -- go run ./cmd/server/main.go

How to use

MCP Web UI acts as a Host within the Model Context Protocol (MCP) ecosystem, offering a web-based interface to interact with multiple large language model (LLM) providers. It provides real-time, streaming chat experiences via Server-Sent Events (SSE) and supports robust context aggregation and coordination between clients and MCP servers. Users can configure multiple LLM providers (Anthropic, OpenAI, Ollama, and OpenRouter) and manage model selection, prompts, and token behavior through a centralized UI. The UI also includes persistent chat history storage (BoltDB) and dynamic configuration management to tailor LLM behavior and context handling for complex multi-client workloads.

To use it, run the MCP Web UI server and open the web interface in your browser. In the UI, you can select the provider (Anthropic, OpenAI, Ollama, OpenRouter), choose a model, and adjust generation parameters such as temperature, max tokens, and stop sequences. You can also configure the global system prompt and a title generator prompt to customize chat titles. The interface exposes an option to stream responses in real time, enabling a responsive chat experience, and provides tools to inspect and manage active MCP servers and their SSE streams.

How to install

Prerequisites

  • Go 1.23+ installed on your machine
  • Git
  • Optional: Docker if you prefer containerized deployment
  • API keys for desired LLM providers (Anthropic, OpenAI, OpenRouter) and corresponding environment variables
  1. Clone the repository

  2. Set up environment variables for the providers you plan to use (replace with your actual keys):
      • export ANTHROPIC_API_KEY=your_anthropic_key
      • export OPENAI_API_KEY=your_openai_key
      • export OPENROUTER_API_KEY=your_openrouter_key
  3. Install dependencies and run the server locally (development)

    • go mod download
    • go run ./cmd/server/main.go
  4. Optional: Docker deployment

    • Build the image: docker build -t mcp-web-ui .
    • Run the container:
      docker run -p 8080:8080 \
        -v $HOME/.config/mcpwebui/config.yaml:/app/config.yaml \
        -e ANTHROPIC_API_KEY \
        -e OPENAI_API_KEY \
        -e OPENROUTER_API_KEY \
        mcp-web-ui
  5. Configure the application

    • Create and edit config.yaml in your config directory (e.g., ~/.config/mcpwebui/config.yaml) to customize server port, logging, LLM providers, and MCP server configurations as described in the README.
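A minimal config.yaml might look like the sketch below. The key names and structure are illustrative, drawn from the options mentioned in this guide (server port, logging, LLM providers, MCP servers); consult the project README for the authoritative schema.

```yaml
# Illustrative sketch only -- key names are assumptions, not the project's schema.
server:
  port: 8080
logging:
  level: info
llm:
  provider: anthropic          # anthropic | openai | ollama | openrouter
  model: claude-3-5-sonnet
  maxTokens: 1024
  temperature: 0.7
mcpServers:
  - name: example-server
    command: ./example-mcp-server
```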

Additional notes


  • Ensure your API keys are set as environment variables or configured in config.yaml per the provider you use.
  • The MCP Web UI supports multiple LLM providers; configure each provider under the llm section in config.yaml and tailor provider-specific options (e.g., maxTokens, temperature).
  • If running via Docker, mount your config.yaml to /app/config.yaml so the server loads your settings.
  • BoltDB is used for persistent chat history; ensure the database path is writable, especially on systems with restricted directories.
  • For local Ollama usage, ensure the Ollama server is running and the host/port are reachable as configured under the Ollama provider settings.
  • When using OpenAI or OpenRouter, consider network latency and rate limits; the server supports streaming responses via SSE for a responsive UI experience.
