
chat-ui

Single-File AI Chatbot UI with Multimodal & MCP Support: an all-in-one HTML file providing a streamlined conversational chat interface

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio ai-ql-chat-ui -- docker run -i --rm -p 8080:8080 aiql/chat-ui

Note that the container must run with -i (interactive, stdin kept open) rather than -d (detached), since the stdio transport communicates over the container's standard input and output.

How to use

The Chat UI MCP server exposes a minimal HTML-based chat frontend that renders conversations and interacts with backend LLMs using the Model Context Protocol (MCP) IPC model. It accepts OpenAI-format requests and can connect to backends such as HuggingFace Text Generation Inference (TGI) or vLLM, automatically handling multiple response formats (OpenAI-style, Cloudflare AI-style, or plain text). It also includes features such as chat history export, interrupting a generation in progress, and re-running previous generations to test backend inference.

When run as an MCP renderer, it can communicate with a desktop or main process via IPC, enabling interactive workflows between the frontend UI and a local or remote LLM backend. You can use the built-in demo, run the app locally from a static index.html, or deploy via Docker, Hugging Face Spaces, or Kubernetes.
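The multi-format handling described above can be sketched as a small normalizer. This is an illustrative sketch, not the chat-ui implementation: the OpenAI-style and Cloudflare Workers AI payload shapes below are the commonly documented ones, and the function name `extract_text` is invented here.

```python
import json


def extract_text(raw: str) -> str:
    """Best-effort extraction of assistant text from a backend reply.

    Tries OpenAI-style, then Cloudflare Workers AI style, and finally
    falls back to treating the body as plain text.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return raw  # plain-text backend
    # OpenAI-style: {"choices": [{"message": {"content": "..."}}]}
    try:
        return data["choices"][0]["message"]["content"]
    except (KeyError, IndexError, TypeError):
        pass
    # Cloudflare Workers AI style: {"result": {"response": "..."}}
    try:
        return data["result"]["response"]
    except (KeyError, TypeError):
        pass
    return raw  # JSON, but in an unrecognized shape
```

A frontend wired this way degrades gracefully: any backend whose reply it does not recognize is simply displayed verbatim.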

To use it via Docker, run the provided Docker command and open the mapped port in your browser. If you prefer local testing without Docker, serve the single HTML file with Python's built-in HTTP server (or any static host) and point the UI at your configured OpenAI-compatible endpoint. The MCP integration is designed to work either as a renderer in a desktop workflow or as a standalone web UI that connects to your backend via API-compatible calls.

How to install

Prerequisites:

  • Docker (recommended) or a local static HTTP server for the index.html (e.g., Python 3.x)
  • Internet access to pull the Docker image aiql/chat-ui if using Docker
  • Basic knowledge of configuring your OpenAI-compatible backend endpoints

Install options:

Option A: Run with Docker

  1. Ensure Docker is installed and running.
  2. Start the container: docker run -d -p 8080:8080 aiql/chat-ui
  3. Open http://localhost:8080 in your browser.

Option B: Serve locally from index.html

  1. Clone or download the repository containing index.html.
  2. In the directory with index.html, run: python3 -m http.server 8000
  3. Open http://localhost:8000 in your browser and configure the backend API endpoint as needed.
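Steps 2 and 3 can also be done programmatically. The sketch below (an illustrative helper, not part of chat-ui) serves a directory with Python's built-in server on an OS-assigned free port and fetches "/" to verify the static server is reachable:

```python
import threading
import urllib.request
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer


def serve_and_check(directory: str = ".") -> int:
    """Serve `directory` on a free port and return the HTTP status for "/"."""
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    server = ThreadingHTTPServer(("127.0.0.1", 0), handler)  # port 0 = any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
            return resp.status
    finally:
        server.shutdown()


print(serve_and_check())  # a 200 means the static server is up
```

Run it from the directory containing index.html; a 200 status confirms the file is being served and the UI can then be opened in a browser at that port.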

Option C: Deploy to Cloud or CI environments

  • Use the provided Docker approach in your deployment script or CI workflow.
  • For Cloudflare Pages or Hugging Face Spaces, follow their hosting guides and ensure app_port is set to 8080 if required by the hosting environment.
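For a Hugging Face Docker Space, the port is declared in the YAML front matter at the top of the Space's README.md. A minimal sketch (the title value here is illustrative; field names follow the Spaces configuration reference):

```yaml
---
title: chat-ui
sdk: docker
app_port: 8080
---
```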

Additional notes

Tips & common issues:

  • If you’re using the MCP integration, ensure your desktop backend or main process is running and accessible via IPC as expected by the renderer.
  • The UI supports multiple backend response formats; for prompts and completions to work without extra adaptation, make sure your endpoint follows the OpenAI-style response format.
  • When running via Docker, map the container port to 8080 (or your chosen port) and expose that port publicly if you need remote access.
  • If you encounter CORS or network issues, confirm the backend URL is reachable from the frontend and that any required authentication tokens are provided.
  • For multilingual or i18n features, configure the localization options in the UI as needed.
  • If you modify index.html, ensure you keep the same port and endpoint configuration unless you update the deployment accordingly.
