
open-chat

Roll your own chat with just a few lines of code. An open-source alternative to ChatKit. Use any model or provider. Built on Vercel's AI Elements and the AI SDK. Supports MCP, MCP-UI, and the MCP Registry out of the box.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio faith-tools-open-chat node server.js \
  --env VITE_SERVER_URL="http://localhost:3000" \
  --env VITE_REGISTRY_URL="https://registry.modelcontextprotocol.io" \
  --env VITE_OPEN_ROUTER_API_KEY="your-open-router-api-key"  # optional

How to use

OpenChat is an MCP-enabled chat component designed to work with any AI model provider via a flexible transport layer. It ships with tooling to integrate MCP servers directly into the chat experience, and it can connect to the MCP Registry so users can discover and use multiple models or tool configurations. The README emphasizes OpenRouter-compatible transports and the ability to plug MCP servers in through the user interface, enabling scenarios like delegating specific tasks to external MCP-backed tools or model pools.

To get started, install dependencies, set up environment variables, and run the server and client components. Once running, you can use the OpenChat component with or without authentication, enable MCP tools integration, and optionally point it at a registry to fetch model options and tool definitions.
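As a rough illustration of the client/server split described above, a client might POST messages to the server's chat endpoint (the `http://localhost:3000/api/chat` path appears in the install notes below). The payload fields here are illustrative assumptions, not OpenChat's confirmed wire format:

```typescript
// Hypothetical sketch of a chat request; field names are assumptions.
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

interface ChatRequest {
  method: "POST";
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(messages: ChatMessage[], model?: string): ChatRequest {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Include a model only when the caller pins one (e.g. via OpenRouter).
    body: JSON.stringify({ messages, ...(model ? { model } : {}) }),
  };
}

// e.g. fetch("http://localhost:3000/api/chat", buildChatRequest(msgs))
```

In a real deployment, the AI SDK's transport layer would handle this exchange; the sketch only shows the general shape of the round trip.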

How to install

Prerequisites
- Node.js (or Bun) installed on your machine.
- Git to clone the repository.
- Optional: a local or remote server for API endpoints if you want to customize the backend.

Install and run locally
1) Clone the repository
   git clone https://github.com/faith-tools/open-chat.git
   cd open-chat

2) Install dependencies (the README uses Bun; npm/yarn also work)
   bun install

3) Set up environment variables
   - Copy the example env files (as shown in the README):
     cp examples/web/.env.example examples/web/.env
     cp examples/server/.env.example examples/server/.env
   - Edit the resulting .env files to configure server URLs and any required tokens.

4) Build assets (if applicable)
   bun run build

5) Run the app locally
   bun run dev

6) Verify the MCP integration
   - Ensure the MCP server is reachable at the configured API endpoint (e.g., http://localhost:3000/api/chat) and that the registry URL (if used) is accessible.

Note: For a non-Bun setup, install dependencies with npm install and run npm run build / npm run dev as appropriate.
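The verification step can be sketched as a quick curl reachability check. This assumes the dev server is listening on port 3000; it only confirms the endpoints answer, not that the chat protocol works end to end:

```shell
# Any HTTP status code means the server is up; "not reachable" means it isn't.
curl -s -o /dev/null -w "chat endpoint: HTTP %{http_code}\n" \
  http://localhost:3000/api/chat || echo "server not reachable on :3000"

# Check that the MCP Registry (if configured) is reachable.
curl -s -o /dev/null -w "registry: HTTP %{http_code}\n" \
  https://registry.modelcontextprotocol.io || echo "registry not reachable"
```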

Additional notes

Tips and considerations:
- MCP integration: The OpenChat component can be enhanced with MCP tools by configuring mcpServers via the UI. You can point to a registry such as https://registry.modelcontextprotocol.io to fetch model and tool definitions.
- Transport and models: You can override transports or lock a specific model by using useChatOptions. When a model is locked, the model picker is hidden from the UI.
- Security: If you embed tokens or API keys in the browser via useChatOptions, ensure tokens are short-lived and rotated regularly. Consider proxying sensitive requests through a backend you control.
- OAuth and authentication: The README notes a flow where you handle OAuth yourself and provide the token to the chat options. This can be useful if you want to keep credentials out of the client.
- Model and provider flexibility: The stack is designed to support any AI SDK-compatible transport and allows integrating OpenRouter or other providers by supplying the proper transport and model configuration.
- Debugging: If you encounter issues with MCP server discovery, verify that the registry URL is reachable and that the server's environment variables (such as VITE_SERVER_URL) are correctly set.
- Registry and model options: Use the provided hooks (e.g., useOpenRouterModelOptions) to fetch model options and pass them to the OpenChatComponent to allow end users to select models or lock one in place.
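The notes on locking a model and registering MCP servers might look roughly like this in code. `useChatOptions` and the registry URL come from the notes above, but the exact option shape shown here is an assumption, not the library's confirmed API:

```typescript
// Hypothetical option object in the spirit of the tips above.
// Field names are illustrative assumptions.
interface ChatOptions {
  model?: string;            // locking this hides the model picker per the notes
  registryUrl?: string;      // MCP Registry to fetch model/tool definitions
  mcpServers?: { name: string; url: string }[];
  token?: string;            // short-lived OAuth token you obtain yourself
}

function lockModel(base: ChatOptions, model: string): ChatOptions {
  // Returns a copy with the model pinned; the UI would then hide the picker.
  return { ...base, model };
}

const options = lockModel(
  {
    registryUrl: "https://registry.modelcontextprotocol.io",
    mcpServers: [{ name: "faith-tools-open-chat", url: "http://localhost:3000" }],
  },
  "openrouter/auto"
);
```

Keeping the token out of this object entirely and proxying through your own backend, as the security tip suggests, avoids shipping credentials to the browser at all.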
