mcp-ragchat
MCP server that adds RAG-powered AI chat to any website. One command from Claude Code. Local vector store, multi-provider LLM (OpenAI/Anthropic/Gemini). Zero cloud dependency.
claude mcp add --transport stdio gogabrielordonez-mcp-ragchat \
  node /absolute/path/to/mcp-ragchat/dist/mcp-server.js \
  --env OPENAI_API_KEY="sk-..."
How to use
mcp-ragchat is an MCP server that adds a self-contained, RAG-powered AI chat to your website. It runs locally, indexes content you provide into a local vector store, and serves both an embeddable chat widget and a chat API backed by a large language model.

The server exposes the following tools:
- ragchat_setup: seed your knowledge base from Markdown content
- ragchat_test: verify retrieval quality
- ragchat_serve: start a local HTTP chat server
- ragchat_widget: generate a self-contained embed snippet
- ragchat_status: inspect configured domains and their document counts

To use it, register the server with Claude Code (or another MCP client), point it at the compiled server file, and supply the required environment variables (e.g., your OpenAI API key). The typical flow: seed your content into the local vector store, start the chat server, and optionally embed the widget on your site so visitors can chat with the indexed content.
How to install
Prerequisites:
- Node.js v20 or newer
- npm or pnpm
Step-by-step:
- Clone the repository:
  git clone https://github.com/gogabrielordonez/mcp-ragchat
- Install dependencies and build:
  cd mcp-ragchat
  npm install
  npm run build
- Run the MCP server via your MCP client (see mcp_config for the exact command). Make sure the required environment variables are set:
  - OPENAI_API_KEY: your OpenAI API key
- Test locally using the built-in tools (ragchat_setup, ragchat_test), then start the chat server (ragchat_serve) and generate a widget (ragchat_widget) as needed.
Additional notes
- The server stores data locally under ~/.mcp-ragchat/domains/ for each domain you configure, with vectors.json and config.json tracking documents and settings.
- Ensure OPENAI_API_KEY (or other provider keys) is available in the environment where the server runs.
- The widget script is self-contained and does not require external dependencies beyond the served chat endpoint.
- If you modify content, re-run ragchat_setup to re-seed and re-embed the documents.
- You can override default LLM and embedding models via LLM_MODEL and EMBEDDING_MODEL environment variables when applicable.
- For Claude Code integration, provide the absolute path to the built mcp-server.js in the command argument path, as shown in the quick start example.