
mcp-ragchat

MCP server that adds RAG-powered AI chat to any website. One command from Claude Code. Local vector store, multi-provider LLM (OpenAI/Anthropic/Gemini). Zero cloud dependency.

Installation
Run this command in your terminal to add the MCP server to Claude Code (environment variables go before the `--` separator so they are passed to the server rather than treated as arguments to node):

claude mcp add --transport stdio gogabrielordonez-mcp-ragchat \
  --env OPENAI_API_KEY="sk-..." \
  -- node /absolute/path/to/mcp-ragchat/dist/mcp-server.js

How to use

mcp-ragchat is an MCP server that adds a self-contained, RAG-powered AI chat to your website. It runs locally, indexes content you provide in a local vector store, and serves both an embeddable chat widget and a chat API backed by a large language model.

The server exposes the following tools:

  • ragchat_setup: seed your knowledge base from Markdown content
  • ragchat_test: verify retrieval quality
  • ragchat_serve: start a local HTTP chat server
  • ragchat_widget: generate a self-contained embed snippet
  • ragchat_status: inspect configured domains and their document counts

To use it, configure Claude Code (or your MCP client) with the server command, point it at the compiled server file, and supply the required environment variables (e.g., your OpenAI API key). The flow is: seed your content into the local vector store, start the chat server, and optionally embed the widget on your site so visitors can chat with the indexed content.
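As an alternative to the `claude mcp add` command, Claude Code also reads a project-scoped `.mcp.json` file. The sketch below assumes the standard `mcpServers` config shape; the server name is a placeholder, and the path and API key must be replaced with your own values:

```json
{
  "mcpServers": {
    "gogabrielordonez-mcp-ragchat": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-ragchat/dist/mcp-server.js"],
      "env": {
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```

Other MCP clients use a similar `command`/`args`/`env` structure; consult your client's documentation for the exact file location.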

How to install

Prerequisites:

  • Node.js v20 or newer
  • npm or pnpm

Step-by-step:

  1. Clone the repository: git clone https://github.com/gogabrielordonez/mcp-ragchat
  2. Install dependencies and build:
    cd mcp-ragchat
    npm install
    npm run build
  3. Run the MCP server via your MCP client (see the Installation command above). Ensure required environment variables are available, such as OPENAI_API_KEY:
    • OPENAI_API_KEY: your OpenAI API key
  4. Test locally using the built-in tools (ragchat_setup, ragchat_test) and then start the chat server (ragchat_serve) and generate a widget (ragchat_widget) as needed.
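Steps 1 and 2 above can be run as a single sequence, assuming git, Node.js v20+, and npm are on your PATH and network access is available:

```shell
# Clone, install dependencies, and build mcp-ragchat.
git clone https://github.com/gogabrielordonez/mcp-ragchat
cd mcp-ragchat
npm install
npm run build

# The compiled entry point should now exist at dist/mcp-server.js;
# pass its absolute path to your MCP client.
```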

Additional notes


  • The server stores data locally under ~/.mcp-ragchat/domains/ per domain you configure, with vectors.json and config.json tracking documents and settings.
  • Ensure OPENAI_API_KEY (or other provider keys) is available in the environment where the server runs.
  • The widget script is self-contained and does not require external dependencies beyond the served chat endpoint.
  • If you modify content, re-run ragchat_setup to re-seed and re-embed the documents.
  • You can override default LLM and embedding models via LLM_MODEL and EMBEDDING_MODEL environment variables when applicable.
  • For Claude Code integration, provide the absolute path to the built mcp-server.js in the command argument path, as shown in the quick start example.
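A sketch of overriding the default models before launching the server; the model names below are illustrative placeholders, not defaults confirmed by the project:

```shell
# Override the LLM and embedding models via environment variables.
# These model names are examples (assumptions), not project defaults.
export LLM_MODEL="gpt-4o-mini"
export EMBEDDING_MODEL="text-embedding-3-small"

# Verify the overrides before starting the server:
echo "LLM_MODEL=$LLM_MODEL"
echo "EMBEDDING_MODEL=$EMBEDDING_MODEL"
```

Because the server reads these at startup, set them in the same environment (or the `env` block of your MCP client config) that launches `mcp-server.js`.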
