
qarinai

Create unlimited AI chatbot agents for your website — powered by OpenAI-compatible LLMs, RAG, and MCP.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio qarinai-qarinai docker compose up -d

How to use

Qarīn.ai is a platform for creating unlimited AI chatbot agents for your websites and, optionally, exposing them as MCP (Model Context Protocol) servers. According to the README, you can run Qarīn.ai via Docker Compose, connect it to various LLM providers (OpenAI-compatible APIs, Ollama, llama.cpp, etc.), and import Swagger/OpenAPI specs to auto-generate MCP servers. You can also build vector stores for retrieval-augmented generation (RAG) and expose those stores, or your MCP servers, to external AI agents.

To use the MCP server capability, clone the repository, deploy it with Docker Compose, and then use the Qarīn.ai UI either to import or create MCP servers from your REST/OpenAPI specs or to connect your preferred LLM provider. The MCP capability lets you run server endpoints that other agents can call, integrating Qarīn.ai with your existing tooling and workflows.
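To give a concrete sense of what "import a Swagger/OpenAPI spec to auto-generate an MCP server" means, here is a minimal sketch of the kind of OpenAPI 3.0 document you could import. The `/status` path, its `getStatus` operation, and the API title are hypothetical placeholders, not part of the Qarīn.ai project itself.

```python
# Build and serialize a minimal OpenAPI 3.0 document of the kind that
# could be imported to auto-generate an MCP server. All paths and
# operation names below are illustrative placeholders.
import json

spec = {
    "openapi": "3.0.3",
    "info": {"title": "Example API", "version": "1.0.0"},
    "paths": {
        "/status": {
            "get": {
                # operationId is typically what spec-driven generators
                # use as the tool/endpoint name.
                "operationId": "getStatus",
                "summary": "Return service status",
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}

# Serialize to JSON suitable for uploading/importing through the UI.
print(json.dumps(spec, indent=2))
```

Each operation in the imported spec can then be surfaced as a callable endpoint for external AI agents.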

How to install

Prerequisites:

  • Docker and Docker Compose installed on your system
  • git

Installation steps:

  1. Clone the repository: git clone https://github.com/qarinai/qarinai.git
  2. Navigate into the project directory: cd qarinai
  3. Start the services using Docker Compose (this will bring up Qarīn.ai with pre-configured environment variables): docker compose up -d
  4. Wait for the services to initialize, then open the UI (the default credentials listed are admin/admin) and begin configuring your MCP servers or import Swagger/OpenAPI specs to auto-generate MCP servers.
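Step 4 above says to wait for the services to initialize before opening the UI. A minimal readiness poll, sketched in Python, can automate that wait. The base URL and port are assumptions for this sketch; check the repository's docker-compose.yml for the actual port mapping.

```python
# Poll a URL until it answers any HTTP response, or give up after a
# timeout. BASE_URL below is an assumed address; verify the real port
# in the project's docker-compose.yml.
import time
import urllib.error
import urllib.request


def wait_for_service(url: str, timeout: float = 60.0, interval: float = 2.0) -> bool:
    """Return True once `url` answers an HTTP response, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=5)
            return True
        except urllib.error.HTTPError:
            # The server responded (even with 4xx/5xx), so it is up.
            return True
        except (urllib.error.URLError, OSError):
            time.sleep(interval)
    return False


if __name__ == "__main__":
    BASE_URL = "http://localhost:8080"  # assumed port; verify in docker-compose.yml
    print("UI reachable:", wait_for_service(BASE_URL, timeout=5, interval=1.0))
```

Once the poll succeeds, log in with the default credentials and proceed with configuration.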

Additional notes

Notes and tips:

  • The project is marked as still under development and not ready for production use; avoid deploying it to production environments.
  • Environment variables used by the deployment are pre-configured in the Docker Compose setup; you can customize them via a .env file if needed.
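As a sketch of what a `.env` override file might look like: the variable names below are illustrative assumptions only, not taken from the project; consult the repository's docker-compose.yml (and any example env file it ships) for the actual names before use.

```
# Hypothetical .env overrides -- variable names are placeholders for
# illustration; check docker-compose.yml for the real ones.
APP_PORT=8080
POSTGRES_PASSWORD=change-me
```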
  • You can import Swagger/OpenAPI specifications to automatically generate MCP servers, and you can expose vector stores or MCP servers to external AI agents.
  • If you encounter issues, check Docker status, container logs, and ensure your chosen OpenAI-compatible LLM provider credentials are correctly configured in the UI.
  • This documentation assumes a Docker Compose deployment as the primary installation path mentioned in the README.
