qarinai
Create unlimited AI chatbot agents for your website — powered by OpenAI-compatible LLMs, RAG, and MCP.
How to use
Qarīn.ai is a platform for creating unlimited AI chatbot agents for your websites, with the option of exposing them as MCP (Model Context Protocol) servers. It runs via Docker Compose and connects to a range of LLM providers (OpenAI-compatible APIs, Ollama, llama.cpp, and others). You can import Swagger/OpenAPI specifications to auto-generate MCP servers, build vector stores for retrieval-augmented generation (RAG), and expose those stores or MCP servers to external AI agents. To use the MCP server capability, clone the repository, deploy with Docker Compose, and then use the Qarīn.ai UI to import or create MCP servers from your REST/OpenAPI specs, or to connect your preferred LLM provider. The resulting MCP endpoints can be accessed by other agents, enabling integration with your existing tooling and workflows.
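The providers listed above all speak the OpenAI chat-completions format. As an illustrative sketch only (the base URL and model name below are placeholders, not Qarīn.ai defaults), a request body for such a provider looks like this:

```python
import json

# Sketch of an OpenAI-compatible chat-completions request body.
# BASE_URL and the model name are placeholders: Qarin.ai can point at
# OpenAI, Ollama, llama.cpp, or any other compatible API.
BASE_URL = "http://localhost:11434/v1"  # e.g. a local Ollama instance

payload = {
    "model": "llama3",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a website support agent."},
        {"role": "user", "content": "How do I reset my password?"},
    ],
    "stream": False,
}

# A client would POST this JSON to f"{BASE_URL}/chat/completions".
body = json.dumps(payload)
print(body)
```

Because every supported provider accepts this same shape, switching between OpenAI, Ollama, or llama.cpp is mostly a matter of changing the base URL and model name in the UI.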
How to install
Prerequisites:
- Docker and Docker Compose installed on your system
- git
Installation steps:
- Clone the repository: git clone https://github.com/qarinai/qarinai.git
- Navigate into the project directory: cd qarinai
- Start the services using Docker Compose (this will bring up Qarīn.ai with pre-configured environment variables): docker compose up -d
- Wait for the services to initialize, then open the UI (default credentials: admin/admin) and begin configuring your MCP servers, or import Swagger/OpenAPI specs to auto-generate them.
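Once an MCP server has been created in the UI, an external agent can be pointed at its endpoint. A hypothetical client configuration is sketched below; the host, port, and path are placeholders, so copy the actual endpoint URL shown in the Qarīn.ai UI:

```json
{
  "mcpServers": {
    "qarinai": {
      "url": "http://localhost:3000/mcp/<server-id>/sse"
    }
  }
}
```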
Additional notes
- The project is marked as still under development and not ready for production use; avoid deploying it to production environments for now.
- Environment variables used by the deployment are pre-configured in the Docker Compose setup; you can customize them via a .env file if needed.
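As a sketch of such a customization, a .env file might override a few defaults. The variable names below are purely illustrative; consult docker-compose.yml in the repository for the names the project actually reads:

```env
# Illustrative only -- check docker-compose.yml for the real variable names
QARIN_ADMIN_PASSWORD=change-me
QARIN_HTTP_PORT=3000
```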
- You can import Swagger/OpenAPI specifications to automatically generate MCP servers, and you can expose vector stores or MCP servers to external AI agents.
- If you encounter issues, check Docker status, container logs, and ensure your chosen OpenAI-compatible LLM provider credentials are correctly configured in the UI.
- This documentation assumes a Docker Compose deployment as the primary installation path mentioned in the README.
Related MCP Servers
open-webui
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
AutoDocs
We handle what engineers and IDEs won't: generating and maintaining technical documentation for your codebase, while also providing search with dependency-aware context to help your AI tools understand your codebase and its conventions.
chat-ui
Single-File AI Chatbot UI with Multimodal & MCP Support: An All-in-One HTML File for a Streamlined Chatbot Conversational Interface
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).
mcp-frontend
Frontend for the MCP (Model Context Protocol) Kit for Go - a complete, ready-to-use MCP solution
mcp-chat-studio
A powerful MCP testing tool with multi-provider LLM support (Ollama, OpenAI, Claude, Gemini). Test, debug, and develop MCP servers with a modern UI.