tiny_chat
This is an LLM chat application featuring RAG-backed chat, a database, and MCP server capabilities. The UI is designed for Japanese users.
claude mcp add --transport stdio to-aoki-tiny_chat /path/to/tiny_chat/.venv/bin/tiny-chat-mcp \
  --env DB_CONFIG="/path/to/tiny_chat/database_config.json"
How to use
Tiny Chat provides a lightweight chat interface backed by a local database, with an MCP server wrapper that enables integration with external tooling (e.g., Claude Desktop). The MCP setup points to the tiny-chat-mcp executable inside a virtual environment and requires a database configuration file via the DB_CONFIG environment variable. This enables other MCP clients to connect to the Tiny Chat service and perform chat operations through the MCP protocol.
To use it, start the Tiny Chat MCP server via your MCP runner, with DB_CONFIG pointing to your database configuration file so the server can access its persistent storage. The server exposes its functionality through the tiny-chat-mcp command, so any client that supports the MCP schema can connect to it and perform chat operations as part of a larger MCP ecosystem. For convenience, the repository also includes a separate OpenAI Chat API RAG server, run with tiny-chat-api, for retrieval-augmented generation tasks.
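For MCP clients that are configured through a JSON file (for example, Claude Desktop) rather than the claude mcp add command, an equivalent entry might look like the sketch below. The exact file location and top-level schema depend on your client, so treat this as an assumption to adapt, not the project's official configuration:

```json
{
  "mcpServers": {
    "to-aoki-tiny_chat": {
      "command": "/path/to/tiny_chat/.venv/bin/tiny-chat-mcp",
      "env": {
        "DB_CONFIG": "/path/to/tiny_chat/database_config.json"
      }
    }
  }
}
```

Replace the /path/to/tiny_chat placeholders with the actual location of your checkout and virtual environment.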
How to install
Prerequisites:
- Python 3.10 or later (as recommended by the project)
- pip (usually included with Python)
- A Python virtual environment (recommended)

Installation steps:

1) Create and activate a virtual environment (recommended):

   python -m venv .venv
   # On Windows: .venv\Scripts\activate
   # On macOS/Linux: source .venv/bin/activate

2) Install development requirements:

   pip install -r requirements.txt

3) Build the package (if you are using the published package workflow):

   pip install build
   python -m build

4) Install the built package (from the dist/ folder):

   pip install dist/*.whl

5) Ensure the Tiny Chat MCP executable is available inside the virtual environment at /path/to/tiny_chat/.venv/bin/tiny-chat-mcp.

6) Prepare your database configuration file (database_config.json) and reference it via the DB_CONFIG environment variable in your MCP configuration.
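Step 6 references database_config.json, whose schema is defined by tiny_chat itself. As a purely illustrative sketch of what a Qdrant-backed configuration might contain (the key names here are assumptions, not the project's real schema; consult the project documentation for the actual fields):

```json
{
  "host": "localhost",
  "port": 6333,
  "collection": "tiny_chat_documents"
}
```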
Additional notes
Notes and tips:
- The MCP config expects an environment variable DB_CONFIG pointing to your database configuration file. Ensure the path is correct and the file is accessible to the process running the MCP server.
- If you encounter issues starting the MCP server, verify that the virtual environment is activated and that the tiny-chat-mcp executable has execute permissions.
- The Tiny Chat web interface uses Streamlit in development; for production, you typically rely on the MCP wrapper to orchestrate interactions rather than launching Streamlit directly.
- When using the OpenAI Chat API RAG server (tiny-chat-api), you can query the chat endpoint at localhost:8080/v1/chat/completions with a payload that specifies a model aligned to your Qdrant collection.
- If you switch environments or paths, update the DB_CONFIG path in the MCP config accordingly to avoid connection errors.
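Since tiny-chat-api is described as an OpenAI Chat API server, a request to its chat endpoint can be sketched with the standard library alone. This assumes the endpoint accepts the usual OpenAI-style payload shape; the model name below is a placeholder for one aligned to your Qdrant collection:

```python
import json
import urllib.request

# Endpoint exposed by the tiny-chat-api RAG server (see notes above).
API_URL = "http://localhost:8080/v1/chat/completions"


def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }


def send_chat_request(payload: dict) -> dict:
    """POST the payload to the local tiny-chat-api server and parse the reply."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


if __name__ == "__main__":
    # "my-qdrant-collection" is a placeholder model name, not a real default.
    payload = build_chat_request("my-qdrant-collection", "こんにちは")
    print(json.dumps(payload, ensure_ascii=False, indent=2))
    # send_chat_request(payload)  # uncomment once tiny-chat-api is running
```

The send step is left commented out so the sketch can be inspected without a running server.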
Related MCP Servers
chunkhound
Local first codebase intelligence
VectorCode
A code repository indexing tool to supercharge your LLM experience.
mcp-pinecone
Model Context Protocol server that allows reading and writing from Pinecone. Rudimentary RAG.
nextcloud
Nextcloud MCP Server
Archive-Agent
Find your files with natural language and ask questions.
codebase-RAG
A Retrieval-Augmented Generation (RAG) Model Context Protocol (MCP) server designed to help AI agents and developers understand and navigate codebases. It supports incremental indexing and multi-language parsing, enabling LLMs to understand and interact with code.