Risuai
Make your own story. User-friendly software for LLM roleplaying
claude mcp add --transport stdio kwaroran-risuai \
  --env DOCKER_HOST="tcp://your-docker-host:2375" \
  -- docker compose -f docker-compose.yml up -d
(DOCKER_HOST is optional; set it only if Docker runs on a remote host.)
How to use
Risuai is a cross-platform AI chat application designed to run via Docker Compose. It supports multiple AI providers, such as OpenAI and other compatible backends, and offers emotion imagery, group chats, plugins, regex-based output manipulation, translation, lorebooks for memory, themes, and TTS. After starting the service with Docker Compose, open the web UI to create chats with one or more characters, manage memory, customize prompts, attach assets to chats, and configure providers and plugins. The app is web-first with a friendly UI, making it straightforward to experiment with different AI providers, memory modes, and prompting strategies in a single place.
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine
- Git (optional, for cloning the repo)
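The prerequisites above can be checked from a shell before you start; the snippet below only looks for the standard `docker` and `git` binaries on PATH and reports what it finds:

```shell
#!/bin/sh
# Check that the prerequisites are available before installing.
status=""
for tool in docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    status="$status $tool:found"
  else
    status="$status $tool:missing"
  fi
done
echo "prerequisite check:$status"
```

If `docker` is reported missing, install Docker first; `git` is only needed if you plan to clone the repository rather than fetch the compose file directly.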
Installation steps:
- Ensure Docker is running on your system.
- Use the Docker Compose method shown in the project README to start Risuai:
  curl -L https://raw.githubusercontent.com/kwaroran/Risuai/refs/heads/main/docker-compose.yml | docker compose -f - up -d
- Open your browser and navigate to the deployment URL. If you followed the default Docker setup, the web UI will be accessible at http://localhost:6001. You can then configure providers, memories, themes, and other features through the UI.
Optional: If you prefer to run the containers manually, you can adapt the docker-compose.yml to your environment and run docker compose up -d in the directory containing the file.
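If you adapt the compose file, it helps to know the shape such a file typically takes. The sketch below is illustrative only: the service name, image reference, and volume path are assumptions, and the repository's own docker-compose.yml is authoritative for the real values. The "6001:6001" port mapping matches the default URL above; changing the host side (e.g. to "7001:6001") is how you resolve a port conflict.

```yaml
# Illustrative sketch only — consult the repository's docker-compose.yml.
services:
  risuai:                                  # assumed service name
    image: ghcr.io/example/risuai:latest   # placeholder image reference
    ports:
      - "6001:6001"                        # host:container; host side is what you'd change
    volumes:
      - risuai-data:/app/data              # assumed persistent data path
    restart: unless-stopped
volumes:
  risuai-data:
```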
Additional notes
Tips and notes:
- The Docker setup relies on a docker-compose.yml present in the repository; read it to understand service names, ports, and volumes.
- The default access URL is http://localhost:6001, but this may vary if you customize ports in docker-compose.yml or environment variables.
- If you encounter port conflicts, adjust the port mappings in the docker-compose file or set environment variables accordingly.
- The app supports multiple AI providers; check the providers list in the UI to enable/disable specific backends and supply API keys as needed.
- For long-running conversations, consider memory configurations (e.g., HypaMemory/SupaMemory options) to manage context windows.
- If using a remote Docker host, ensure the DOCKER_HOST environment variable is properly set and that the host accepts remote API connections securely.
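The remote-host tip above can be sketched as a small wrapper that sanity-checks the address before handing it to Docker. The host address is a placeholder, and the compose command is left commented out so the sketch is safe to run on any machine:

```shell
#!/bin/sh
# Point the Docker CLI at a remote engine; prefer TLS (tcp:// with certs) or ssh:// in production.
DOCKER_HOST="tcp://your-docker-host:2375"   # placeholder address
case "$DOCKER_HOST" in
  tcp://*:*|ssh://*) echo "DOCKER_HOST looks valid: $DOCKER_HOST" ;;
  *) echo "DOCKER_HOST should look like tcp://host:port or ssh://user@host" >&2; exit 1 ;;
esac
export DOCKER_HOST
# docker compose -f docker-compose.yml up -d   # run once DOCKER_HOST points at a reachable engine
```

Exposing the Docker API over plain tcp:// is unauthenticated; on untrusted networks, use ssh:// or TLS client certificates instead.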
Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
robot_MCP
A simple MCP server for SO-ARM100 control
mcp-chat-studio
A powerful MCP testing tool with multi-provider LLM support (Ollama, OpenAI, Claude, Gemini). Test, debug, and develop MCP servers with a modern UI.