core
AI agent microservice
claude mcp add --transport stdio cheshire-cat-ai-core -- docker run --rm -it -p 1865:80 ghcr.io/cheshire-cat-ai/core:latest
How to use
Cheshire Cat AI is a framework that exposes an AI agent as a microservice. The Core server is provided as a Docker image and is designed to be API-first, with WebSocket-based chat, a REST admin API, built-in retrieval-augmented generation (RAG) via Qdrant, and extensibility via plugins. It supports multi-user scenarios with granular permissions and can be accessed locally at the mapped port. Once running, you can open the admin panel to chat with the agent, explore the docs, and use the REST API endpoints to interact programmatically. The server is containerized, which makes deployment consistent across environments, and the project emphasizes an extensible plugin ecosystem and form-based conversational flows.
To use: start the container, then connect to http://localhost:1865/admin for the admin panel and http://localhost:1865/docs for the API documentation. The WebSocket channel can be used for real-time chat experiences, while the REST API provides programmatic access for integrations. If you need to customize behavior, you can develop plugins and tools that extend the agent’s capabilities, or configure the admin panel to manage agents and permissions.
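As a concrete illustration of the real-time chat channel, here is a minimal Python sketch. The WebSocket path (`/ws`), the JSON frame shape (`{"text": ...}`), and the third-party `websockets` package are assumptions; verify the actual endpoint and payload schema against http://localhost:1865/docs before relying on them.

```python
# Sketch of a chat round trip over the Core WebSocket channel.
# Assumptions (check /docs): the socket lives at ws://localhost:1865/ws
# and accepts JSON frames like {"text": "..."}.
import asyncio
import json

WS_URL = "ws://localhost:1865/ws"

def build_chat_message(text: str) -> str:
    """Serialize a user utterance into the JSON frame the socket expects."""
    return json.dumps({"text": text})

async def chat_once(text: str) -> dict:
    # Requires the container from the install section to be running,
    # plus the third-party `websockets` package (pip install websockets).
    import websockets
    async with websockets.connect(WS_URL) as ws:
        await ws.send(build_chat_message(text))
        return json.loads(await ws.recv())

if __name__ == "__main__":
    print(asyncio.run(chat_once("Hello, Cat!")))
```

For one-off integrations the REST API documented at /docs may be simpler than holding a WebSocket open; the socket is the better fit for interactive, streaming chat.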
How to install
Prerequisites:
- Docker installed on your machine (Docker Desktop for Windows/macOS, or Docker Engine for Linux).
- Optional: docker-compose if you prefer a compose-based setup.
Install and run (docker):
- Pull and run the latest image:
docker run --rm -it -p 1865:80 ghcr.io/cheshire-cat-ai/core:latest
- Access the admin interface and docs:
- Admin: http://localhost:1865/admin
- API docs: http://localhost:1865/docs
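After starting the container, it can take a moment for the server to accept connections. The following sketch polls the mapped port with only the standard library; it assumes a plain HTTP GET on the root URL returns a 2xx status once the server is ready (the exact root payload may differ, see /docs).

```python
# Readiness probe for a freshly started Core container.
# Assumption: GET on the mapped port returns 2xx once the server is up.
from urllib.request import urlopen
from urllib.error import URLError

def is_up(url: str = "http://localhost:1865/", timeout: float = 2.0) -> bool:
    """Return True if the server answers with a 2xx status, False otherwise."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (URLError, OSError):
        return False

if __name__ == "__main__":
    print("Core is up" if is_up() else "Core is not reachable yet")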
Optional with Docker Compose (advanced, with volumes): create a docker-compose.yml like:

version: '3.8'
services:
  cheshire-core:
    image: ghcr.io/cheshire-cat-ai/core:latest
    ports:
      - "1865:80"
    restart: unless-stopped
    volumes:
      - ./data:/data
Then start with:
docker-compose up -d
Notes:
- On the mapped port (1865), the container serves the admin UI at /admin and the API documentation at /docs.
- If you need persistent storage or custom configurations, mount volumes as shown above.
Additional notes
Tips and common issues:
- Ensure Docker is running and you have network access to pull the ghcr.io image.
- If the admin UI does not load, check docker logs for the container to diagnose startup errors.
- The server maps host port 1865 to the container's port 80; make sure that port is not used by another service.
- For production deployments, consider using docker-compose with a volume for data persistence and TLS/reverse proxy in front of the service.
- Refer to the official docs and Discord for community support if you hit integration or plugin issues.
- If you need to customize the agent behavior, explore plugins, tools, and conversational forms supported by the framework.
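To give a flavor of plugin-based customization, here is a sketch of a tool a plugin might register. The import path `cat.mad_hatter.decorators` and the `@tool` decorator reflect the framework's documented plugin API, but treat them as assumptions and confirm against the official plugin docs; the no-op fallback decorator below only exists so the sketch runs standalone for illustration.

```python
# Sketch of a plugin tool, assuming the decorator-based plugin API
# (from cat.mad_hatter.decorators import tool). The except branch is an
# illustrative stand-in so this file runs outside a Cat plugin folder.
try:
    from cat.mad_hatter.decorators import tool
except ImportError:
    def tool(fn=None, **kwargs):
        # Stand-in decorator: supports both @tool and @tool(...) usage.
        return fn if fn is not None else (lambda f: f)

@tool
def socks_price(color: str, cat=None) -> str:
    """Reply with the price of socks in a given color.
    Input is the sock color requested by the user."""
    prices = {"white": 5, "black": 7}
    return f"{color} socks cost {prices.get(color.lower(), 10)} euros"
```

Dropped into a plugin folder, a function like this would become callable by the agent during conversation; forms and hooks follow a similar decorator pattern.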
Related MCP Servers
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
aci
ACI.dev is the open source tool-calling platform that hooks up 600+ tools into any agentic IDE or custom AI agent through direct function calling or a unified MCP server. The birthplace of VibeOps.
argo
ARGO is an open-source AI Agent platform that brings Local Manus to your desktop. With one-click model downloads, seamless closed LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking, task planning, and 100% of your data stays locally. Support Win/Mac/Docker.
lihil
2X faster ASGI web framework for python, offering high-level development, low-level performance.
mem0
✨ mem0 MCP Server: A memory system using mem0 for AI applications with Model Context Protocol (MCP) integration. Enables long-term memory for AI agents as a drop-in MCP server.
quarkus-workshop-langchain4j
Quarkus Langchain4J Workshop