JeecgBoot
[AI Low-Code Platform] An AI low-code platform that empowers enterprises to quickly develop low-code solutions and build AI applications. The AI application platform covers AI applications, AI models, AI chat assistants, knowledge bases, AI workflow orchestration, MCP and plugins, and chat-style business operations. A powerful code generator produces front-end and back-end code in one click, with no hand-written code required, significantly improving efficiency and cutting costs without sacrificing flexibility.
claude mcp add --transport stdio jeecgboot-jeecgboot docker run -i jeecgboot/jeecgboot
How to use
JeecgBoot is an enterprise AI low-code platform with MCP integration that enables quick deployment and management of AI-driven applications. This MCP server entry allows you to launch JeecgBoot via a container image and access its MCP-enabled capabilities, including online form design, AI-driven workflows, data visualization, and multi-model AI interactions. Use the Docker-based one-click start to spin up the platform, then interact with the admin UI to configure projects, roles, data sources, and MCP plugins. The platform integrates AI models, knowledge bases, and workflow design to accelerate low-code development with AI-assisted features.
Once running, you can leverage JeecgBoot’s MCP-enabled tooling to create and manage AI applications, configure MCP plugins, set up multi-tenant access, and define RBAC permissions. The system exposes a web UI and APIs for programmatic access, allowing you to seed data, customize forms, design processes, and monitor AI workflows. Refer to the official docs linked in the README for detailed steps on project setup, security configuration, and integrating external AI models like ChatGPT, DeepSeek, or Ollama through MCP plugins.
How to install
Prerequisites:
- Docker (and optionally Docker Compose) installed on your host
- Internet access to pull the JeecgBoot image
Installation steps:
- Pull and run the JeecgBoot image (Docker): docker run -d --name jeecgboot -p 8080:8080 jeecgboot/jeecgboot
- Wait for the container to start, then access the UI at http://localhost:8080
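Rather than guessing when the container is ready, you can poll until the UI answers. This is a minimal sketch using standard docker and curl commands; the URL assumes the default port mapping from the run command above, and the actual startup time depends on your host and database.

```shell
#!/bin/sh
# Poll a command until it succeeds or the retry budget is exhausted.
wait_for() {
  tries=$1; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example: wait up to 120s for the UI to respond (port mapping assumed
# from the docker run command above), tailing logs if it never comes up.
# wait_for 120 curl -fsS http://localhost:8080 || docker logs jeecgboot
```

The commented-out lines show the intended usage; uncomment them once the container from the previous step is running.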
- Initial setup:
  - Create an admin user (default admin/admin, or as configured by the image)
  - Configure data sources (MySQL/PostgreSQL/etc.) as required by your deployment
  - Configure MCP plugins and AI models from the JeecgBoot admin console
- Optional: For production, consider using Docker Compose or Kubernetes for orchestration and persistent storage. See the official Docker quick start guide referenced in the README for microservice deployment options.
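As a starting point for the Compose route, the script below writes a minimal docker-compose.yml that pairs JeecgBoot with a MySQL container and named volumes for persistence. The service names, database name, and MySQL credentials here are placeholder assumptions, not the official configuration; consult the quick start guide referenced in the README for the real database setup expected by the image.

```shell
#!/bin/sh
# Sketch only: generate a minimal Compose file for JeecgBoot + MySQL.
# Service names, DB name, and credentials below are assumptions.
cat > docker-compose.yml <<'EOF'
services:
  db:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: change-me   # placeholder, use a real secret
      MYSQL_DATABASE: jeecg-boot       # assumed database name
    volumes:
      - db-data:/var/lib/mysql         # persist DB data across restarts
  jeecgboot:
    image: jeecgboot/jeecgboot
    ports:
      - "8080:8080"
    depends_on:
      - db
volumes:
  db-data:
EOF
# Then bring the stack up:
# docker compose up -d
```

You would still need to point JeecgBoot at the db service through whatever datasource configuration the image supports (see the admin console step above).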
Additional notes
Tips and common considerations:
- The JeecgBoot image may require external database services; ensure DB connectivity and proper migrations.
- Check security settings (RBAC, API keys, and OpenAPI access) before exposing the UI or APIs to the internet.
- When using MCP plugins, verify compatibility with your chosen AI models (ChatGPT, DeepSeek, Ollama, etc.) and manage API keys securely.
- For multi-tenant SaaS deployments, configure the SaaS tenancy model early in the setup to avoid data leakage between tenants.
- Refer to the official docs for UI components, online forms, reports, dashboards, and AI modules to maximize the platform’s capabilities.
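One simple way to act on the "manage API keys securely" tip above is to keep model credentials out of the image and out of your shell history by using a Docker env file. The variable name OPENAI_API_KEY below is a hypothetical example; use whatever variable names your configured MCP plugins and AI model integrations actually read.

```shell
#!/bin/sh
# Sketch: keep AI model API keys in a root-owned env file rather than
# baking them into the image or passing them inline on the command line.
# OPENAI_API_KEY is a hypothetical variable name for illustration.
cat > jeecg.env <<'EOF'
OPENAI_API_KEY=replace-me
EOF
chmod 600 jeecg.env   # restrict read access to the owner

# Pass the file to the container at startup:
# docker run -d --env-file jeecg.env -p 8080:8080 jeecgboot/jeecgboot
```

Remember to exclude jeecg.env from version control (e.g. via .gitignore) and rotate keys if they are ever exposed.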
Related MCP Servers
ragflow
RAGFlow is a leading open-source Retrieval-Augmented Generation (RAG) engine that fuses cutting-edge RAG with Agent capabilities to create a superior context layer for LLMs
lobehub
The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.
archestra
Secure cloud-native MCP registry, gateway & orchestrator
tuui
A desktop MCP client designed as a tool unitary utility integration, accelerating AI adoption through the Model Context Protocol (MCP) and enabling cross-vendor LLM API orchestration.