autoteam
Orchestrate AI agents with YAML-driven workflows via universal Model Context Protocol (MCP)
claude mcp add --transport stdio diazoxide-autoteam -- docker run -i diazoxide/autoteam
How to use
AutoTeam is an MCP hub that orchestrates AI agents across multiple services and platforms. It connects MCP-enabled agents (such as Claude Code, Gemini CLI, and Qwen Code) to external systems through a centralized MCP server. By running the AutoTeam MCP server in a container, you expose a universal integration point where agents can collaborate, share state, and execute multi-service workflows without bespoke integrations.

To use it, install the AutoTeam MCP server (via Docker in this configuration), then configure MCP endpoints, agents, and service MCPs in your MCP network. Once running, agents can be assigned tasks that span multiple platforms (GitHub, Slack, databases, CMSs, etc.), and AutoTeam coordinates parallel execution, routing work to the appropriate MCPs and agents as defined by your workflows.
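As a rough sketch, a YAML workflow configuration for such a hub might look like the following. Every file name and field shown here (agents, mcp_servers, workflows, and their sub-keys) is an assumption for illustration only, not the project's actual schema; consult docs/configuration.md for the real format:

```yaml
# Hypothetical autoteam.yaml — all field names below are illustrative
# assumptions, not AutoTeam's documented schema.
agents:
  - name: reviewer
    tool: claude-code      # an MCP-enabled agent to route work to
  - name: triager
    tool: gemini-cli
mcp_servers:
  github:
    command: docker
    args: ["run", "-i", "ghcr.io/github/github-mcp-server"]
  slack:
    command: docker
    args: ["run", "-i", "example/slack-mcp"]   # placeholder image
workflows:
  - name: issue-triage
    agents: [triager, reviewer]
    services: [github, slack]
```

The idea is that workflows bind agents to the service MCPs they may call, so adding a new platform means registering one more MCP server rather than writing a bespoke integration per agent.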
How to install
Prerequisites:
- Docker installed and running on your host
- Network access to pull Docker images from Docker Hub
Installation steps:
- Pull and run the AutoTeam MCP server as a container: docker run -i diazoxide/autoteam
- To customize behavior, create a local MCP network configuration file and mount it into the container (depending on the image's supported config method).
- Verify the server is up by checking the container status and logs: docker ps, then docker logs <container_id>
- Integrate with your MCP ecosystem by registering MCP endpoints (GitHub MCP, Slack MCP, Database MCP, etc.); see docs/configuration.md for guidance.
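Taken together, the steps above might look like the following in practice. The container name and the in-container config path are assumptions for illustration; check the image documentation for the path it actually reads:

```shell
# Run the AutoTeam MCP server in the background with a mounted config.
# The mount target (/app/autoteam.yaml) is an assumed path, not a
# documented one — verify it against the image docs.
docker run -d --name autoteam \
  -v "$(pwd)/autoteam.yaml:/app/autoteam.yaml" \
  diazoxide/autoteam

# Verify the container is running and inspect its startup logs.
docker ps --filter name=autoteam
docker logs autoteam
```

Running with -d rather than -i is only useful for a standalone deployment; an MCP client that speaks stdio (as in the claude mcp add command above) will start its own -i instance instead.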
Note: If you prefer to build locally, you will need Go tooling to build the project from source, then run the resulting binary. The Docker approach is recommended for simplicity and isolation.
Additional notes
Tips and considerations:
- Environment variables: If you need to customize behavior (endpoints, tokens, timeouts), use environment variables supported by the image (refer to the image docs). The default setup assumes standard MCP endpoints are reachable from the container.
- Networking: Ensure the container can reach external MCP services (GitHub MCP, Slack MCP, etc.) and any target platforms.
- Scaling: AutoTeam is designed to orchestrate across multiple agents; consider running additional containers or a cluster setup to handle higher workloads.
- Troubleshooting: Check container logs for errors related to agent registration, MCP discovery, or network access. Look for missing token errors or refused connections, which typically indicate misconfigured endpoints or permissions.
- Security: Use least-privilege tokens for MCP integrations and secure your container runtime following best practices.
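For example, tokens and endpoint overrides can be passed at run time as environment variables. The variable names below are assumptions for illustration only; check the image documentation for the variables it actually honors:

```shell
# Pass secrets into the container at run time instead of baking them
# into the config file. GITHUB_TOKEN and SLACK_TOKEN are illustrative
# assumptions, not documented AutoTeam variables.
docker run -i \
  -e GITHUB_TOKEN="$GITHUB_TOKEN" \
  -e SLACK_TOKEN="$SLACK_TOKEN" \
  diazoxide/autoteam
```

Sourcing tokens from the host environment this way keeps them out of mounted files and supports the least-privilege guidance above.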