mcp-ecosystem
MCP Ecosystem: Docker MCP Toolkit, IDE Configs & Presets for AIOS
claude mcp add --transport stdio synkraai-mcp-ecosystem -- \
  docker run -i \
    --env EXA_API_KEY="" \
    --env MCP_API_KEY="" \
    --env MCP_CONFIG_PATH="servers/exa.json" \
    mcp-ecosystem-exa
How to use
This MCP server collection provides configurable contexts for AIOS-related workflows, including context7 for up-to-date LLM documentation, desktop-commander for file and terminal operations, playwright for browser automation, and exa for AI-powered web search and content extraction. Each server is exposed as an MCP endpoint that can be consumed by any IDE or tool that supports the MCP protocol. To get started, run the corresponding container (or deploy the server in your preferred environment) and point your MCP client at the endpoints defined in each server's JSON configuration. The presets in the repository help you quickly assemble multi-server configurations for development, research, or full-task pipelines.
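As a concrete sketch, the other servers can be registered with a Claude Code-style client the same way as the exa command above. The image names (`mcp-ecosystem-*`) and config paths below are assumptions drawn from this README; match them to your actual build:

```shell
# Hedged sketch: register two of the servers with Claude Code's MCP CLI.
# Server and image names here are assumptions; adjust to your setup.
claude mcp add --transport stdio context7 -- \
  docker run -i --env MCP_CONFIG_PATH="servers/context7.json" mcp-ecosystem-context7
claude mcp add --transport stdio playwright -- \
  docker run -i --env MCP_CONFIG_PATH="servers/playwright.json" mcp-ecosystem-playwright
```

Each registered name then appears as a stdio MCP server inside the client.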
How to install
Prerequisites:
- Docker installed and running (or your preferred container runtime).
- Access to the repository files (clone or download).
Step-by-step:
- If you plan to run the servers natively (outside Docker), install each server's runtime dependencies first; otherwise just ensure Docker and your MCP client are configured.
- Review the server JSON files under mcp-ecosystem/servers to understand the exact endpoint names and any required API keys.
- Start each MCP server instance:
- For Docker-based runs, use the following example commands (adjust image names as needed):
  docker run -i mcp-ecosystem-context7
  docker run -i mcp-ecosystem-desktop-commander
  docker run -i mcp-ecosystem-playwright
  docker run -i mcp-ecosystem-exa
- If you’re using a non-Docker runtime, adapt the commands to launch the respective servers, ensuring they point to the JSON configurations located in servers/ (e.g., servers/context7.json).
- Configure your MCP client (IDE config or CLI) to point to the running server endpoints. Use the environment variables documented in each server config if keys or API keys are required.
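The client-side configuration in the last step can be sketched as a JSON entry. The file name `mcp-servers.json` and the `mcpServers` field below follow a common client convention and are assumptions, not something this repository mandates; verify against your client's documentation:

```shell
# Write a minimal client config entry for the exa server (field names follow
# the common "mcpServers" convention; check your MCP client's docs).
cat > mcp-servers.json <<'EOF'
{
  "mcpServers": {
    "exa": {
      "command": "docker",
      "args": ["run", "-i", "--env", "EXA_API_KEY", "mcp-ecosystem-exa"],
      "env": { "EXA_API_KEY": "" }
    }
  }
}
EOF
# Sanity-check that the file is valid JSON before pointing a client at it.
python3 -m json.tool mcp-servers.json > /dev/null && echo "config OK"
```

Leaving `EXA_API_KEY` empty here mirrors the command at the top of this page; fill it from your environment at launch time rather than committing a key.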
Additional notes
- The exa server requires an API key (EXA_API_KEY). Set this in the environment or provide via your MCP client configuration.
- Some servers may expose different transport types (SSE vs. stdio). Ensure your MCP client supports the corresponding transport.
- If you update any server JSON, restart the corresponding container or process to apply changes.
- The repository includes presets (aios-dev, aios-research, aios-full) to compose multi-server configurations quickly; choose one based on your task scope.
- For IDE integrations (Claude Code, Cursor), you can copy the sample configs from ide-configs to your editor’s configuration and merge them with your existing setup.
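For the exa key note above, one hedged way to wire the key through to the container (the image name is assumed from the install commands earlier on this page):

```shell
# Fail fast if the key is unset, then pass it through to the container.
: "${EXA_API_KEY:?set EXA_API_KEY before starting the exa server}"
docker run -i --env EXA_API_KEY mcp-ecosystem-exa
# After editing servers/exa.json, restart the container to apply the change.
```

Passing `--env EXA_API_KEY` without a value forwards the variable from the host environment, which keeps the key out of shell history and config files.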
Related MCP Servers
sandbox
A Model Context Protocol (MCP) server that enables LLMs to run ANY code safely in isolated Docker containers.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
dockashell
DockaShell is an MCP server that gives AI agents isolated Docker containers to work in. MCP tools for shell access, file operations, and full audit trail.
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
puppeteer
Self-hosted Puppeteer MCP server with remote SSE access, API key authentication, and Docker deployment. Complete tool suite for browser automation via Model Context Protocol.
nmap
MCP server for AI-powered network scanning with Nmap. Port scanning, service detection, OS fingerprinting, and vulnerability scanning for AI agents. By Vorota AI.