ai-suite
AI-Suite - n8n, Open WebUI, OpenCode, Llama.cpp/Ollama, Flowise, Langfuse, MCP Gateway and more!
claude mcp add --transport stdio trevorsandy-ai-suite -- docker compose up -d
How to use
AI-Suite is a self-hosted, pre-configured Docker Compose stack for building local AI workflows end to end. It bundles the n8n automation platform, Open WebUI for private model interaction, OpenCode for in-terminal code assistance, Ollama and LLaMA.cpp for running local LLMs, and a suite of data services (PostgreSQL, Supabase, Redis/Valkey, Qdrant, ClickHouse, MinIO, Neo4j, Flowise, and more) for AI agent pipelines. The MCP server entry here assumes you run the Docker Compose stack with a single command, which brings up all services in the background. Once running, you can use n8n for orchestration, Open WebUI for chat-style interaction with local models, and Flowise to visually assemble agent workflows that leverage the integrated tools and databases.
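For orientation, here is a heavily trimmed sketch of the shape such a Compose file typically takes. The service names, images, and ports below are illustrative assumptions, not the repository's actual docker-compose.yml; consult the cloned repo for the real definitions.

```yaml
# Illustrative fragment only -- the real ai-suite compose file defines
# many more services, volumes, and options.
services:
  n8n:
    image: n8nio/n8n          # workflow orchestration UI
    ports:
      - "5678:5678"
    depends_on:
      - postgres
  postgres:
    image: postgres:16        # shared relational store
    env_file: .env            # credentials come from your edited .env
  ollama:
    image: ollama/ollama      # local LLM runtime
```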
How to install
Prerequisites:
- Docker and Docker Compose installed on your host (Docker Desktop on Windows/macOS includes Compose).
- Git to clone the repository.
Step 1: Clone the repository
git clone https://github.com/trevorsandy/ai-suite.git
cd ai-suite
Step 2: Install and configure environment variables
- Copy the example env file and edit as needed (these cover credentials for n8n, Supabase, PostgreSQL, and other services).
cp .env.example .env
- Open .env and update secrets and passwords as required by your environment.
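As a hedged illustration of the kind of values involved (the variable names below are assumptions for illustration; use the names actually present in the repository's .env.example):

```shell
# Hypothetical .env entries -- match them to .env.example, do not copy blindly.
# Generate strong random values, e.g. with: openssl rand -hex 32
POSTGRES_PASSWORD=<long-random-string>
JWT_SECRET=<long-random-string>
```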
Step 3: Start the stack
docker compose up -d
Step 4: Verify and access services
- Wait a few minutes for containers to initialize.
- Access n8n at http://localhost:5678 (its default port; it may be proxied by your Docker setup).
- Access Open WebUI at http://localhost:PORT for private model interactions (the port depends on your configuration).
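Since containers take a while to initialize, a small retry helper can make the verification step scriptable. This is a minimal sketch, not part of the repository; the URL in the usage line assumes n8n's default port.

```shell
#!/bin/sh
# wait_for TRIES CMD [ARGS...] -- retry CMD up to TRIES times, 2s apart.
# Returns 0 as soon as CMD succeeds, 1 if it never does.
wait_for() {
  tries=$1; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@" >/dev/null 2>&1; then
      return 0
    fi
    i=$((i + 1))
    sleep 2
  done
  return 1
}
```

Usage, once the stack is starting: `wait_for 30 curl -fsS http://localhost:5678 && echo "n8n is up"`.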
Optional: If you need to stop the stack later, run
docker compose down
Additional notes
Tips:
- Ensure your host has at least 32 GB RAM for smooth operation, with 20 GB+ free disk space.
- If you are behind a proxy, configure Docker and your .env with appropriate proxy settings.
- The included Open WebUI, Ollama, and LLaMA.cpp components can consume substantial RAM; adjust resource limits in your Docker setup if needed.
- If Docker Compose reports version issues, confirm that your Docker installation includes the docker compose subcommand (or install a standalone compatible version).
- Manage credentials securely; never commit .env files with real secrets to version control.
- For troubleshooting, check container logs with
docker compose logs -f
and verify that dependent services (DBs, Redis, etc.) are healthy before relying on downstream agents.
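The health-check tip above can be scripted. The sketch below filters Compose's service listing down to anything not in the "running" state; it assumes a recent Compose v2 whose --format flag accepts Go templates, and the service names in the test data are hypothetical.

```shell
#!/bin/sh
# Read "name state" pairs on stdin and print the names of services
# that are not in the "running" state.
not_running() {
  awk '$2 != "running" { print $1 }'
}

# Intended use (requires a live stack, so shown as a comment):
#   docker compose ps --format '{{.Name}} {{.State}}' | not_running
```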