
Station

Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.

Installation
Run this command in your terminal to add the Station MCP server to Claude Code:
claude mcp add --transport stdio cloudshipai-station stn stdio \
  --env OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"

How to use

Station is an AI agent orchestration platform that lets you run and coordinate multiple AI agents as a team. It provides a command-line interface (stn) and a web UI for configuring agents, enabling a local MCP (Model Context Protocol) workflow, and observing runs with Jaeger traces. With Station you can connect from editors and tools via MCP clients, deploy a Git-backed workspace, and manage agent orchestration through a unified interface. The platform supports built-in evaluation, observability, and a scripted onboarding experience to help teams prototype and scale agent workflows quickly.

To use Station, first install it, initialize it with your preferred AI provider, and start Jaeger for tracing. Then connect your MCP client (for example Claude Code CLI, OpenCode, Cursor, or Claude Desktop) by adding Station as an MCP server in your editor or project configuration. From your AI assistant you can discover and invoke the 41 available MCP tools, ranging from agent creation and orchestration to templates, evaluation, and deployment commands. The workflow emphasizes Git-backed configurations and reproducible agent runs, with a web UI at http://localhost:8585 and full observability via Jaeger at http://localhost:16686.
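For Claude Desktop specifically, MCP servers are declared in its claude_desktop_config.json. A minimal sketch of a Station entry, assuming the standard mcpServers schema and a local stn binary on your PATH:

```json
{
  "mcpServers": {
    "station": {
      "command": "stn",
      "args": ["stdio"],
      "env": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
```

Restart Claude Desktop after editing the file so it reloads the server list.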

Once connected, you can trigger interactions like listing available Station MCP tools, creating multi-agent hierarchies, and inspecting run details. Station also ships with an interactive onboarding guide to walk you through creating a Hello World agent, using faker data for safe development, and building a multi-agent incident coordinator. This enables you to experiment locally, share configurations via Git, and iterate on agent orchestration workflows effortlessly.

How to install

Prerequisites:

  • Docker (for Jaeger and optional local deployments)
  • A supported AI provider: CloudShip AI, OpenAI, Google Gemini, or Anthropic
  • A shell with curl and Git
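The prerequisites above can be sanity-checked from the shell before running the installer. check_cmd here is a hypothetical helper, not part of Station itself:

```shell
# Minimal pre-flight sketch: verify the prerequisite tools are on PATH.
check_cmd() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1" >&2
    return 1
  fi
}

for tool in docker curl git; do
  check_cmd "$tool" || true   # report but keep checking the rest
done
```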

Installation steps:

  1. Install Station (via the provided installer script):

     curl -fsSL https://raw.githubusercontent.com/cloudshipai/station/main/install.sh | bash

  2. Initialize Station with a provider (choose one):

  • CloudShip AI (recommended):

     # Set your CloudShip key
     export CLOUDSHIPAI_REGISTRATION_KEY="csk-..."
     # or: export STN_CLOUDSHIP_KEY="csk-..."
     stn init --provider cloudshipai --ship

  • OpenAI API key:

     export OPENAI_API_KEY="sk-..."
     stn init --provider openai --ship

  • Google Gemini:

     export GEMINI_API_KEY="..."
     stn init --provider gemini --ship

  3. Start Jaeger for tracing (optional but recommended):

     stn jaeger up

This will expose the Jaeger UI at http://localhost:16686.
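If you prefer to manage Jaeger yourself rather than use stn jaeger up, an illustrative stand-in with plain Docker is roughly the following (the image tag and ports are standard Jaeger all-in-one defaults, not Station-specific):

```shell
# Jaeger all-in-one with the UI (16686) and OTLP/HTTP ingest (4318,
# matching OTEL_EXPORTER_OTLP_ENDPOINT) published locally.
docker run -d --name jaeger \
  -p 16686:16686 \
  -p 4318:4318 \
  jaegertracing/all-in-one:latest
```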

  4. Connect an MCP client to Station:
  • For Claude Code CLI or other editors, follow the editor-specific steps in the Station docs to add the station MCP server and set OTEL_EXPORTER_OTLP_ENDPOINT if desired.
  • A typical MCP client configuration uses the station MCP server with the command and arguments to start stdio-based communication, for example:
{
  "mcp": {
    "station": {
      "enabled": true,
      "type": "local",
      "command": ["stn", "stdio"],
      "environment": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
  5. Verify the setup:
  • Ensure the editor or tool can list Station MCP tools and begin interactions like creating agents and workflows.
  • Open the Station web UI at http://localhost:8585 to review configuration and agent dashboards.
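Once both services are up, a quick reachability check from the shell (the ports are the defaults mentioned above; adjust if yours differ):

```shell
# Sanity-check that the Station web UI and Jaeger UI answer locally.
curl -sf http://localhost:8585 >/dev/null && echo "Station UI reachable"
curl -sf http://localhost:16686 >/dev/null && echo "Jaeger UI reachable"
```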

Additional notes

Tips and common issues:

  • If Jaeger UI is not visible, make sure Docker is running and Jaeger services are started with stn jaeger up.
  • The OTEL_EXPORTER_OTLP_ENDPOINT environment variable enables tracing data export; adjust if you run a different collector endpoint.
  • Station supports Git-backed workflows; commit your agent configurations to your repository to track changes over time.
  • Ensure your chosen AI provider keys are valid and properly exported in your environment before running stn init.
  • The MCP client configuration examples show local development usage; adjust paths, endpoints, and config files for your production environment.
  • If you encounter authentication or model access issues, verify provider keys and model availability for your selected provider.
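To catch missing provider keys before stn init runs, a hypothetical pre-flight helper like the following can fail fast (the variable names match the examples above; require_env is illustrative, not a Station command):

```shell
# Fail fast if a provider key environment variable is unset or empty.
require_env() {
  eval "_val=\${$1:-}"
  if [ -z "$_val" ]; then
    echo "error: $1 is not set" >&2
    return 1
  fi
  echo "ok: $1 is set"
}

# Usage: require_env OPENAI_API_KEY && stn init --provider openai --ship
```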
