

Octopus Deploy Official MCP Server

Installation
Run this command in your terminal to add the MCP server to Claude Code (the -- separator keeps the server's own flags from being parsed by the Claude CLI):

claude mcp add --transport stdio octopusdeploy-mcp-server -- npx -y @octopusdeploy/mcp-server --server-url https://your-octopus.com --api-key YOUR_API_KEY

How to use

This MCP server exposes a suite of Octopus Deploy tools to an AI-enabled client via the Model Context Protocol. It acts as a bridge between your AI assistant and your Octopus Deploy instance, enabling read-only discovery and investigation and, when enabled, write operations such as creating releases or deployments. The server supports a range of toolsets (core, projects, deployments, releases, tasks, tenants, Kubernetes, machines, certificates, accounts) and runs in read-only mode by default for safety. You can selectively enable toolsets to tailor what your AI assistant can do within Octopus. There are also URL-based tools that resolve deployments and tasks from their public URLs, returning structured context and logs to aid troubleshooting.

To use it, configure the MCP server as a stdio-based process (via Node’s npx in this project) or run it through Docker as shown in the installation guide. Your AI client will then communicate with the MCP server using the standard mcpServers configuration, supplying the Octopus Server URL and API key (or leveraging environment variables if you choose to provide them). The resulting capabilities let the AI fetch deployment details, examine tasks, manage releases, and query environments, projects, and deployments in a consistent, protocol-driven manner.

How to install

Prerequisites:

  • Access to an Octopus Deploy instance with an API key that has appropriate permissions for the desired operations
  • Node.js installed (for npx usage) or Docker (for containerized execution)
  • Basic familiarity with MCP configuration in your AI client

Option 1: Install and run via Docker

  1. Install Docker on your machine
  2. Run the MCP server with environment variables or CLI arguments:

     docker run -i --rm -e OCTOPUS_API_KEY=your-key -e OCTOPUS_SERVER_URL=https://your-octopus.com octopusdeploy/mcp-server

     or

     docker run -i --rm octopusdeploy/mcp-server --server-url https://your-octopus.com --api-key YOUR_API_KEY
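The Docker invocation above can also be expressed in the standard mcpServers configuration so your AI client launches the container itself. This is a minimal sketch, assuming your client supports an env block; passing -e VAR without a value tells Docker to forward that variable from the launching environment, which keeps the key out of the argument list:

```json
{
  "mcpServers": {
    "octopusdeploy": {
      "type": "stdio",
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "OCTOPUS_API_KEY", "-e", "OCTOPUS_SERVER_URL", "octopusdeploy/mcp-server"],
      "env": {
        "OCTOPUS_API_KEY": "your-key",
        "OCTOPUS_SERVER_URL": "https://your-octopus.com"
      }
    }
  }
}
```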

Option 2: Install and run via Node (npx)

Prerequisites: Node.js >= v20.0.0 and npm

  1. Ensure you have access to your Octopus Deploy instance and an API key
  2. Run via npx with required arguments (adjust values to your environment): npx -y @octopusdeploy/mcp-server --server-url https://your-octopus.com --api-key YOUR_API_KEY
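If you prefer to keep credentials off the command line, the Docker instructions above suggest the server also reads OCTOPUS_SERVER_URL and OCTOPUS_API_KEY from the environment. Assuming the npm package honors the same variables (confirm against the package documentation), a sketch:

```shell
# Assumption: the npm package reads the same env vars as the Docker image
export OCTOPUS_SERVER_URL="https://your-octopus.com"
export OCTOPUS_API_KEY="YOUR_API_KEY"
npx -y @octopusdeploy/mcp-server
```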

Option 3: Prepare a configuration file for your AI client

Create a configuration snippet like:

{
  "mcpServers": {
    "octopusdeploy": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@octopusdeploy/mcp-server", "--server-url", "https://your-octopus.com", "--api-key", "YOUR_API_KEY"]
    }
  }
}

Replace the --server-url and --api-key values with your own, or rely on environment variables as described in the documentation.
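As an alternative sketch, the snippet can supply credentials through the configuration's env block instead of CLI arguments, assuming the server reads OCTOPUS_SERVER_URL and OCTOPUS_API_KEY as the Docker instructions indicate:

```json
{
  "mcpServers": {
    "octopusdeploy": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@octopusdeploy/mcp-server"],
      "env": {
        "OCTOPUS_SERVER_URL": "https://your-octopus.com",
        "OCTOPUS_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```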

Additional notes


  • By default, the MCP Server runs in read-only mode to prevent unintended changes. To enable write operations (like creating releases or deploying), you must explicitly disable read-only mode using --no-read-only and ensure your API key has appropriate permissions.
  • If you’re running Docker on Apple Silicon or other ARM platforms, some configurations add a --platform flag (e.g., --platform linux/amd64); watch for compatibility notes in the docs.
  • Toolsets determine which categories of Octopus operations are exposed. If you’re unsure what you need, start with the core and projects toolsets and expand as necessary.
  • When using write-enabled tools, use the principle of least privilege for API keys. Never reuse highly privileged keys in uncontrolled environments.
  • The MCP server supports URL-based tools to quickly fetch deployment or task context from Octopus URLs, which can streamline troubleshooting workflows for AI assistants.
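For example, a write-enabled launch via npx might look like the following sketch; --no-read-only is the flag described above, and the API key must carry the corresponding Octopus permissions:

```shell
npx -y @octopusdeploy/mcp-server \
  --server-url https://your-octopus.com \
  --api-key YOUR_API_KEY \
  --no-read-only
```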
