
Linode MCP Server

A Model Context Protocol (MCP) server that enables AI assistants like Claude to interact with your Linode Cloud Platform environment.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio takashito-linode-mcp-server npx -y @takashito/linode-mcp-server --token YOUR_LINODE_API_TOKEN

How to use

This MCP server acts as a bridge between your AI assistant and Linode cloud resources, letting you manage instances, volumes, networking, and other Linode services through natural language. Built on the FastMCP framework, it supports multiple transports (stdio, SSE, and HTTP streaming), so you can connect from Claude Desktop, VSCode Copilot, or other clients.

To get started, provide a Linode API token and run the server via npx. The server exposes a set of tools categorized by Linode service, and you can tailor the available capabilities with the --categories flag to include only the tools you need (for example: instances, volumes, regions). Tools are exposed as commands such as list_instances, create_instance, and delete_instance, with more for each service category, allowing you to list resources, create new ones, update configurations, and perform maintenance operations directly from your AI client.

How to install

Prerequisites:

  • Node.js and npm installed on your machine (latest LTS recommended)
  • A Linode API token with appropriate permissions

Installation steps:

  1. Install and run the MCP server using npx (no global install required):

    npx @takashito/linode-mcp-server --token YOUR_LINODE_API_TOKEN
    
  2. Alternatively, to pin the dependency locally, install via npm and run from the local package:

    mkdir my-linode-mcp && cd my-linode-mcp
    npm init -y
    npm install @takashito/linode-mcp-server
    npx @takashito/linode-mcp-server --token YOUR_LINODE_API_TOKEN
    
  3. If you plan to connect Claude Desktop or other clients, configure the MCP server entry with your token and desired categories as shown in the examples. Alternatively, store the token in an environment file as LINODE_API_TOKEN and pass it through the --token option.
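     As an illustration, a Claude Desktop entry might look like the following. This is a minimal sketch: the server name ("linode") and the category list are placeholders you can change, and YOUR_LINODE_API_TOKEN stands in for your real token.

     ```json
     {
       "mcpServers": {
         "linode": {
           "command": "npx",
           "args": [
             "-y",
             "@takashito/linode-mcp-server",
             "--token", "YOUR_LINODE_API_TOKEN",
             "--categories", "instances,volumes,regions"
           ]
         }
       }
     }
     ```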

  4. Optional: list available categories to tailor tooling:

    npx @takashito/linode-mcp-server --list-categories
    

Additional notes

Tips:

  • Use --categories to limit tools to only what you need (e.g., instances, volumes, regions); this reduces context window usage and helps avoid context overflow in AI clients with smaller context windows.
  • You can run the server with different transports (stdio, sse, http) and adjust ports/endpoints as needed.
  • If you want to forward a separate API token to a backend client, you can configure an mcp-remote endpoint with an Authorization header and provide the token via environment variables (see examples in the README).
  • Ensure your Linode API token has the necessary permissions for the operations you intend to perform.
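For the mcp-remote tip above, a hedged configuration sketch follows. It assumes the server is running with an HTTP transport reachable at http://localhost:3000/mcp (the URL, port, and server name "linode-remote" are illustrative assumptions, not values from this package's README), and that the token is supplied via an environment variable rather than hard-coded in the args:

```json
{
  "mcpServers": {
    "linode-remote": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "http://localhost:3000/mcp",
        "--header", "Authorization: Bearer ${LINODE_API_TOKEN}"
      ],
      "env": {
        "LINODE_API_TOKEN": "YOUR_LINODE_API_TOKEN"
      }
    }
  }
}
```

Keeping the token in the env block (or an environment file) means the config's args stay free of secrets, and the same entry works across machines where the variable is set differently.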
