
agents

AI agent tooling for data engineering workflows.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio astronomer-agents \
  --env AIRFLOW_API_URL="https://your-airflow-api.example.com" \
  --env AIRFLOW_USERNAME="your-username" \
  --env AIRFLOW_PASSWORD="your-password" \
  -- uvx astro-airflow-mcp --transport stdio

How to use

This MCP server exposes Astronomer's Airflow integration as an MCP-compatible service. It runs the astro-airflow-mcp component via uvx, letting you manage Airflow REST API interactions, DAG management, run triggering, and log viewing through your MCP-enabled client (e.g., Claude Desktop, Cursor, or other MCP clients).

The included CLI tool (af) is part of the Airflow MCP package and can be used to interact with Airflow from the terminal when running locally or within an MCP-enabled environment. To begin, connect to the Airflow MCP server with your MCP client, then use the available Airflow-oriented prompts, actions, and skills to list DAGs, trigger runs, monitor health, or fetch task logs.

The server works with either self-hosted Airflow installations (via REST API) or Airflow deployments exposed through a public API, provided you configure AIRFLOW_API_URL and credentials accordingly. Cursor users can install the MCP server configuration and additional hooks to get skill suggestions or session management during Airflow tasks.
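
Under the hood, actions like "list DAGs" map to Airflow REST API calls. As a rough illustration only (not the server's actual code), here is a minimal Python sketch that builds the request a list-DAGs action corresponds to, assuming an Airflow 2.x deployment with the stable REST API (`/api/v1/dags`) and Basic Auth; the URL and credentials are placeholders:

```python
import base64
import urllib.request

def list_dags_request(base_url: str, username: str, password: str) -> urllib.request.Request:
    """Build an authenticated GET request for the Airflow 2.x stable REST API."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/v1/dags",
        headers={"Authorization": f"Basic {token}", "Accept": "application/json"},
    )

req = list_dags_request("https://your-airflow-api.example.com",
                        "your-username", "your-password")
print(req.full_url)  # https://your-airflow-api.example.com/api/v1/dags
# To actually call the API: urllib.request.urlopen(req), then json.load the response.
```

The same base URL and credentials are exactly what the AIRFLOW_API_URL, AIRFLOW_USERNAME, and AIRFLOW_PASSWORD variables supply to the MCP server.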

How to install

Prerequisites:

  • A Python environment with uvx available (or whichever tool you use to run MCP servers)
  • Access to an Airflow REST API (self-hosted or cloud) with valid credentials
  • Network access from your environment to the Airflow API
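
Before wiring up the MCP server, it can help to confirm the Airflow API is actually reachable from your environment. A minimal sketch, assuming an Airflow 2.x deployment that exposes the stable REST API's health endpoint (`/api/v1/health`; the URL below is a placeholder):

```python
import urllib.error
import urllib.request

def health_url(base_url: str) -> str:
    """Health endpoint of the Airflow 2.x stable REST API."""
    return base_url.rstrip("/") + "/api/v1/health"

def check_airflow(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the Airflow API answers the health check."""
    try:
        with urllib.request.urlopen(health_url(base_url), timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# check_airflow("https://your-airflow-api.example.com")
```

If this returns False, fix network access or the URL before troubleshooting the MCP layer.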

Step-by-step installation:

  1. Install and run the Airflow MCP server via uvx (or your preferred MCP runner): uvx astro-airflow-mcp --transport stdio

  2. Configure the required environment variables for Airflow access:

    • AIRFLOW_API_URL: URL to your Airflow REST API
    • AIRFLOW_USERNAME: API user name
    • AIRFLOW_PASSWORD: API password or token
  3. If integrating with an MCP client, prepare the mcp.json configuration (example available in mcp_config). Save the following to your MCP client configuration or equivalent:

     {
       "mcpServers": {
         "airflow": {
           "command": "uvx",
           "args": ["astro-airflow-mcp", "--transport", "stdio"],
           "env": {
             "AIRFLOW_API_URL": "https://your-airflow-api.example.com",
             "AIRFLOW_USERNAME": "your-username",
             "AIRFLOW_PASSWORD": "your-password"
           }
         }
       }
     }

  4. Start or connect your MCP client and point it to the configured server. For Cursor or other clients, use the provided deep links or install steps from the README to enable the MCP server connection.
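
If you manage several environments, you can also generate the mcp.json fragment from step 3 programmatically rather than pasting it. A small sketch (the values passed in are placeholders you must replace):

```python
import json

def airflow_mcp_config(api_url: str, username: str, password: str) -> dict:
    """Build the mcp.json fragment for the astro-airflow-mcp server."""
    return {
        "mcpServers": {
            "airflow": {
                "command": "uvx",
                "args": ["astro-airflow-mcp", "--transport", "stdio"],
                "env": {
                    "AIRFLOW_API_URL": api_url,
                    "AIRFLOW_USERNAME": username,
                    "AIRFLOW_PASSWORD": password,
                },
            }
        }
    }

config = airflow_mcp_config("https://your-airflow-api.example.com",
                            "your-username", "your-password")
print(json.dumps(config, indent=2))
```

Writing the output to your client's MCP configuration file is equivalent to pasting the JSON by hand.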

Additional notes

Tips and caveats:

  • Ensure AIRFLOW_API_URL points to a reachable Airflow REST API (Airflow 2.x/3.x compatible).
  • If your Airflow instance uses Basic Auth, provide AIRFLOW_USERNAME and AIRFLOW_PASSWORD; if it uses token-based auth instead, adjust the environment variables to match the credentials your Airflow API expects.
  • Some MCP clients may require a specific transport (stdio in this example); adjust the args if your client uses sockets or another transport.
  • If you upgrade Airflow or the astro-airflow-mcp package, verify compatibility with your MCP client version.
  • The Airflow MCP server integrates with the Airflow REST API for DAG management, triggering, and logs; ensure the API is enabled and accessible from your network.
  • For local testing, you can run uvx astro-airflow-mcp --transport stdio in a shell and connect your MCP client to the same process using stdio transport.
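
For reference, a "trigger run" action ultimately amounts to a POST against the Airflow REST API. A hedged Python sketch of that call, assuming the Airflow 2.x stable endpoint `POST /api/v1/dags/{dag_id}/dagRuns` with Basic Auth (URL, DAG id, and credentials are placeholders):

```python
import base64
import json
import urllib.request

def trigger_dag_request(base_url, dag_id, username, password, conf=None):
    """Build a POST request that triggers a DAG run via the Airflow 2.x stable REST API."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    body = json.dumps({"conf": conf or {}}).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/v1/dags/{dag_id}/dagRuns",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
    )

req = trigger_dag_request("https://your-airflow-api.example.com",
                          "example_dag", "your-username", "your-password")
print(req.get_method(), req.full_url)
# Sending it with urllib.request.urlopen(req) starts the run (HTTP 200 on success).
```

Seeing the raw request makes it easier to debug permission or connectivity errors that surface through the MCP client.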
