astro-airflow
MCP server for Apache Airflow instances. Runs standalone or as an Airflow plugin.
claude mcp add --transport stdio astronomer-astro-airflow-mcp \
  --env AIRFLOW_API_URL="Airflow webserver URL (e.g., https://your-airflow.example.com)" \
  --env AIRFLOW_USERNAME="Username for Airflow (or OAuth2 username if using token exchange)" \
  --env AIRFLOW_PASSWORD="Password for Airflow (if not using token-based auth)" \
  --env AIRFLOW_AUTH_TOKEN="Bearer token for authentication (alternative to username/password)" \
  -- uvx astro-airflow-mcp --transport stdio
How to use
The astro-airflow MCP server exposes a suite of tools that let AI assistants query and manage Airflow resources through a unified MCP interface. It supports core operations such as listing DAGs, retrieving DAG details and source code, triggering DAG runs, and inspecting task instances, as well as management of pools, variables, connections, assets, plugins, providers, and Airflow configuration. In addition, consolidated agent tools such as explore_dag, diagnose_dag_run, and get_system_health provide deeper insights and health checks across DAGs and the Airflow environment. You can access these capabilities from standard MCP clients (Claude Code, Gemini CLI, Codex CLI, Cursor, or a manual JSON configuration), or by connecting to the HTTP MCP endpoint if you enable HTTP mode.
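For clients configured by hand (the manual JSON path mentioned above), a minimal sketch of a server entry is shown below. The file name mcp-config.json and the top-level mcpServers key are assumptions based on the schema used by several common clients; check your client's documentation for the actual file location and shape.

```shell
# Sketch only: the file name and "mcpServers" key are assumptions;
# adjust for your client's actual configuration file.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "astro-airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"],
      "env": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
EOF
```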
How to install
Prerequisites:
- Python 3.10+ available in your PATH
- Network access to PyPI (uvx fetches and runs the package on demand, so no local installation is required); alternatively, pipx if you prefer a persistent local install
Install/Run:
- Ensure Python and pip are installed and available in your PATH.
- Run the MCP server via uvx directly from PyPI (no installation required): uvx astro-airflow-mcp --transport stdio
- Optional, for environments that prefer a local installation: install via pipx (pipx install astro-airflow-mcp), then run it with pipx run astro-airflow-mcp --transport stdio
Configuration tips:
- If you are connecting to a remote Airflow instance, provide the Airflow URL and credentials via environment variables or CLI options (see mcp_config env section).
- For manual JSON/Cursor configurations, use the same command and transport flags as shown above.
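Putting the configuration tips together, a typical stdio launch against a remote Airflow instance looks like the following; the URL and credentials are placeholders:

```shell
# Username/password auth; replace the placeholders with real values.
export AIRFLOW_API_URL="https://your-airflow.example.com"
export AIRFLOW_USERNAME="your-username"
export AIRFLOW_PASSWORD="your-password"

# Runs straight from PyPI; nothing is installed locally.
uvx astro-airflow-mcp --transport stdio
```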
Additional notes
- The server defaults to HTTP mode when not overridden; explicitly use --transport stdio for MCP tooling workflows.
- When using authentication, you can choose between username/password (with OAuth2 token exchange for Airflow 3.x) or a Bearer token. Set AIRFLOW_API_URL and one of the credential methods accordingly.
- The MCP provides a broad set of tools and resources. If you’re integrating with specific clients (Claude, Gemini, Codex, or manual JSON), you can copy the shown command structures into your client configuration.
- For Airflow 2.x vs 3.x compatibility, the MCP includes adapters to detect versions and adjust responses automatically. Ensure your Airflow instance URL is reachable from the MCP host.
- If you encounter issues with connectivity, verify network access, firewall rules, and that the Airflow webserver is reachable at the URL you provide.
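If you prefer the Bearer-token method described in the notes above, set a token instead of a username and password (the token value here is a placeholder):

```shell
# Token-based auth: AIRFLOW_API_URL plus a Bearer token,
# instead of AIRFLOW_USERNAME/AIRFLOW_PASSWORD.
export AIRFLOW_API_URL="https://your-airflow.example.com"
export AIRFLOW_AUTH_TOKEN="your-bearer-token"

uvx astro-airflow-mcp --transport stdio
```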
Related MCP Servers
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
nerve
The Simple Agent Development Kit.
agents
AI agent tooling for data engineering workflows.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps