MCP-Airflow-API
🔍 Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.
```shell
claude mcp add --transport stdio call518-mcp-airflow-api uvx --python 3.12 mcp-airflow-api \
  --env AIRFLOW_API_VERSION="v2" \
  --env AIRFLOW_API_BASE_URL="http://localhost:8080/api" \
  --env AIRFLOW_API_PASSWORD="airflow" \
  --env AIRFLOW_API_USERNAME="airflow"
```
How to use
MCP-Airflow-API is an MCP server that translates natural language requests into Apache Airflow REST API calls. It dynamically supports Airflow API versions v1 and v2, loading the appropriate toolset based on the AIRFLOW_API_VERSION environment variable. With this server, you can ask questions like 'Show me the currently running DAGs' or 'List failed tasks in the last hour' and have them executed against your Airflow cluster without manually crafting REST requests. The server exposes an intuitive interface via MCP tooling and can be run in stdio mode or integrated with clients that understand MCP configurations.
To use it, start the MCP server with a configuration like the mcp_config example shown in the installation section. Provide your Airflow base URL, and credentials if your Airflow instance requires authentication. The server supports both Airflow 2.x (v1) and Airflow 3.x (v2) toolsets. Once running, point your MCP client at the server and begin issuing natural language commands such as "Show me the DAGs with active runs", "Trigger DAG my_dag", or "What are the 5 most recent task failures?". If you're using the OpenWebUI flow, you can combine MCP-Airflow-API with a UI to issue prompts and view results in a familiar interface.
The Quickstart guidance in the README shows a typical deployment with Docker Compose for a complete demo environment, including starting the services and accessing the API docs. The server also supports multiple Airflow clusters: define separate MCP server entries, each with its own AIRFLOW_API_VERSION and base URL.
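As an illustration of what happens under the hood, a request like "Show me the currently running DAGs" maps to a single call against Airflow's stable REST API. The sketch below is a hypothetical helper (not part of the mcp-airflow-api package) that builds such a URL from the same base URL and version values used in the MCP configuration:

```python
# Hypothetical illustration: the REST call behind "Show me the currently
# running DAGs". The helper name is ours, not part of mcp-airflow-api.
def build_running_dag_runs_url(base_url: str, api_version: str = "v2") -> str:
    # Airflow's stable REST API accepts "~" as a wildcard dag_id,
    # so a single GET lists runs across all DAGs, filtered by state.
    return f"{base_url.rstrip('/')}/{api_version}/dags/~/dagRuns?state=running"

print(build_running_dag_runs_url("http://localhost:8080/api"))
# → http://localhost:8080/api/v2/dags/~/dagRuns?state=running
```

The MCP server's value is that it performs this translation (and handles pagination, authentication, and response formatting) so you never have to construct such URLs by hand.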
How to install
Prerequisites:
- Python 3.12 (recommended for development/usage)
- Access to a running Airflow cluster (Airflow REST API enabled)
- Optional: uvx (the uv tool runner) or pipx/pip tooling as described below
Method 1 — Direct installation from PyPI (recommended for quick start):
```shell
uvx --python 3.12 mcp-airflow-api
```
Method 2 — Claude-Desktop MCP Client Integration (as shown in README):
```json
{
  "mcpServers": {
    "mcp-airflow-api": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://localhost:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}
```
Method 3 — Development installation (from source):
```shell
git clone https://github.com/call518/MCP-Airflow-API.git
cd MCP-Airflow-API
pip install -e .

# Run in stdio mode
python -m mcp_airflow_api
```
Notes:
- If you are using a Docker-based workflow, you can also rely on the Docker Compose setup described in the README for a complete demo environment.
- Ensure the Airflow REST API is reachable at AIRFLOW_API_BASE_URL and credentials are valid if your Airflow instance requires authentication.
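Before wiring up an MCP client, it can help to confirm the endpoint and credentials directly. The snippet below is a minimal sketch, assuming HTTP basic auth and the `/version` endpoint of Airflow's stable REST API; adapt it if your deployment uses token-based authentication:

```python
import base64
import urllib.request

def build_version_request(base_url: str, user: str, password: str,
                          api_version: str = "v2") -> urllib.request.Request:
    """Build an authenticated GET for Airflow's /version endpoint."""
    req = urllib.request.Request(f"{base_url.rstrip('/')}/{api_version}/version")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

req = build_version_request("http://localhost:8080/api", "airflow", "airflow")
# Send with urllib.request.urlopen(req) once the cluster is reachable;
# a JSON body with the Airflow version confirms the base URL and credentials.
```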
Additional notes
Tips and common considerations:
- AIRFLOW_API_VERSION controls which toolset is loaded (v1 for Airflow 2.x, v2 for Airflow 3.x). Use the environment variable to switch versions without changing MCP tooling.
- AIRFLOW_API_BASE_URL should point to the Airflow REST API base path (e.g., http://localhost:8080/api).
- When using authentication, you may need to adjust AIRFLOW_API_USERNAME and AIRFLOW_API_PASSWORD or use token-based methods if your Airflow setup requires it.
- For a multi-cluster setup, duplicate the MCP server entry under a different name, with its own AIRFLOW_API_VERSION and AIRFLOW_API_BASE_URL values, to connect to different Airflow instances.
- If you encounter issues with network access, verify that the Airflow REST API is accessible from the host running the MCP server and that any proxies or firewalls allow the traffic.
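The multi-cluster tip above can look like the following in a Claude-Desktop style configuration. The server names and hostnames (`airflow-prod.example.com`, `airflow-legacy.example.com`) are placeholders; substitute your own clusters:

```json
{
  "mcpServers": {
    "airflow-prod": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v2",
        "AIRFLOW_API_BASE_URL": "http://airflow-prod.example.com:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    },
    "airflow-legacy": {
      "command": "uvx",
      "args": ["--python", "3.12", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_VERSION": "v1",
        "AIRFLOW_API_BASE_URL": "http://airflow-legacy.example.com:8080/api",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow"
      }
    }
  }
}
```

Each entry loads its own toolset (v1 for the Airflow 2.x cluster, v2 for the Airflow 3.x cluster), so a single MCP client can address both.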