mcp-dagster
MCP server from kyryl-opens-ml/mcp-server-dagster
claude mcp add --transport stdio \
  --env TRANSPORT="SSE" \
  --env GRAPHQL_ENDPOINT="http://localhost:3000/graphql" \
  kyryl-opens-ml-mcp-server-dagster -- uvx mcp-server-dagster
How to use
This MCP server provides a Dagster-focused interface for AI agents to discover and interact with Dagster repositories, jobs, assets, and runs. It exposes tools such as list_repositories, list_jobs, list_assets, recent_runs, get_run_info, launch_run, materialize_asset, terminate_run, and get_asset_info. Use these tools to explore your Dagster environment, query pipeline assets, monitor run status, and trigger or materialize assets from an LLM-driven workflow. The server connects to a Dagster GraphQL endpoint and can use Server-Sent Events (SSE) as its transport for streaming updates, enabling real-time feedback during long-running operations.
To use the tools, point your MCP-enabled agent at the Dagster MCP server and issue tool commands like: list_repositories to enumerate available repositories, list_jobs with a repository argument to inspect pipelines, and recent_runs to fetch recent activity. For operational tasks, use launch_run to start a pipeline, then monitor progress via get_run_info or recent_runs. Asset-centric actions like list_assets and materialize_asset let you inspect and materialize Dagster assets, while terminate_run provides a way to stop in-flight executions. The integration is designed for developers and data engineers looking to embed Dagster orchestration capabilities into LLM-assisted workflows.
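Under the hood, tools like list_repositories translate into queries against Dagster's GraphQL endpoint. As a rough sketch of the kind of request issued on your behalf (the field names below are assumptions based on Dagster's public GraphQL schema, not something this server documents; check your instance's schema in the GraphQL playground):

```python
import json
import urllib.request

# Assumed GraphQL query shape for enumerating repositories; verify the
# field names against your Dagster instance's schema before relying on it.
LIST_REPOSITORIES_QUERY = """
query ListRepositories {
  repositoriesOrError {
    ... on RepositoryConnection {
      nodes { name location { name } }
    }
  }
}
"""

def graphql_payload(query, variables=None):
    """Encode a GraphQL query as the JSON body a GraphQL endpoint expects."""
    return json.dumps({"query": query, "variables": variables or {}}).encode("utf-8")

def post_graphql(endpoint, query, variables=None):
    """POST the query to a Dagster GraphQL endpoint (requires a running instance)."""
    req = urllib.request.Request(
        endpoint,
        data=graphql_payload(query, variables),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Against a local `dagster dev` instance, `post_graphql("http://localhost:3000/graphql", LIST_REPOSITORIES_QUERY)` would return the repository listing that list_repositories surfaces to the agent.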
How to install
Prerequisites:
- Python 3.8+ and a working Python environment
- Access to a Dagster instance (GraphQL endpoint) or a local Dagster dev server
Installation (recommended):
- Install the MCP Dagster server package from PyPI (example package name):
  pipx install mcp-server-dagster
  or, if you prefer pip in a virtual environment:
  python -m pip install mcp-server-dagster
- Verify the installation:
  python -m pip show mcp-server-dagster
- Run the MCP server (using the uvx portable runner):
  uvx mcp-server-dagster
- If you are not using uvx, run via a direct Python entry point or module, depending on distribution specifics:
  python -m mcp_server_dagster
- Once running, ensure the Dagster GraphQL endpoint is reachable (default: http://localhost:3000/graphql) and that the transport is configured to SSE as described in the configuration.
Note: If you prefer a Docker-based workflow, consult the package documentation for a containerized run option, and adapt the commands accordingly.
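To confirm the last step above, a small reachability check can be run against the GraphQL endpoint. This is a sketch, not part of the package: it assumes Dagster's schema serves a lightweight `query { version }` query, which you should verify against your instance.

```python
import json
import urllib.error
import urllib.request

def check_endpoint(endpoint="http://localhost:3000/graphql", timeout=5.0):
    """Return Dagster's reported version string, or None if unreachable.

    Assumes the endpoint answers a plain `query { version }` GraphQL query.
    """
    body = json.dumps({"query": "query { version }"}).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=body, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.load(resp).get("data", {}).get("version")
    except OSError:  # covers URLError, connection refused, timeouts
        return None
```

If `check_endpoint()` returns None, fix GRAPHQL_ENDPOINT (or start Dagster) before pointing the MCP server at it.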
Additional notes
- The MCP server assumes Dagster is accessible via the GraphQL endpoint at http://localhost:3000/graphql. Update GRAPHQL_ENDPOINT in the environment if your Dagster instance is elsewhere.
- Transport is SSE by default for streaming updates; ensure your client supports SSE.
- If you run into authentication issues with Dagster, provide the necessary credentials via environment variables or proper Dagster config.
- The available tools map to common Dagster operations; consult Dagster docs for specifics on repository names, job names, and asset identifiers to use with list_repositories, list_jobs, list_assets, and asset-related actions.
- For production deployments, consider configuring proper security, rate limits, and access controls around the MCP endpoints.
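The configuration notes above can be summarized as a small resolution helper. This is an illustrative sketch, not the server's actual code; the variable names TRANSPORT and GRAPHQL_ENDPOINT and their defaults are taken from the install command on this page.

```python
import os

def resolve_config(environ=None):
    """Read transport and endpoint settings, falling back to the documented defaults."""
    environ = os.environ if environ is None else environ
    return {
        "transport": environ.get("TRANSPORT", "SSE"),
        "graphql_endpoint": environ.get(
            "GRAPHQL_ENDPOINT", "http://localhost:3000/graphql"
        ),
    }
```

Setting GRAPHQL_ENDPOINT in the environment (for example, when Dagster runs on another host) overrides the local default without any code changes.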
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP