gx-mcp-server
A Python server that exposes Great Expectations data quality tools to LLM agents via the Model Context Protocol (MCP).
claude mcp add --transport stdio davidf9999-gx-mcp-server uv run python -m gx_mcp_server
How to use
The Great Expectations MCP Server exposes core Great Expectations data-quality capabilities as MCP tools that can be consumed by LLM agents. It allows you to load datasets from files, URLs, or inline data, define and modify ExpectationSuites, and run validations to obtain detailed results. The server supports multiple transport modes: STDIO for AI clients, HTTP for web clients and MCP tooling, and an Inspector GUI for interactive exploration. You can enable optional authentication (Basic or Bearer) and configure rate limiting and allowed origins to secure access. A typical workflow is to load a dataset, create or update an ExpectationSuite, run a validation, and export structured results for downstream decision making.
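As a rough sketch of what an agent sends for the workflow above, the JSON-RPC `tools/call` envelope below follows the standard MCP shape. Note that the tool names and argument keys (`load_dataset`, `run_validation`, `source`, `suite_name`) are illustrative assumptions, not this server's documented schema; consult the README for the actual tool names.

```python
import json

def tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical workflow: load a dataset, then validate it against a suite.
# Tool and argument names here are assumptions for illustration only.
load_req = tool_call(1, "load_dataset", {"source": "data/orders.csv"})
validate_req = tool_call(2, "run_validation", {"suite_name": "orders_suite"})

print(json.dumps(load_req, indent=2))
```

Over STDIO these messages are written to the server's stdin; over HTTP they are POSTed to the `/mcp/` endpoint.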
How to install
Prerequisites:
- Python 3.8+ with venv support
- Optional: Docker if you prefer containerized usage
- Optional: Node tooling or the Claude CLI if you plan to integrate MCP clients (the server itself is Python-based)
Installation steps (Python):
- Create and activate a virtual environment:
  python -m venv .venv
  # Windows
  .venv\Scripts\activate
  # macOS/Linux
  source .venv/bin/activate
- Install the package from PyPI:
  pip install gx-mcp-server
- Run the server in STDIO or HTTP mode as needed (examples below)
STDIO (for AI clients via uv/CLI):
uv run python -m gx_mcp_server
HTTP (for web clients or MCP tooling):
uv run python -m gx_mcp_server --http
If you prefer Docker:
docker pull davidf9999/gx-mcp-server:latest
docker run -d -p 8000:8000 --name gx-mcp-server davidf9999/gx-mcp-server:latest
Configuration and usage details are described in the README under Installation & Usage.
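One of the loading options mentioned above is inline CSV data. The snippet below sketches how a client might assemble a small inline CSV payload before handing it to the server's dataset-loading tool; the argument names (`inline_data`, `format`) are assumptions for illustration, not the documented schema.

```python
import csv
import io

# Build a small CSV document in memory to pass as inline data.
rows = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": 5.00},
]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["order_id", "amount"])
writer.writeheader()
writer.writerows(rows)
inline_csv = buf.getvalue()

# Hypothetical tool arguments -- check the README for the real key names.
arguments = {"inline_data": inline_csv, "format": "csv"}
print(inline_csv.splitlines()[0])
```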
Additional notes
Tips and common considerations:
- Authentication is off by default. For production, consider enabling Basic or Bearer authentication and manage credentials via environment variables (e.g., MCP_SERVER_USER, MCP_SERVER_PASSWORD).
- When using HTTP mode, you can enable rate limiting, allowed origins, and optional tracing via OpenTelemetry exporters.
- The server supports loading data from files, URLs, or inline CSV/TSV formats, and can connect to Snowflake or BigQuery data sources via URIs.
- If you run in Docker, you may want to mount data volumes and set environment variables for credentials.
- For MCP client configuration, you can connect via STDIO (uv run ...), HTTP (http://host:port/mcp/), or Inspector GUI depending on your workflow.
- Troubleshooting health and connectivity typically involves checking the server port, firewall rules, and ensuring the server process is running with the expected mode (stdio vs http).
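When Basic authentication is enabled, an HTTP client needs an `Authorization` header built from the credentials supplied via the environment variables mentioned above. This is standard HTTP Basic auth (RFC 7617), not server-specific code; the default values here are placeholders for the sketch.

```python
import base64
import os

# Read credentials from the environment variables the server uses;
# the fallbacks are placeholder values for this sketch only.
user = os.environ.get("MCP_SERVER_USER", "admin")
password = os.environ.get("MCP_SERVER_PASSWORD", "change-me")

# Basic auth: base64-encode "user:password" per RFC 7617.
token = base64.b64encode(f"{user}:{password}".encode()).decode()
headers = {"Authorization": f"Basic {token}"}
print(headers["Authorization"].split()[0])
```

Attach `headers` to each request sent to the HTTP endpoint (e.g. http://host:port/mcp/).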
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP