dataproduct
A Model Context Protocol (MCP) server for discovering data products and requesting access in Data Mesh Manager, and executing queries on the data platform to access business data.
claude mcp add --transport stdio entropy-data-dataproduct-mcp uvx dataproduct_mcp \
  --env DATAMESH_MANAGER_API_KEY="dmm_live_user_..." \
  --env DATAMESH_MANAGER_HOST="https://api.datamesh-manager.com" \
  --env QUERY_ACCESS_EVALUATION_ENABLED="true" \
  --env SNOWFLAKE_USER="" \
  --env SNOWFLAKE_PASSWORD="" \
  --env SNOWFLAKE_ROLE="" \
  --env SNOWFLAKE_WAREHOUSE="COMPUTE_WH" \
  --env DATABRICKS_HOST="adb-xxx.azuredatabricks.net" \
  --env DATABRICKS_HTTP_PATH="/sql/1.0/warehouses/xxx" \
  --env DATABRICKS_CLIENT_ID="" \
  --env DATABRICKS_CLIENT_SECRET="" \
  --env BIGQUERY_CREDENTIALS_PATH="/path/to/service-account-key.json"
How to use
The Data Product MCP server enables AI agents to discover data products in Data Mesh Manager, understand their data contracts, and request access to data product output ports. It can also execute SQL queries against data platforms for which access has been granted, returning results that the agent can use to answer business questions. The server exposes tools such as dataproduct_search, dataproduct_get, dataproduct_request_access, and dataproduct_query. Use dataproduct_search to locate relevant data products, dataproduct_get to fetch details and contracts, dataproduct_request_access to initiate access approvals, and dataproduct_query to run SQL against authorized data products. These tools are orchestrated through the MCP client configured via the mcpServers section, with uvx launching the Python-based server package.
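A typical agent session chains these four tools in order. The sketch below builds the MCP tools/call request envelopes an MCP client would send; the envelope shape follows the MCP JSON-RPC protocol, but the argument names (search_term, data_product_id, output_port_id, query) are illustrative assumptions, not confirmed by this server's tool schemas:

```python
# Sketch of the JSON-RPC 2.0 messages an MCP client sends to chain the
# dataproduct tools. Argument names are illustrative assumptions.

def tool_call(request_id, name, arguments):
    """Build an MCP tools/call request envelope."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# 1. Find candidate data products for a business question.
search = tool_call(1, "dataproduct_search", {"search_term": "monthly revenue"})

# 2. Fetch details and the data contract for a hit.
get = tool_call(2, "dataproduct_get", {"data_product_id": "dp-revenue"})

# 3. Request access to an output port (approval happens in Data Mesh Manager).
request = tool_call(3, "dataproduct_request_access",
                    {"data_product_id": "dp-revenue", "output_port_id": "snowflake"})

# 4. Once access is granted, run SQL against the output port.
query = tool_call(4, "dataproduct_query",
                  {"data_product_id": "dp-revenue", "output_port_id": "snowflake",
                   "query": "SELECT SUM(amount) FROM revenue"})
```

The client only constructs these payloads; the server resolves the output port to the underlying platform (Snowflake, Databricks, or BigQuery) using the credentials supplied via environment variables.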
How to install
Prerequisites:
- Ensure you have uv installed. See the uv documentation for installation details.
- Confirm you have a configured MCP client that can load the MCP server configuration (e.g., Claude Desktop or your preferred MCP client).
Installation steps:
- Install uv (the Python package runner used to launch the server via uvx) as documented by the project.
- Add the MCP server configuration to your MCP client configuration file. Example for Claude Desktop or similar clients:
{
  "mcpServers": {
    "dataproduct": {
      "command": "uvx",
      "args": ["dataproduct_mcp"],
      "env": {
        "DATAMESH_MANAGER_API_KEY": "dmm_live_user_...",
        "DATAMESH_MANAGER_HOST": "https://api.datamesh-manager.com",
        "QUERY_ACCESS_EVALUATION_ENABLED": "true",
        "SNOWFLAKE_USER": "",
        "SNOWFLAKE_PASSWORD": "",
        "SNOWFLAKE_ROLE": "",
        "SNOWFLAKE_WAREHOUSE": "COMPUTE_WH",
        "DATABRICKS_HOST": "adb-xxx.azuredatabricks.net",
        "DATABRICKS_HTTP_PATH": "/sql/1.0/warehouses/xxx",
        "DATABRICKS_CLIENT_ID": "",
        "DATABRICKS_CLIENT_SECRET": "",
        "BIGQUERY_CREDENTIALS_PATH": "/path/to/service-account-key.json"
      }
    }
  }
}
- Save the file and reload your MCP client. The server should now be available under the name "dataproduct", and its tools can be invoked through the MCP client.
Note: The exact file paths and environment values may differ based on your deployment and Data Mesh Manager setup. Always replace placeholders with your real credentials and endpoints.
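Before wiring the server into a client, it can help to check that the expected environment variables are set. A minimal sketch, assuming that only the Data Mesh Manager variables are strictly required and that each platform's credential group is needed only if you query that platform (the grouping below mirrors the example configuration; it is not taken from the server's own validation logic):

```python
import os

# Variable names come from the example configuration above. Treating the
# platform groups as optional is an assumption: each group is only needed
# for the platform you actually query.
REQUIRED = ["DATAMESH_MANAGER_API_KEY", "DATAMESH_MANAGER_HOST"]
PLATFORM_GROUPS = {
    "snowflake": ["SNOWFLAKE_USER", "SNOWFLAKE_PASSWORD",
                  "SNOWFLAKE_ROLE", "SNOWFLAKE_WAREHOUSE"],
    "databricks": ["DATABRICKS_HOST", "DATABRICKS_HTTP_PATH",
                   "DATABRICKS_CLIENT_ID", "DATABRICKS_CLIENT_SECRET"],
    "bigquery": ["BIGQUERY_CREDENTIALS_PATH"],
}

def check_env(env):
    """Return (missing required vars, platforms with a complete credential group)."""
    missing = [v for v in REQUIRED if not env.get(v)]
    ready = [name for name, keys in PLATFORM_GROUPS.items()
             if all(env.get(k) for k in keys)]
    return missing, ready

missing, ready = check_env(os.environ)
```

Running check_env(os.environ) before starting the client makes a misconfigured credential group visible immediately, rather than surfacing later as a failed dataproduct_query call.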
Additional notes
Tips and considerations:
- Always protect the DATAMESH_MANAGER_API_KEY and data platform credentials. Use environment-specific keys in production.
- If QUERY_ACCESS_EVALUATION_ENABLED is set to false, AI-based query access evaluation will be disabled; ensure governance aligns with your security requirements.
- Ensure Snowflake and Databricks credentials have the minimum required privileges: USAGE on the warehouse/database/schema and appropriate roles for the AI agent when executing queries.
- When using dataproduct_query, you must have active access to the specified output port. Access requests can be managed via dataproduct_request_access.
- Review Data Mesh Manager data contracts and governance rules to understand what data you can access and under what terms of use.
- For self-hosted Data Mesh Manager instances, set DATAMESH_MANAGER_HOST to the appropriate base URL of your deployment.
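The access-before-query rule above can be sketched as a guard in an agent's orchestration logic. The helper names and the in-memory grant store below are hypothetical, for illustration only; in practice the grant state lives in Data Mesh Manager and approval may require a human reviewer:

```python
# Hypothetical guard illustrating the rule: dataproduct_query only succeeds
# for output ports with an active, granted access request.
granted = set()  # (data_product_id, output_port_id) pairs with active access

def request_access(data_product_id, output_port_id):
    # In reality this calls dataproduct_request_access and waits for approval
    # in Data Mesh Manager; here the grant is recorded immediately.
    granted.add((data_product_id, output_port_id))

def query(data_product_id, output_port_id, sql):
    if (data_product_id, output_port_id) not in granted:
        raise PermissionError(
            f"No active access to {data_product_id}/{output_port_id}; "
            "call dataproduct_request_access first.")
    return f"executing on {output_port_id}: {sql}"  # placeholder for real execution
```

An agent that respects this ordering degrades gracefully: a denied or pending request surfaces as an explicit error it can report, instead of an opaque platform authentication failure.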