
databricks

MCP server from revodatanl/databricks-mcp-server

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio revodatanl-databricks-mcp-server -- docker run -i --rm -e DATABRICKS_HOST -e DATABRICKS_TOKEN ghcr.io/revodatanl/databricks-mcp-server:latest

Make sure DATABRICKS_HOST and DATABRICKS_TOKEN are exported in your shell; the -e flags pass them through to the container.

How to use

This MCP server provides read-only access to your Databricks workspace via Unity Catalog metadata and Databricks Jobs. Built on the Databricks SDK and powered by FastMCP, it lets LLMs and tools such as Claude and Continue.dev query catalogs, schemas, tables, and job configurations and results without executing any data-altering operations. The server exposes tools for Unity Catalog (listing catalogs, schemas, and tables, and retrieving detailed table metadata) and for Jobs (listing jobs, inspecting job configurations, and fetching recent runs). Use it as a read-only data surface to inform queries, data discovery, and automation prompts.

How to install

Prerequisites:

  • Docker installed and running
  • Access to a Databricks workspace with host URL and access token

Step 1: Obtain credentials

  • Set DATABRICKS_HOST to your Databricks workspace URL (e.g., https://<region>.cloud.databricks.com)
  • Set DATABRICKS_TOKEN to a valid Databricks access token
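Before starting the server, it can help to sanity-check both variables. The following is a minimal sketch; check_databricks_env is a hypothetical helper (not part of the server) that validates only the shape of the values, not actual workspace access:

```shell
# Hypothetical helper: verify the two credentials look plausible
# before launching the server. Shape-check only, no network calls.
check_databricks_env() {
  case "$DATABRICKS_HOST" in
    https://*) ;;  # workspace URLs are always https
    *) echo "DATABRICKS_HOST must start with https://" >&2; return 1 ;;
  esac
  if [ -z "$DATABRICKS_TOKEN" ]; then
    echo "DATABRICKS_TOKEN is not set" >&2
    return 1
  fi
  echo "env looks ok"
}
```

Running the function with both variables set prints "env looks ok"; otherwise it prints a hint to stderr and returns non-zero.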

Step 2: Run the MCP server via Docker

  • Ensure environment variables are available in your shell or CI environment
  • Start the MCP server using the provided Docker command in the mcp_config (server named 'databricks')

Example commands (copy/paste):

  • In your shell, export the environment variables:

    export DATABRICKS_HOST=https://<your-workspace>.cloud.databricks.com
    export DATABRICKS_TOKEN=<your-access-token>

  • Then run the server (as defined by the MCP config): docker run -i --rm -e DATABRICKS_HOST -e DATABRICKS_TOKEN ghcr.io/revodatanl/databricks-mcp-server:latest
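The two steps above can be wrapped in a small launch script. This is a hypothetical convenience wrapper, not something shipped with the repository; start_databricks_mcp and the DRY_RUN flag are assumptions for illustration:

```shell
# Hypothetical wrapper: refuse to start when credentials are missing,
# then run the same docker command from the MCP config.
# Set DRY_RUN=1 to print the command instead of executing it.
start_databricks_mcp() {
  : "${DATABRICKS_HOST:?set DATABRICKS_HOST first}"
  : "${DATABRICKS_TOKEN:?set DATABRICKS_TOKEN first}"
  set -- docker run -i --rm \
    -e DATABRICKS_HOST -e DATABRICKS_TOKEN \
    ghcr.io/revodatanl/databricks-mcp-server:latest
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$@"
  else
    exec "$@"
  fi
}
```

With DRY_RUN=1 the function only echoes the docker invocation, which is handy for checking the command an MCP client will run.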

Step 3: Verify connectivity

  • Ensure the MCP client (Cursor, Continue.dev, or your integration) references the server in its mcp.json (or equivalent MCP configuration), and confirm that the environment variables resolve correctly.
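As a reference point, a minimal mcp.json entry for this server could look like the sketch below. It assumes the common mcpServers layout and the ${env:VAR} substitution supported by clients such as Cursor; adjust to your client's configuration format:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "DATABRICKS_HOST",
        "-e", "DATABRICKS_TOKEN",
        "ghcr.io/revodatanl/databricks-mcp-server:latest"
      ],
      "env": {
        "DATABRICKS_HOST": "${env:DATABRICKS_HOST}",
        "DATABRICKS_TOKEN": "${env:DATABRICKS_TOKEN}"
      }
    }
  }
}
```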

Step 4: Optional local development (if you want to run outside Docker)

  • The repository targets a Docker-based deployment. To run outside Docker, follow the same environment-variable pattern with your preferred container runner, or adapt to a Python-based uvx setup if one is available in your environment.

Additional notes

  • The MCP server operates in read-only mode: it retrieves Unity Catalog metadata and job configurations/runs, but does not modify any resources.
  • If you use Cursor, you can reference environment variables via ${env:VAR} in the mcp.json configuration or hardcode values directly.
  • Ensure your Databricks host URL and access token have the necessary permissions to read metadata.
  • When using Continue.dev, you may embed credentials securely or leverage their secrets management to populate DATABRICKS_HOST and DATABRICKS_TOKEN.
  • If you encounter connectivity issues, verify network access to your Databricks workspace and confirm that the Docker image ghcr.io/revodatanl/databricks-mcp-server:latest is reachable.
