
sap-bdc

MCP server for SAP Business Data Cloud (BDC) integration - 7 tools for Delta Sharing, data product publishing, validation, and orchestration

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio mariodefelipe-sap-bdc-mcp-server python -m sap_bdc_mcp.server \
  --env LOG_LEVEL="INFO" \
  --env DATABRICKS_HOST="https://your-workspace.cloud.databricks.com" \
  --env DATABRICKS_TOKEN="your_databricks_token" \
  --env DATABRICKS_RECIPIENT_NAME="your_recipient_name"

How to use

This MCP server provides an integration layer with SAP Business Data Cloud (BDC), enabling AI assistants to interact with Delta Sharing-enabled data and manage data products. It exposes seven tools for common SAP BDC operations: creating or updating shares, managing CSN schemas, publishing data products, deleting shares, generating CSN templates from Databricks shares, plus two orchestration tools for end-to-end provisioning and pre-flight validation. Run the server with Python, then connect from a client (such as Claude Desktop) by referencing the sap-bdc MCP server entry in your configuration. Together, these tools expose a structured API for share management and Delta Sharing workflows, making it easier to automate data provisioning and governance tasks.

Key capabilities include:

  • create_or_update_share and create_or_update_share_csn for managing ORD/CSN configurations
  • publish_data_product to publish or unpublish data products tied to a share
  • delete_share to remove shares and their registrations
  • generate_csn_template to auto-create CSN templates from Databricks shares
  • provision_share for end-to-end provisioning (create share, grant access, register with SAP BDC)
  • validate_share_readiness (pre-flight validation) to ensure shares are ready before registration
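Under the hood, MCP clients invoke these tools by sending JSON-RPC 2.0 tools/call requests to the server over stdio. A minimal sketch of the request framing follows; the share_name argument name is an illustrative assumption, not a documented parameter of this server:

```python
import json

def tools_call(req_id, name, arguments):
    """Build an MCP JSON-RPC 2.0 tools/call request (sent to the server over stdio)."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical arguments -- consult the tool's schema for the real parameter names.
req = tools_call(1, "validate_share_readiness", {"share_name": "sales_share"})
print(json.dumps(req, indent=2))
```

MCP clients such as Claude Desktop generate these requests for you; the sketch is only meant to show what crosses the stdio boundary.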

To use the Python-based server, run it as a stdio service and configure your client (e.g., Claude Desktop or a custom app) to point at the sap-bdc entry with the appropriate environment variables set. If you’re using the Node.js wrapper, the same environment variables apply, but Python must still be installed, since the wrapper forwards calls to the Python server.
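For example, a Claude Desktop entry for this server might look like the following (the env values are placeholders; substitute your own host, token, and recipient name):

```json
{
  "mcpServers": {
    "sap-bdc": {
      "command": "python",
      "args": ["-m", "sap_bdc_mcp.server"],
      "env": {
        "LOG_LEVEL": "INFO",
        "DATABRICKS_HOST": "https://your-workspace.cloud.databricks.com",
        "DATABRICKS_TOKEN": "your_databricks_token",
        "DATABRICKS_RECIPIENT_NAME": "your_recipient_name"
      }
    }
  }
}
```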

How to install

Prerequisites:

  • Python 3.9+ (3.11+ recommended for local development)
  • Access to a Databricks environment
  • SAP Business Data Cloud account
  • Databricks Delta Sharing configured recipient
  • Optional: Databricks personal access token for local development

Installation steps:

  1. Clone the repository (or install the PyPI package, see step 4):
git clone https://github.com/MarioDeFelipe/sap-bdc-mcp-server.git
cd sap-bdc-mcp-server
  2. Install the Python package in editable mode for development:
pip install -e .
  3. Install optional Node.js dependencies if you plan to work with the Node wrapper (not required for Python usage):
npm install
  4. Ensure dependencies are installed and your environment variables are set (see the mcp_config example for the needed vars). You can also install from PyPI directly:
pip install sap-bdc-mcp-server
  5. Run the MCP server:
python -m sap_bdc_mcp.server

Alternatively, if using the installed script:

sap-bdc-mcp
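Since DATABRICKS_RECIPIENT_NAME is required at startup (see the notes on environment variables below), a small pre-flight check can catch missing configuration before launching the server. This helper is purely illustrative, not part of the package:

```python
import os

REQUIRED = ("DATABRICKS_RECIPIENT_NAME",)
RECOMMENDED = ("DATABRICKS_HOST", "DATABRICKS_TOKEN")

def missing_vars(env=None):
    """Return the required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [v for v in REQUIRED if not env.get(v)]

# Example against an explicit mapping instead of the real environment:
print(missing_vars({"DATABRICKS_HOST": "https://example.cloud.databricks.com"}))
# → ['DATABRICKS_RECIPIENT_NAME']
```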

Additional notes

Tips and common considerations:

  • Environment variables: DATABRICKS_RECIPIENT_NAME is required. DATABRICKS_HOST and DATABRICKS_TOKEN are needed for Databricks access; set them in your .env or your deployment environment.
  • If running inside Databricks notebooks, you can minimize required config by relying on dbutils; otherwise use LocalDatabricksClient with a PAT.
  • The provision_share tool defaults auto_grant to true and skip_if_exists to true; adjust these as needed for your governance policy.
  • For Claude Desktop integration, ensure the same sap-bdc configuration is reflected in your claude_desktop_config.json, including the env mappings.
  • Node.js usage requires Python to be installed; the Node package wraps the Python MCP server and forwards calls.
  • Logging: set LOG_LEVEL to DEBUG during troubleshooting to get more verbose output.
  • Operational note: end-to-end provisioning (provision_share) orchestrates multiple steps (create share, add tables, grant access, register) and may require proper IAM/recipient configuration in both SAP BDC and Databricks.
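The provision_share defaults noted above (auto_grant and skip_if_exists both true) can be overridden per call to match your governance policy. A hypothetical sketch of merging overrides over those defaults; the helper itself is illustrative and not part of the server's API:

```python
# Documented defaults for provision_share (per the notes above).
PROVISION_DEFAULTS = {"auto_grant": True, "skip_if_exists": True}

def build_provision_args(share_name, **overrides):
    """Merge caller overrides over the defaults; illustrative helper only."""
    unknown = set(overrides) - set(PROVISION_DEFAULTS)
    if unknown:
        raise ValueError(f"Unknown options: {sorted(unknown)}")
    return {"share_name": share_name, **PROVISION_DEFAULTS, **overrides}

args = build_provision_args("sales_share", auto_grant=False)
print(args)
# → {'share_name': 'sales_share', 'auto_grant': False, 'skip_if_exists': True}
```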
