
iceberg

MCP server from cloudera/iceberg-mcp-server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio cloudera-iceberg-mcp-server \
  --env IMPALA_HOST=coordinator-default-impala.example.com \
  --env IMPALA_PORT=443 \
  --env IMPALA_USER=username \
  --env IMPALA_DATABASE=default \
  --env IMPALA_PASSWORD=password \
  -- uvx --from git+https://github.com/cloudera/iceberg-mcp-server@main run-server

How to use

This MCP server provides read-only access to Iceberg tables via Apache Impala. It exposes two core capabilities: execute_query(query: str), which runs a SQL query on Impala and returns the results as JSON, and get_schema(), which lists all tables in the current Impala database.

You can integrate the server with AI tooling (e.g., Claude Desktop, LangChain, LangGraph) so that LLMs can inspect schemas and run safe, read-only queries against your Iceberg data. Configure it in the mcpServers section of Claude Desktop, or run it directly in your environment using the provided options. With Claude Desktop you can either install directly from GitHub or run a local clone after checkout; both provide the same capabilities over the MCP transport.
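Under the hood, MCP tool invocations are JSON-RPC 2.0 `tools/call` requests sent over the configured transport. The sketch below builds the two requests this server understands; the helper function and the sample query are illustrative, not part of the server's code:

```python
import json

def tool_call(request_id: int, name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as a single line (stdio framing)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# The two tools exposed by this server:
schema_req = tool_call(1, "get_schema", {})
query_req = tool_call(2, "execute_query", {"query": "SELECT * FROM my_table LIMIT 5"})
print(schema_req)
print(query_req)
```

In practice an MCP client (Claude Desktop, or the MCP SDK in a LangChain/LangGraph pipeline) handles this framing for you; the point is that `get_schema` takes no arguments while `execute_query` takes a single `query` string.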

How to install

Prerequisites:

  • A Python runtime, as required by the underlying MCP server (uvx resolves the package and its dependencies from the repository).
  • uvx (the tool runner bundled with uv) installed on your system.
  • Git for cloning from GitHub (if opting for Option 2).

Option 1 — Direct installation from GitHub (Recommended):

  1. Ensure uvx is installed and available in your PATH.
  2. Use the configuration snippet from the README in Claude Desktop or your environment:
{
  "mcpServers": {
    "iceberg-mcp-server": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/cloudera/iceberg-mcp-server@main",
        "run-server"
      ],
      "env": {
        "IMPALA_HOST": "coordinator-default-impala.example.com",
        "IMPALA_PORT": "443",
        "IMPALA_USER": "username",
        "IMPALA_PASSWORD": "password",
        "IMPALA_DATABASE": "default"
      }
    }
  }
}
  3. Start Claude Desktop and ensure it can reach the MCP server via the configured transport (stdio/http/sse).

Option 2 — Local installation after cloning the repository:

  1. Clone the repository and install dependencies as required by the project (follow repository guidelines).
  2. Configure the server to run with uv from the local path:
{
  "mcpServers": {
    "iceberg-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/iceberg-mcp-server",
        "run",
        "src/iceberg_mcp_server/server.py"
      ],
      "env": {
        "IMPALA_HOST": "coordinator-default-impala.example.com",
        "IMPALA_PORT": "443",
        "IMPALA_USER": "username",
        "IMPALA_PASSWORD": "password",
        "IMPALA_DATABASE": "default"
      }
    }
  }
}
  3. Replace /path/to with your actual repository path and ensure Python dependencies are installed as per the repository’s setup instructions.

Prerequisites summary:

  • Access to Impala with host/port/user/password, and the database you want to query (set via IMPALA_DATABASE; the examples use default).
  • uvx installed (or the uvx-compatible runner you plan to use).
  • Network access from your machine to IMPALA_HOST:IMPALA_PORT.
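The prerequisites above can be verified with a quick stdlib-only preflight check before wiring up the server. The function below is a convenience sketch (not part of the project): it confirms the required variables are set and that the host accepts TCP connections on the given port.

```python
import os
import socket

def check_prereqs(env: dict, timeout: float = 3.0) -> list:
    """Verify Impala settings are present and the endpoint accepts TCP connections."""
    problems = []
    for key in ("IMPALA_HOST", "IMPALA_PORT", "IMPALA_USER", "IMPALA_PASSWORD"):
        if not env.get(key):
            problems.append(f"missing {key}")
    if not problems:
        try:
            # A plain TCP connect; it does not authenticate, only checks reachability.
            with socket.create_connection((env["IMPALA_HOST"], int(env["IMPALA_PORT"])),
                                          timeout=timeout):
                pass
        except OSError as exc:
            problems.append(f"cannot reach {env['IMPALA_HOST']}:{env['IMPALA_PORT']} ({exc})")
    return problems

for problem in check_prereqs(dict(os.environ)):
    print(problem)
```

An empty result means the environment looks usable; anything printed points at the missing variable or the unreachable endpoint.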

Additional notes

Tips and caveats:

  • The MCP server is read-only for safety. execute_query will still run arbitrary SQL, so enforce proper permissions and database security practices in your Impala configuration.
  • If using Option 1, ensure the git URL is reachable and that the main branch contains a working run-server entry point.
  • Transport mode can be configured via the MCP_TRANSPORT environment variable (stdio, http, sse). If exposing over HTTP, ensure your network/firewall rules permit access.
  • For sensitive credentials (IMPALA_PASSWORD), consider using a secrets manager or environment variable injection mechanism rather than embedding in configuration files.
  • The examples folder contains integration patterns with LangChain/LangGraph and OpenAI SDK, which can help you wire MCP responses into larger AI workflows.
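When wiring MCP responses into a larger workflow, you may also want to mirror the read-only posture on the client side before a query is ever sent. The allowlist check below is a hypothetical illustration of that idea, not the server's actual implementation:

```python
# Crude allowlist of read-only statement verbs (illustrative, not exhaustive).
READ_ONLY_PREFIXES = ("SELECT", "WITH", "SHOW", "DESCRIBE", "EXPLAIN")

def is_read_only(query: str) -> bool:
    """Accept only statements that start with a read-only verb."""
    first = query.lstrip().split(None, 1)
    return bool(first) and first[0].upper().rstrip(";") in READ_ONLY_PREFIXES

print(is_read_only("SELECT * FROM t"))  # True
print(is_read_only("DROP TABLE t"))     # False
```

A prefix check like this is only a guardrail against accidents; real enforcement belongs in Impala's own authorization (e.g., a user granted SELECT only).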
