bigquery
A practical MCP server for large BigQuery datasets, with optional vector search. It keeps LLM context small while staying fast and allowing only safe read-only actions.
claude mcp add --transport stdio pvoo-bigquery-mcp uvx bigquery-mcp --project YOUR_PROJECT --location US
How to use
This MCP server provides a read‑only, LLM-friendly interface to navigate and inspect Google BigQuery data. It exposes tools for discovering datasets and tables, fetching table details (including optional schema and sample data), and safely executing read queries. When used with vector search, you can treat BigQuery as a vector store by loading embeddings and performing semantic similarity queries against embedding-enabled tables. By default, the server is optimized for minimal context usage, listing only dataset and table names, with detailed metadata and schemas retrieved on demand to keep prompts compact for the LLM.
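Semantic similarity over embedding-enabled tables ultimately reduces to ranking stored vectors by their similarity to a query vector. The sketch below illustrates that math in plain Python; it is an illustration only, not the server's actual vector_search implementation, which runs the search inside BigQuery.

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def top_k(query: list[float], rows: list[tuple[str, list[float]]], k: int = 3):
    """Rank stored (id, embedding) rows by similarity to the query vector."""
    scored = [(row_id, cosine_similarity(query, emb)) for row_id, emb in rows]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]
```

In practice the embeddings live in a BigQuery table and the ranking happens server-side, so only the top-k row identifiers and scores need to enter the LLM's context.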
Key capabilities include list_datasets and list_tables for hierarchical discovery, get_table for deep table insights (schema, sample rows, and fill-rate metrics), run_query for safe SELECT/WITH statements, and optional vector_search for embedding-based retrieval. The configuration supports restricting datasets, limiting result counts, and tuning sampling and guardrails to balance speed, cost, and accuracy. To start, provide your Google Cloud project and location, and optionally enable vector search by supplying embedding model and table targets. The server can be run via uvx (recommended) or via the local development flow if you're cloning the repository.
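A "safe SELECT/WITH" guardrail can be pictured as a statement-type check applied before any query reaches BigQuery. The following is a simplified sketch of that idea, not the server's actual implementation, which may be stricter:

```python
import re

# Keywords that indicate a statement mutates data or schema.
_FORBIDDEN = re.compile(
    r"\b(insert|update|delete|merge|drop|create|alter|truncate|grant)\b",
    re.IGNORECASE,
)


def is_read_only(sql: str) -> bool:
    """Allow only statements that start with SELECT or WITH and
    contain no mutating keywords."""
    stripped = sql.strip().rstrip(";").strip()
    if not re.match(r"^(select|with)\b", stripped, re.IGNORECASE):
        return False
    return not _FORBIDDEN.search(stripped)
```

A keyword blocklist like this is deliberately conservative: it also rejects a WITH clause that smuggles in a DML statement, at the cost of rejecting the occasional SELECT that merely mentions a reserved word.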
How to install
Prerequisites:
- Python 3.10+
- A Google Cloud account with BigQuery access
- The uv package manager (uvx is included) for running MCP servers

Installation steps (Option A: Quick start from PyPI using uvx):
1) Authenticate with Google Cloud (if needed):
   gcloud auth application-default login
2) Install and run the MCP server via uvx:
   uvx bigquery-mcp --project YOUR_PROJECT --location US

Installation steps (Option B: Local development):
1) Clone the repository:
   git clone https://github.com/pvoo/bigquery-mcp.git
   cd bigquery-mcp
2) Install dependencies (via your preferred environment):
   python3 -m venv venv && source venv/bin/activate
   pip install -r requirements.txt  # if a requirements file exists
3) Configure the environment (copy the example and edit it):
   cp .env.example .env  # then set GCP_PROJECT_ID, BIGQUERY_LOCATION, and other vars
4) Run or inspect locally:
   make run      # Start the server
   make inspect  # Open the MCP inspector
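Step 3 of the local flow relies on a .env file of KEY=value pairs. If you want to load one without extra dependencies, a minimal parser might look like the sketch below (this assumes simple KEY=value lines; the repository itself may load the file differently, e.g. via a dotenv library):

```python
import os


def load_env_file(path: str) -> dict[str, str]:
    """Parse a simple KEY=value .env file, skipping blank lines and
    comments, and export the values into os.environ."""
    values: dict[str, str] = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip().strip('"')
    os.environ.update(values)
    return values
```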
Additional notes
Tips and common considerations:
- By default, the server is read-only and guarded to allow only safe SELECT/WITH queries. Use LIMIT in your queries to control result sizes and cost.
- You can fine-tune discovery and detail levels with dataset and table filters, and adjust how much sample data or metadata is returned.
- If you enable vector search, ensure the embeddings are already loaded in BigQuery and that you pass the embedding-model and embedding-tables options when starting the server.
- Environment variables mirror the CLI options (e.g., GCP_PROJECT_ID, BIGQUERY_LOCATION, BIGQUERY_SAMPLE_ROWS); use them to simplify deployment in containerized or cloud environments.
- For troubleshooting, verify that your authentication method (ADC or a key file) is correctly configured and that the BigQuery API is enabled for the project.
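When environment variables mirror CLI options, servers typically resolve each setting with a fixed precedence: explicit CLI flag first, then the environment variable, then a built-in default. A hedged sketch of that precedence (variable names follow the examples above; the default values here are hypothetical, not the server's documented defaults):

```python
import os


def resolve_setting(cli_value, env_var: str, default):
    """CLI flag wins, then the environment variable, then the default."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_var, default)


# Example: resolve location and sample-row count for a deployment
# where no CLI flags were passed (hypothetical defaults).
location = resolve_setting(None, "BIGQUERY_LOCATION", "US")
sample_rows = int(resolve_setting(None, "BIGQUERY_SAMPLE_ROWS", "5"))
```

This pattern is what makes the same invocation work both locally (flags) and in a container (environment only).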
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
cursor-notebook
Model Context Protocol (MCP) server designed to allow AI agents within Cursor to interact with Jupyter Notebook (.ipynb) files
ollama
An MCP Server for Ollama
mcp-rest-api
A TypeScript-based MCP server that enables testing of REST APIs through Cline. This tool allows you to test and interact with any REST API endpoints directly from your development environment.
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.
voice-status-report
A Model Context Protocol (MCP) server that provides voice status updates using OpenAI's text-to-speech API.