
vertex-ai

MCP server from shariqriazz/vertex-ai-mcp-server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio shariqriazz-vertex-ai-mcp-server node build/index.js \
  --env AI_PROVIDER="set to 'vertex' or 'gemini'" \
  --env GEMINI_API_KEY="your Gemini API key (if using AI_PROVIDER='gemini')" \
  --env VERTEX_AI_MODEL_ID="your Vertex AI Gemini model ID" \
  --env READINESS_CHECK_URL="optional health check URL" \
  --env GOOGLE_CLOUD_PROJECT="your-google-cloud-project-id" \
  --env GOOGLE_CLOUD_LOCATION="your-project-location" \
  --env VERTEX_AI_TEMPERATURE="optional, default as configured" \
  --env GOOGLE_CLOUD_API_ENDPOINT="optional, custom Vertex AI endpoint" \
  --env VERTEX_AI_MAX_OUTPUT_TOKENS="optional, default as configured" \
  --env GOOGLE_APPLICATION_CREDENTIALS="path/to/your-service-account.json (if not using ADC)"
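
For reference, here is a sketch of the same command with concrete example values filled in. The project ID, location, and model ID below are placeholders, not values from this server's docs — substitute your own. The command is stored in a variable and echoed so the sketch can be inspected without actually invoking claude:

```shell
# Example values only — my-gcp-project, us-central1, and the model ID
# are placeholders. Echoed rather than executed so it is safe to inspect.
cmd='claude mcp add --transport stdio shariqriazz-vertex-ai-mcp-server node build/index.js \
  --env AI_PROVIDER=vertex \
  --env GOOGLE_CLOUD_PROJECT=my-gcp-project \
  --env GOOGLE_CLOUD_LOCATION=us-central1 \
  --env VERTEX_AI_MODEL_ID=gemini-2.5-pro'
echo "$cmd"
```

Only the variables relevant to the provider you choose need to be passed; the optional ones (endpoint, temperature, max output tokens) can be omitted.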

How to use

This MCP server exposes tools for interacting with Google Cloud Vertex AI Gemini models, covering coding assistance and general queries. Once built and running, it provides answer_query_websearch to ground responses with web results, answer_query_direct to rely on the model's internal knowledge, and a suite of analysis and documentation tools (code_analysis_with_docs, dependency_vulnerability_scan, architecture_pattern_recommendation, etc.). The server streams responses by default for responsiveness and includes lightweight retry logic and minimal safety filters (BLOCK_NONE). To start, make sure the prerequisites are met (Node.js v18+, Bun, and valid Google Cloud credentials) and run the server with the provided command. You can then call any of the MCP tools through the MCP bridge or client you have configured.
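
MCP servers communicate via JSON-RPC 2.0 over stdio. As a rough sketch of what a client sends under the hood (the "query" argument name for answer_query_direct is an assumption — confirm the actual input schema via a tools/list request), messages like these could be piped into node build/index.js:

```shell
# Sketch only: MCP uses JSON-RPC 2.0 over stdio. The argument name
# "query" for answer_query_direct is an assumption; check the tool's
# declared input schema from a tools/list response first.
init='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"demo","version":"0.1.0"}}}'
list='{"jsonrpc":"2.0","id":2,"method":"tools/list"}'
call='{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"answer_query_direct","arguments":{"query":"Summarize Node.js streams."}}}'

# Print the messages; to exercise the server, pipe them in instead:
#   printf '%s\n' "$init" "$list" "$call" | node build/index.js
printf '%s\n' "$init" "$list" "$call"
```

In practice, an MCP client such as Claude Code performs this handshake for you; the raw messages are shown only to illustrate the protocol shape.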

How to install

Prerequisites:

  • Node.js v18+ (or Bun installed as a Node-compatible runner)
  • Bun (for installation and build steps)
  • Google Cloud Project with Vertex AI API enabled and proper credentials

Installation steps:

  1. Clone or place the project files in your desired directory
  2. Install dependencies:

```bash
bun install
```

  3. Configure environment variables:
  • Copy the example env file if provided (e.g., .env.example) to .env
  • Edit .env with the required values:
    • AI_PROVIDER=vertex or gemini
    • GOOGLE_CLOUD_PROJECT=your-project-id
    • GOOGLE_CLOUD_LOCATION=your-location
    • GOOGLE_APPLICATION_CREDENTIALS=path/to/credentials.json (if not using ADC)
    • GEMINI_API_KEY=your-gemini-api-key (if AI_PROVIDER=gemini)
  4. Build the server:

```bash
bun run build
```

This outputs build/index.js, which is the runnable entry point.

  5. Run the MCP server:

```bash
node build/index.js
```

Optionally, you can use NPX or Smithery-based installation once published:

  • bunx (or npx): bunx vertex-ai-mcp-server (environment variables must be set)
  • Smithery: bunx -y @smithery/cli install @shariqriazz/vertex-ai-mcp-server --client claude

Notes:
- Ensure authentication is available, either via ADC or a service account key file (GOOGLE_APPLICATION_CREDENTIALS).
- If you use Vertex AI, set AI_PROVIDER=vertex; GOOGLE_CLOUD_PROJECT is required.
- If you use Gemini, set AI_PROVIDER=gemini; GEMINI_API_KEY is required.
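
Putting the notes above together, a minimal .env for the Vertex path might look like this (all values are placeholders):

```shell
# Sketch of a minimal .env for AI_PROVIDER=vertex — values are placeholders.
AI_PROVIDER=vertex
GOOGLE_CLOUD_PROJECT=my-gcp-project
GOOGLE_CLOUD_LOCATION=us-central1
# Only needed when not using Application Default Credentials (ADC):
# GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
```

For the Gemini path, replace the Google Cloud lines with AI_PROVIDER=gemini and GEMINI_API_KEY=your-gemini-api-key.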

Additional notes

Tips and common issues:

  • If the server fails to start due to missing environment variables, verify .env values and that they are loaded in your process environment.
  • For Vertex AI, the GOOGLE_CLOUD_LOCATION should match the location of your Vertex AI resources.
  • The default streaming behavior generally provides faster responses; if you encounter timeouts, you may adjust max output tokens or streaming flags in environment configuration.
  • If using Gemini, securely manage GEMINI_API_KEY and restrict access to the credentials.
  • Use the provided MCP tools via your MCP bridge client to leverage websearch grounding, internal knowledge querying, code analysis, and docs snippet extraction.
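
When authentication fails, it helps to confirm which credential path the server process actually sees before starting it. A small sketch (the key-file path is a placeholder; for local development, ADC needs no key file at all):

```shell
# Placeholder path — point this at your real service-account key file.
# Alternative for local development: run "gcloud auth application-default login"
# once and omit GOOGLE_APPLICATION_CREDENTIALS entirely (ADC).
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/vertex-sa.json"
echo "Credentials file: $GOOGLE_APPLICATION_CREDENTIALS"
```

Run the server from the same shell so the exported variable is inherited by the node process.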
