
vertex-mcp-chatbot

Vertex AI chatbot with MCP integration

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio intertwine-vertex-mcp-chatbot python main.py \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key (if using Claude via public API)" \
  --env GOOGLE_CLOUD_PROJECT="your-gcp-project-id" \
  --env CLAUDE_VERTEX_ENABLED="true or false (default true)" \
  --env GOOGLE_CLOUD_LOCATION="us-east1"

How to use

This MCP server implements an interactive Vertex AI chatbot that can use MCP tools autonomously. It supports multiple providers (Claude via Vertex AI, or Gemini) and loads tool definitions from an mcp_config.json file so it can discover and invoke MCP tools during conversations.

Run the CLI in a terminal, connect to the MCP server, and start a conversation: Claude can automatically discover and execute tools, access resources, and render responses with markdown formatting. The server is designed to work with Vertex AI credentials and can fall back to Anthropic's public API if configured. Use the provided commands to start the chat, choose a provider, and enable verbose MCP logging for troubleshooting.
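
The exact schema of mcp_config.json is not documented here; the sketch below assumes the common "mcpServers" layout (a command plus args per stdio server) and the server name "filesystem" and `uvx` invocation are purely illustrative:

```python
import json

# Hypothetical mcp_config.json contents -- the exact schema expected by
# vertex-mcp-chatbot is an assumption, modeled on the widely used
# "mcpServers" layout for stdio servers.
example_config = {
    "mcpServers": {
        "filesystem": {
            "command": "uvx",
            "args": ["mcp-server-filesystem", "/tmp"],
        }
    }
}

def server_names(config: dict) -> list[str]:
    """Return the MCP server names declared in a config dict."""
    return sorted(config.get("mcpServers", {}))

# Serialize as you would when writing mcp_config.json to the project root.
text = json.dumps(example_config, indent=2)
print(server_names(json.loads(text)))
```

Check the repository's README for the authoritative config format before relying on this shape.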

How to install

Prerequisites:

  • Python 3.10+ installed
  • uv (uvx) package manager installed
  • Google Cloud Vertex AI access with MCP-capable regions and appropriate credentials
  • Google Cloud CLI (gcloud) installed and authenticated
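
The prerequisite checks above can be sketched as a small stdlib-only preflight script; the tool names (uv, gcloud) and the 3.10 floor come from the list, everything else is ordinary Python:

```python
import shutil
import sys

def missing_prereqs(tools=("uv", "gcloud"), which=shutil.which):
    """Return prerequisite CLI tools that are not on PATH."""
    return [tool for tool in tools if which(tool) is None]

def python_ok(version=sys.version_info):
    """The project asks for Python 3.10+."""
    return version >= (3, 10)

if __name__ == "__main__":
    if not python_ok():
        sys.exit("Python 3.10+ is required")
    for tool in missing_prereqs():
        print(f"warning: {tool} not found on PATH")
```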

Installation steps:

  1. Clone the repository:
git clone https://github.com/intertwine/vertex-mcp-chatbot.git
cd vertex-mcp-chatbot
  2. Quick setup (installs dependencies and creates .env):
make setup
  3. Configure environment:
# Edit .env with your GCP project settings, or export the variables in your shell
nano .env
  4. Authenticate with Google Cloud:
make auth
  5. Run the chatbot (default Claude provider):
make run

Optional: Run specific providers or verbose modes as shown in the README, e.g., make run-gemini or make run-verbose.
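
To illustrate step 3, here is a minimal sketch of how the .env file created by `make setup` might be read; the variable names match the installation command above, but the parser is a simplification (no quoting rules or multi-line values) rather than what the project actually uses:

```python
# Simplified .env reader -- illustrative only; the real project likely
# uses a dotenv library with fuller parsing rules.
def load_env(text: str) -> dict[str, str]:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

sample = '''
# GCP settings
GOOGLE_CLOUD_PROJECT="my-project"
GOOGLE_CLOUD_LOCATION=us-east1
CLAUDE_VERTEX_ENABLED=true
'''
print(load_env(sample))
```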

Additional notes

Tips and troubleshooting:

  • Ensure uv is installed and accessible; the project uses uv sync to set up dependencies and environment files.
  • If Claude is configured through Vertex AI, verify CLAUDE_VERTEX_ENABLED, CLAUDE_VERTEX_PROJECT, and CLAUDE_VERTEX_LOCATION environment variables.
  • For direct Anthropic API access, provide ANTHROPIC_API_KEY in the environment.
  • MCP configuration requires an mcp_config.json in the project root to enable automatic tool discovery and execution.
  • If you encounter MCP logging noise, run with make run-verbose or make run-debug for more detailed logs.
  • Persistent conversation history is stored on disk; ensure the working directory has write permissions.
  • Gemini provider usage differs: tool discovery is manual, whereas Claude via Vertex AI supports automatic MCP tool calling.
