vertex-mcp-chatbot
Vertex AI chatbot with MCP integration
claude mcp add --transport stdio intertwine-vertex-mcp-chatbot python main.py \
  --env ANTHROPIC_API_KEY="your-anthropic-api-key (if using Claude via public API)" \
  --env GOOGLE_CLOUD_PROJECT="your-gcp-project-id" \
  --env CLAUDE_VERTEX_ENABLED="true or false (default true)" \
  --env GOOGLE_CLOUD_LOCATION="us-east1"
How to use
This MCP server implements an interactive Vertex AI chatbot that can use MCP tools autonomously. It supports multiple providers (Claude via Vertex AI, or Gemini) and loads MCP tool definitions from an mcp_config.json file so it can discover and invoke tools during conversations. Run the CLI in a terminal, connect to the MCP server, and start conversations in which the model can automatically discover and execute tools, access resources, and render responses with markdown formatting. The server is designed to work with Vertex AI credentials and can fall back to Anthropic's public API if configured. Use the provided commands to start the chat, choose a provider, and enable verbose MCP logging for troubleshooting.
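The mcp_config.json mentioned above typically maps server names to launch commands. A hypothetical minimal sketch is shown below; the server name, command, and args are placeholders, not taken from the project:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "uvx",
      "args": ["mcp-server-filesystem", "/tmp"],
      "env": {}
    }
  }
}
```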
How to install
Prerequisites:
- Python 3.10+ installed
- uv (uvx) package manager installed
- Google Cloud Vertex AI access with MCP-capable regions and appropriate credentials
- Google Cloud CLI (gcloud) installed and authenticated
Installation steps:
- Clone the repository:
git clone https://github.com/intertwine/vertex-mcp-chatbot.git
cd vertex-mcp-chatbot
- Quick setup (installs dependencies and creates .env):
make setup
- Configure environment:
# Edit .env with your GCP project settings, or export variables here
nano .env
- Authenticate with Google Cloud:
make auth
- Run the chatbot (default Claude provider):
make run
Optional: Run specific providers or verbose modes as shown in the README, e.g., make run-gemini or make run-verbose.
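The .env file edited in the configure step would hold settings like the following; the variable names come from the registration command above, and all values are placeholders:

```shell
# .env — placeholder values; adjust for your project
GOOGLE_CLOUD_PROJECT=your-gcp-project-id
GOOGLE_CLOUD_LOCATION=us-east1
CLAUDE_VERTEX_ENABLED=true
# Only needed when using Anthropic's public API instead of Vertex AI
ANTHROPIC_API_KEY=your-anthropic-api-key
```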
Additional notes
Tips and troubleshooting:
- Ensure uv is installed and accessible; the project uses uv sync to set up dependencies and environment files.
- If Claude is configured through Vertex AI, verify CLAUDE_VERTEX_ENABLED, CLAUDE_VERTEX_PROJECT, and CLAUDE_VERTEX_LOCATION environment variables.
- For direct Anthropic API access, provide ANTHROPIC_API_KEY in the environment.
- MCP configuration requires an mcp_config.json in the project root to enable automatic tool discovery and execution.
- If you encounter MCP logging noise, run with make run-verbose or make run-debug for more detailed logs.
- Persistent conversation history is stored on disk; ensure the working directory has write permissions.
- Gemini provider usage differs: tool discovery is manual, whereas Claude via Vertex AI supports automatic MCP tool calling.
Related MCP Servers
mcp-ical
A Model Context Protocol Server that allows you to interact with your MacOS Calendar through natural language.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
oxylabs
Official Oxylabs MCP integration
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Control Protocol (MCP).
code-review-assistant
The Code Review Assistant is a simple multi-agent system built using the Model Context Protocol (MCP) and LangChain. Its purpose is to provide automated, preliminary feedback on code snippets, including syntax checking, code explanation, and improvement suggestions. It can be integrated with MCP-compatible clients like Cursor IDE or Claude Desktop.
gcp
MCP server for Google Cloud Platform - Complete GCP services integration for GenAI