godoctor
A Model Context Protocol server for Go developers
claude mcp add --transport stdio danicat-godoctor -- \
  docker run -i \
  --env GEMINI_API_KEY="Your Gemini API key (if using Gemini API)" \
  --env GOOGLE_CLOUD_PROJECT="your-project-id (required for Vertex AI)" \
  --env GOOGLE_CLOUD_LOCATION="us-central1 (or your region)" \
  --env GOOGLE_GENAI_USE_VERTEXAI="true or false (set to true if using Vertex AI)" \
  godoctor-image
How to use
GoDoctor is an MCP server designed to assist Go developers with navigation, editing, analysis, and modernization of Go codebases. It exposes domain-specific tools such as list_files for project exploration, smart_read for content and outline inspection, file_create and smart_edit for code creation and targeted edits, and Go toolchain integrations like smart_build, add_dependency, read_docs, and modernize_code. Additionally, code_review provides expert AI analysis for idiomatic correctness and best practices. To use the server, run it in your preferred environment and connect your MCP client to the exposed endpoint. Once connected, you can issue tool commands corresponding to each capability (e.g., list_files to survey a repository, smart_build to verify build health, or read_docs to fetch package documentation).
The tools are designed to work cohesively: smart_build can automatically tidy modules and format code, add_dependency fetches package docs as it updates go.mod, and modernize_code helps migrate legacy patterns to modern Go practices. The code_review tool offers AI-driven feedback on correctness and style, making it useful for code audits and refactoring sessions. If you’re deploying locally via Docker, you’ll typically interact with the MCP transport over HTTP or stdio depending on configuration, and you can enable or disable tools via flags as needed.
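In stdio mode the server speaks JSON-RPC over stdin/stdout, so you can sanity-check it without a full MCP client. A minimal sketch, assuming a locally built binary at ./bin/godoctor (the binary path and the exact shape of the reply depend on your build and MCP protocol version):

```shell
# Send an MCP initialize request over stdio and print the server's reply.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | ./bin/godoctor
```

A JSON response containing the server's name and capabilities indicates the stdio transport is working.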
How to install
Prerequisites:
- Go (for building GoDoctor from source if you choose to compile locally)
- Docker (for containerized deployment) or your preferred runtime (e.g., Go binary or cloud deployment)
- Optional Gemini API key or Vertex AI setup if you plan to use external AI services
Option 1: Run via Docker (recommended for MCP server workflows)
- Ensure Docker is installed and running.
- Pull or build the GoDoctor image (example with a prebuilt image):
  docker pull godoctor-image:latest
  Alternatively, build your own image from the Dockerfile, if one is provided.
- Run the container:
  docker run -i godoctor-image:latest
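If the server needs AI-backend credentials, pass them as environment variables at container start. A hedged sketch (the image name, key, project, and region are placeholders for your own values):

```shell
# Gemini API variant:
docker run -i \
  --env GEMINI_API_KEY="your-gemini-api-key" \
  godoctor-image:latest

# Vertex AI variant:
docker run -i \
  --env GOOGLE_GENAI_USE_VERTEXAI=true \
  --env GOOGLE_CLOUD_PROJECT="your-project-id" \
  --env GOOGLE_CLOUD_LOCATION="us-central1" \
  godoctor-image:latest
```

Note that in docker run, --env flags must come before the image name; anything after the image is treated as the container command.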
Option 2: Build and run locally from source (Go)
- Prerequisites: Go installed, Git
- Clone the repository and navigate into it:
  git clone https://github.com/danicat/godoctor.git
  cd godoctor
- Build the project (if a Makefile is provided):
make build
or if using a standard Go build:
  go build -o bin/godoctor ./...
- Run the server (stdio mode by default):
  ./bin/godoctor
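After building, you can confirm the binary starts and see what it exposes, assuming the --list-tools flag mentioned under Configuration tips is available in your build:

```shell
# Confirm the binary runs and inspect the registered tools.
./bin/godoctor --list-tools
```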
Option 3: Deploy to Google Cloud Run (containerized)
- Ensure Google Cloud SDK is installed and authenticated.
- Use the provided deploy scripts (as described in the Cloud Deployment section of the README) to build, push, and deploy the container.
- Retrieve the Service URL from the deployment output and connect your MCP client to that endpoint.
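If you prefer not to use the provided scripts, the same result follows the standard Cloud Run flow. A hedged sketch only (project ID, region, service name, and image path are placeholders; the repository's own deploy scripts may differ):

```shell
PROJECT_ID="your-project-id"
REGION="us-central1"

# Build and push the image with Cloud Build.
gcloud builds submit --tag "gcr.io/${PROJECT_ID}/godoctor"

# Deploy to Cloud Run, then print the service URL for your MCP client.
gcloud run deploy godoctor \
  --image "gcr.io/${PROJECT_ID}/godoctor" \
  --region "${REGION}" \
  --project "${PROJECT_ID}"
gcloud run services describe godoctor \
  --region "${REGION}" --format="value(status.url)"
```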
Additional notes
Environment variables used in deployment contexts:
- GEMINI_API_KEY or GOOGLE_API_KEY for Gemini access
- GOOGLE_GENAI_USE_VERTEXAI, GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_LOCATION when using Vertex AI
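For a local (non-Docker) run, the same variables can be exported in the shell before starting the server; the values below are placeholders:

```shell
# Gemini API backend:
export GEMINI_API_KEY="your-gemini-api-key"

# Or, Vertex AI backend:
export GOOGLE_GENAI_USE_VERTEXAI=true
export GOOGLE_CLOUD_PROJECT="your-project-id"
export GOOGLE_CLOUD_LOCATION="us-central1"
```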
Common issues:
- If using stdio mode, ensure the MCP client is configured to communicate over stdio and the server is started in stdio mode.
- When enabling specific tools, verify network access and API keys for the AI backends.
- For Cloud Run deployments, ensure proper IAM permissions (Artifact Registry Admin, Cloud Run Admin, Secret Manager Admin, Vertex AI User as applicable).
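The roles listed above can be granted with gcloud. A hedged sketch (the account and project are placeholders, and your deployment may not need every role):

```shell
PROJECT_ID="your-project-id"
MEMBER="user:you@example.com"

# Grant each role needed for build, deploy, secrets, and Vertex AI access.
for ROLE in roles/artifactregistry.admin roles/run.admin \
            roles/secretmanager.admin roles/aiplatform.user; do
  gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
    --member="${MEMBER}" --role="${ROLE}"
done
```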
Configuration tips:
- Use the --list-tools flag to inspect available tools and their descriptions.
- Use --agents to output agent/system prompts for integrator guidance.
- Keep Gemini/Vertex AI credentials secure using Secret Manager where possible.
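One way to keep the API key out of shell history and deployment manifests is Secret Manager. A hedged sketch (the secret name, service name, and region are placeholders):

```shell
# Store the key once, reading the value from stdin.
printf '%s' "your-gemini-api-key" | \
  gcloud secrets create godoctor-gemini-key --data-file=-

# Expose it to the Cloud Run service as an environment variable.
gcloud run services update godoctor \
  --region us-central1 \
  --update-secrets GEMINI_API_KEY=godoctor-gemini-key:latest
```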
Related MCP Servers
trpc-agent-go
trpc-agent-go is a powerful Go framework for building intelligent agent systems using large language models (LLMs) and tools.
station
Station is our open-source runtime that lets teams deploy agents on their own infrastructure with full control.
tiger-cli
Tiger CLI is the command-line interface for Tiger Cloud. It includes an MCP server for helping coding agents write production-level Postgres code.
gopls
MCP server for Go project development: expands an AI code agent's ability boundary with semantic understanding and deterministic information about Go projects.
kubernetes
A Model Context Protocol (MCP) server for the Kubernetes API.
gcp-cost
💰 An MCP server that enables AI assistants to estimate Google Cloud costs, powered by Cloud Billing Catalog API and built with Genkit for Go