
goai

AI SDK for building AI-powered applications in Go

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio \
  --env GOAI_LOG_LEVEL="info" \
  --env GOAI_CONFIG_PATH="path/to/config.yaml" \
  shaharia-lab-goai -- docker run -i shaharia-lab/goai

The GOAI_CONFIG_PATH variable is optional; omit that line if you are not mounting a config file. The `--` separates Claude Code's own flags from the command that launches the server.

How to use

The GoAI MCP server exposes a unified interface to multiple LLM providers, vector embeddings, and vector storage through the Model Context Protocol (MCP). Built on the GoAI library, it provides a standardized way to query, embed, and manage context across providers, so you can mix and match LLMs (e.g., OpenAI, Anthropic, AWS Bedrock) and vector stores behind a consistent API. Through a single MCP-compatible endpoint you can generate LLM responses, create embeddings, and run similarity searches against a vector store.

To use it, run the MCP server via your preferred method (e.g., Docker). Once it is running, send MCP-compliant requests to the server to initialize LLM providers, generate responses, or fetch embeddings. The tooling supports streaming responses and type-safe operations, and its clean abstraction over multiple providers lets you switch providers without changing your application logic. Check the documentation for the MCP endpoint shapes and request/response schemas, and see the example usage in the repository when building a client against the unified interface.

How to install

Prerequisites:

  • Docker installed (or any compatible container runtime) if using the Docker deployment method.
  • Optional: Go installed if you plan to run or build the GoAI MCP components locally, or to inspect code examples.

Installation steps (Docker-based):

  1. Pull and run the GoAI MCP server via Docker:

    docker run -i shaharia-lab/goai

  2. Verify the server is up by checking logs or hitting the MCP endpoint documented in the repository.

  3. Configure your MCP client to point to the running MCP server endpoint. If you are using environment-based config, set appropriate environment variables as described in the additional notes.

Alternative (local Go-based exploration):

  • If you want to run with Go locally, clone the repository and follow the library’s standard Go module setup:

    go mod download
    go build ./...

Note: The MCP server in this repository is primarily accessed via the MCP interface provided by the GoAI library; using Docker is a common and portable approach for running the server in production.

Additional notes

Environment variables and configuration options can vary by provider and deployment method. Common considerations:

  • Provide API keys for the LLM providers you intend to use (e.g., ANTHROPIC_API_KEY, OPENAI_API_KEY) through environment variables or a mounted config file.
  • If using Docker, you may want to map a local config file or secrets into the container and set GOAI_CONFIG_PATH accordingly.
  • For vector storage and embeddings, ensure the chosen backend (e.g., PostgreSQL for vectors) is accessible from the MCP server, and that network/firewall rules allow the required connections.
  • Check MCP request/response schemas in the documentation to ensure compatibility with your client implementation.
  • If streaming is required, enable streaming support in the provider configurations and client side to handle incremental results.
