
mcphost

MCP server bridging Ollama, OpenAI, and Anthropic

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio jinsonwu-mcphost-server go run . \
  --env PORT="Port for HTTP server (default 8115)" \
  --env ENV_FILE="Path to environment file or configuration; uses .example.env as template"

Note: the quoted strings above are descriptions of what each variable expects, not literal values; substitute real values for your setup before running the command.
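For example, assuming the defaults described later in this README (port 8115 and a .env file copied from .example.env), the command with concrete values might look like:

```
claude mcp add --transport stdio jinsonwu-mcphost-server go run . \
  --env PORT=8115 \
  --env ENV_FILE=.env
```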

How to use

mcphost exposes an HTTP API for interacting with a local LLM (via Ollama) through simple HTTP requests. In server mode, you send requests to the configured endpoints to generate text, complete prompts, or chat with your local language model through the MCP-host bridge. The server runs in HTTP server mode by default and listens on port 8115 unless overridden by environment configuration. The tooling bridges local LLM calls to HTTP, so developers can build chat interfaces or automation around Ollama without managing local models directly in every client.
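As a rough sketch of the request flow — the endpoint path and payload fields below are illustrative assumptions, since the actual API surface is defined by the project's routes — a generation request against a running server could look like:

```
# Assumes the server is running on the default port 8115.
# /api/generate, "model", and "prompt" are hypothetical; check the
# server's registered routes for the real endpoints and fields.
curl -s http://localhost:8115/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "prompt": "Say hello"}'
```

Adjust the path, model name, and payload to match the endpoints the server actually registers.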

Typical usage involves starting the server after configuring environment variables, then issuing HTTP requests to endpoints provided by the server to perform tasks such as generating text, continuing conversations, or running model-specific prompts. The setup relies on environment-based configuration and can also be wired through an mcp.json configuration or command-line flags as noted in the README. This makes it straightforward to integrate into existing MCP-based tooling or orchestration pipelines that expect an HTTP-accessible MCP service.
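For reference, MCP clients commonly describe servers with an mcp.json shape like the one below; the exact schema this project reads may differ, so treat the keys as a sketch rather than a confirmed configuration:

```json
{
  "mcpServers": {
    "jinsonwu-mcphost-server": {
      "command": "go",
      "args": ["run", "."],
      "env": {
        "PORT": "8115",
        "ENV_FILE": ".env"
      }
    }
  }
}
```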

If you need to switch to command-line mode, you can disable server_mode in the code and rebuild to operate without the HTTP endpoints, using the internal CLI instead. The project is designed to bridge local LLM access with HTTP for convenient tooling and automation.

How to install

Prerequisites:

  • Go installed (1.20+ recommended)
  • Access to a local LLM via Ollama if required by your workflow

Installation steps:

  1. Clone the repository:

     git clone <repository-url>
     cd mcphost

  2. Copy the example environment file and configure it: cp .example.env .env

    Edit .env to customize PORT, Ollama endpoint, and other settings

  3. Build the server: go build

  4. Run the server: ./mcphost-server

Optional: If you prefer to run via go run (not recommended for production): go run .

Environment variables you may configure in .env or via your environment:

  • PORT: Port on which the HTTP server listens (default 8115)
  • Other application-specific settings as defined by the project (e.g., Ollama endpoint, timeouts)
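A minimal .env might look like the following. PORT is documented above; the Ollama endpoint variable name is a hypothetical placeholder — check .example.env for the names the project actually reads:

```
# Port for the HTTP server (documented default)
PORT=8115

# Hypothetical variable name for the Ollama endpoint; confirm in .example.env
OLLAMA_HOST=http://localhost:11434
```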

Notes:

  • The README indicates server mode is enabled by default. Ensure your environment is configured for HTTP access to Ollama if required.

Additional notes


  • The server defaults to port 8115 unless overridden by configuration. If you change ports, update any clients accordingly.
  • Environment-based configuration is supported via .example.env/.env, mcp.json, or command-line flags. Ensure required variables (like Ollama endpoint) are correctly set to avoid runtime errors.
  • If you switch to command-line mode, rebuild the project as described in the README to disable server-mode HTTP endpoints.
  • Since the MCP config in this documentation uses a Go-based execution approach, ensure your environment has Go installed and that you run the built binary in an appropriate environment (permissions, PATH).
  • When debugging, check for common issues such as port conflicts, missing Ollama services, or misconfigured environment variables.
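To rule out the most common issues quickly, a few checks (assuming the default ports — 8115 for mcphost and 11434 for a stock local Ollama install):

```
# Is something already listening on the server's port?
lsof -nP -iTCP:8115 -sTCP:LISTEN

# Is Ollama reachable? /api/tags lists the locally available models.
curl -s http://localhost:11434/api/tags

# Are the expected variables set (non-comment lines) in the env file?
grep -v '^#' .env
```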
