
ollama-mcp-bridge

Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context Protocol) servers. Fully compatible, transparent, and developer-friendly, it is ideal for building powerful local LLM applications, AI agents, and custom chatbots.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio --env MCP_LOG_LEVEL=ERROR jonigl-ollama-mcp-bridge -- \
  uv --directory ./mock-weather-mcp-server run main.py

How to use

ollama-mcp-bridge provides an API layer that sits in front of the Ollama API and aggregates tools from multiple MCP servers. At startup, the bridge loads the configured MCP servers and makes all their tools available to Ollama. When you send a chat request to the bridge's /api/chat endpoint, the bridge proxies the request to Ollama and injects the list of all available tools.

If Ollama decides to invoke one or more tools, the bridge routes those tool calls to the appropriate MCP servers, collects the responses, and feeds them back to Ollama in a loop until no further tool calls are needed. The result is streamed back to the client in real time, with tool outputs integrated into the final answer. This enables Ollama to access a wide set of tools from all connected MCP servers transparently, without exposing the underlying tool orchestration to the client.
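From the client's point of view, the flow above means you talk to the bridge exactly as you would talk to Ollama. A minimal sketch, using only the standard library; the bridge URL, port, and model name are illustrative assumptions, not values taken from the project's docs:

```python
import json
from urllib import request

BRIDGE_URL = "http://localhost:8000"  # assumption: wherever the bridge is listening


def build_chat_request(model, user_message):
    """Build the same /api/chat payload you would send to Ollama directly.

    The bridge accepts the standard Ollama payload; tool injection and
    tool-call routing happen server-side, so nothing tool-related
    appears in the client's request.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": True,  # the bridge streams results back in real time
    }


def chat(model, user_message):
    """Send a chat request through the bridge and print the streamed answer."""
    payload = json.dumps(build_chat_request(model, user_message)).encode()
    req = request.Request(
        f"{BRIDGE_URL}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # As with Ollama itself, each line of the streamed response is a JSON object.
    with request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            print(chunk.get("message", {}).get("content", ""), end="")
```

Calling `chat("qwen3:0.6b", "What's the weather like?")` would then print the answer as it streams, with any tool calls resolved by the bridge behind the scenes.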

How to install

Prerequisites:

  • Python 3.10 or newer
  • Ollama installed and running (local or remote)
  • Internet access to install packages

Install the bridge from PyPI using pip:

pip install --upgrade ollama-mcp-bridge

Or install from source (example):

git clone https://github.com/jonigl/ollama-mcp-bridge.git
cd ollama-mcp-bridge
uv sync  # or your preferred Python environment tool

Run the bridge:

ollama-mcp-bridge

If you are using Docker Compose, use the provided docker-compose.yml as described in the README to run the bridge alongside Ollama. The bridge expects a configuration file named mcp-config.json in its working directory (you can customize network and environment in that file).
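This page does not show the contents of mcp-config.json. As a hedged sketch, MCP clients commonly use an mcpServers layout like the one below; the server name, command, arguments, and env values here are illustrative assumptions, so check the project's README for the authoritative schema:

```json
{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": ["--directory", "./mock-weather-mcp-server", "run", "main.py"],
      "env": { "MCP_LOG_LEVEL": "ERROR" }
    }
  }
}
```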

Additional notes

Tips and common considerations:

  • You can connect multiple MCP servers (local stdio, HTTP, or SSE); their tools are automatically aggregated for Ollama to use.
  • Use toolFilter to limit which tools are exposed from a given server, aiding security and performance.
  • Environment variables such as CORS_ORIGINS (for Docker) and OLLAMA_PROXY_TIMEOUT can be configured to tune the bridge's behavior when proxying to Ollama.
  • The bridge supports streaming responses, so clients can receive incremental thinking messages and tool results in real time.
  • If you encounter connectivity issues, verify that the MCP server URLs or local processes are reachable and that the configuration file (mcp-config.json) is valid JSON with the necessary fields.
  • Version checks and upgrade instructions are available to ensure you’re running a compatible bridge version with your Ollama deployment.
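The configuration check in the troubleshooting bullet above can be automated. A minimal sketch, assuming the common mcpServers layout; the filename default and the required keys are assumptions about the schema, not taken from the project's docs:

```python
import json
from pathlib import Path


def validate_mcp_config(path="mcp-config.json"):
    """Return a list of problems found in an MCP config file (empty list = OK)."""
    problems = []
    try:
        config = json.loads(Path(path).read_text())
    except FileNotFoundError:
        return [f"{path} not found"]
    except json.JSONDecodeError as exc:
        return [f"{path} is not valid JSON: {exc}"]

    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        problems.append("missing or empty 'mcpServers' section")
        return problems

    for name, spec in servers.items():
        # Assumed convention: stdio servers need a command, HTTP/SSE servers a URL.
        if "command" not in spec and "url" not in spec:
            problems.append(f"server '{name}' has neither 'command' nor 'url'")
    return problems
```

Running this before starting the bridge catches the most common misconfigurations (bad JSON, missing sections) without waiting for a startup failure.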
