
Ollama

MCP server tools for Ollama

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio muah1987-ollama-mcp-server python -m ollama_mcp_server

How to use

This MCP server is a Python-based, DevOps-focused MCP implementation with full Ollama integration. It exposes tools across several categories (ollama, infrastructure, git, github, browser, and mcp_gateway) and supports multiple Ollama endpoints for parallel, specialized reasoning. To use it, clone the repository, install the dependencies, and run the Python module entry point to start the MCP server. Once the server is running, you can dispatch tool actions through the MCP gateway, query GitHub via the included API tools, automate browser tasks with the Playwright-powered capabilities, and orchestrate multiple Ollama-backed agents for scalable responses.

The server exposes all of its tools through a consistent MCP interface. You can request context, run commands, and manage state via the MCP gateway, which handles multi-server orchestration. With Ollama integration enabled, the server can route requests to specialized Ollama endpoints, using local or remote LLMs as needed. This makes it well suited to DevOps workflows such as repository management, infrastructure queries, and browser automation, all coordinated through MCP.
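The Ollama routing described above assumes a reachable Ollama HTTP API (by default at http://localhost:11434). As a rough, standalone illustration of what a request to such an endpoint looks like, independent of this server's internals, here is a minimal sketch using only the standard library; the model name is a placeholder:

```python
import json
import urllib.request

def build_generate_request(endpoint: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    `endpoint` is the base URL of an Ollama instance, e.g. http://localhost:11434.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{endpoint}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (but do not send) a request against a local instance.
req = build_generate_request("http://localhost:11434", "llama3", "Say hello.")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would require a running Ollama instance with the named model pulled.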

How to install

Prerequisites:

  • Python 3.12 or newer
  • Git
  • (Optional) Docker for containerized deployment

Install steps:

  1. Clone the repository:

    git clone https://github.com/muah1987/Ollama-MCP-Server.git
    cd Ollama-MCP-Server

  2. (Recommended) Set up and activate a Python virtual environment:

    python -m venv venv

    macOS/Linux:

    source venv/bin/activate

    Windows:

    venv\Scripts\activate.bat

  3. Install the Python dependencies:

    pip install -r requirements.txt

    If the repository provides a dedicated setup script, follow its docs as well.

  4. Run the MCP server:

    python -m ollama_mcp_server

  5. Optional: run via the Makefile targets:

    make setup
    make dev

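Before running the steps above, a quick preflight check can catch the most common blockers (Python too old, git missing). A minimal sketch; the 3.12 version floor comes from the prerequisites listed above:

```python
import shutil
import sys

def preflight(min_version=(3, 12)) -> list:
    """Return a list of problems that would block the install steps above."""
    problems = []
    if sys.version_info[:2] < min_version:
        problems.append(
            f"Python {min_version[0]}.{min_version[1]}+ required, "
            f"found {sys.version_info.major}.{sys.version_info.minor}"
        )
    if shutil.which("git") is None:
        problems.append("git not found on PATH")
    return problems

# Print any blockers; an empty result means the prerequisites are met.
for problem in preflight():
    print("blocker:", problem)
```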
If you prefer Docker, follow the Docker deployment docs in the repository:

  • Build and run Docker images
  • Use docker-compose to start services

Notes:

  • Ensure your Python environment has network access for dependencies and for Ollama integration if used.
  • Review docs/ for detailed configuration and troubleshooting.

Additional notes

Tips and notes:

  • The server relies on Python modules under src/ollama_mcp_server; ensure the module is importable from your PYTHONPATH when running from a virtual environment.
  • Ollama integration enables multi-agent endpoints. Configure endpoint URLs or local Ollama instances as described in docs/ to leverage these agents.
  • The MCP Gateway functions provide orchestration across multiple servers. Use the gateway docs to set up multi-server workflows.
  • If using Docker, you can leverage production-ready multi-stage builds as described in the repository’s Docker docs. Ensure docker-compose.yml is aligned with your environment.
  • Common issues: missing dependencies, port conflicts, or misconfigured Ollama endpoints. Check docs/CI_DOCKER_SETUP.md and docs/GITHUB_SECRETS.md for secrets and CI guidance.
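To make the multi-agent endpoint idea above concrete, here is a hypothetical sketch of dispatching work across several configured Ollama endpoints in round-robin fashion. The class name and URLs are placeholders, not this server's actual configuration format; the real dispatch logic lives in the repository's docs/ and source:

```python
import itertools

class EndpointRouter:
    """Round-robin over a set of Ollama endpoint base URLs.

    A toy illustration of spreading requests across multiple agents;
    raises if constructed with no endpoints.
    """

    def __init__(self, endpoints):
        if not endpoints:
            raise ValueError("at least one endpoint is required")
        self._cycle = itertools.cycle(endpoints)

    def next_endpoint(self) -> str:
        """Return the next endpoint in rotation."""
        return next(self._cycle)

router = EndpointRouter([
    "http://localhost:11434",  # local Ollama instance
    "http://gpu-box:11434",    # hypothetical remote instance
])
print(router.next_endpoint())
```

Each call to `next_endpoint()` advances the rotation, so successive requests land on different endpoints.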
