Ollama
MCP-Server tools for Ollama
claude mcp add --transport stdio muah1987-ollama-mcp-server python -m ollama_mcp_server
How to use
This MCP server is a Python-based DevOps-focused MCP implementation with full Ollama integration. It exposes a collection of tools across categories (ollama, infrastructure, git, github, browser, and mcp_gateway) and supports multi-agent Ollama endpoints for parallel, specialized reasoning. To use it, clone the repository, install dependencies, and run the Python module entry point to start the MCP server. Once running, you can interact with the MCP gateway to dispatch tool actions, query GitHub via the included API tools, automate browser tasks with Playwright-powered capabilities, and orchestrate multiple Ollama-backed agents for scalable responses.
The server exposes its tools via a consistent MCP interface. You can request context, run commands, and manage state through the MCP gateway, which handles multi-server orchestration. When using Ollama integration, the server can route requests to specialized Ollama endpoints to leverage local or remote LLMs as needed. This makes it suitable for DevOps workflows, such as repository management, infrastructure queries, and browser automation tasks, all coordinated through MCP.
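MCP servers speak JSON-RPC 2.0 over their transport (stdio here), and tool invocations use the `tools/call` method. As a rough sketch of what a client sends when dispatching a tool action, the snippet below builds such a request; the tool name `ollama_generate` and its arguments are hypothetical placeholders, since the actual tool names come from the server's `tools/list` response.

```python
import json

def make_tools_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as used over the MCP stdio transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool name and arguments for illustration only.
msg = make_tools_call(1, "ollama_generate", {"model": "llama3", "prompt": "hello"})
print(msg)
```

In practice your MCP client (e.g. Claude Desktop) constructs these messages for you; the sketch only shows the wire shape a tool dispatch takes.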
How to install
Prerequisites:
- Python 3.12 or newer
- Git
- (Optional) Docker for containerized deployment
Install steps:
- Clone the repository:
  git clone https://github.com/muah1987/Ollama-MCP-Server.git
  cd Ollama-MCP-Server
- (Recommended) Set up and activate a Python virtual environment:
  python -m venv venv
  source venv/bin/activate        # macOS/Linux
  venv\Scripts\activate.bat       # Windows
- Install Python dependencies:
  pip install -r requirements.txt
  If the repository provides a dedicated setup script, follow its documentation as well.
- Run the MCP server:
  python -m ollama_mcp_server
- Optional: run via Makefile targets:
  make setup
  make dev
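Before starting the server, it can help to check that the package is importable from the active environment. A minimal sketch, assuming the module name matches the run command above (`ollama_mcp_server`):

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if the named module can be found in the current environment."""
    return importlib.util.find_spec(name) is not None

# Module name taken from the `python -m ollama_mcp_server` run command; adjust if the repo differs.
if module_available("ollama_mcp_server"):
    print("ollama_mcp_server is importable; `python -m ollama_mcp_server` should start the server")
else:
    print("ollama_mcp_server not found; check your PYTHONPATH or reinstall dependencies")
```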
If you prefer Docker, follow the Docker deployment docs in the repository:
- Build and run Docker images
- Use docker-compose to start services
Notes:
- Ensure your Python environment has network access for dependencies and for Ollama integration if used.
- Review docs/ for detailed configuration and troubleshooting.
Additional notes
- The server relies on Python modules under src/ollama_mcp_server; ensure the module is importable from your PYTHONPATH when running from a virtual environment.
- Ollama integration enables multi-agent endpoints. Configure endpoint URLs or local Ollama instances as described in docs/ to leverage these agents.
- The MCP Gateway functions provide orchestration across multiple servers. Use the gateway docs to set up multi-server workflows.
- If using Docker, you can leverage production-ready multi-stage builds as described in the repository’s Docker docs. Ensure docker-compose.yml is aligned with your environment.
- Common issues: missing dependencies, port conflicts, or misconfigured Ollama endpoints. Check docs/CI_DOCKER_SETUP.md and docs/GITHUB_SECRETS.md for secrets and CI guidance.
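For the multi-agent setup mentioned above, one simple way to spread requests across several Ollama endpoints is round-robin routing. The sketch below is illustrative, not the server's actual implementation; the endpoint URLs are hypothetical (11434 is Ollama's default port), and real endpoints should be configured as described in docs/.

```python
import itertools

class OllamaEndpointRouter:
    """Round-robin router over a fixed list of Ollama endpoint base URLs."""

    def __init__(self, endpoints: list[str]):
        # itertools.cycle yields the endpoints in order, repeating forever.
        self._cycle = itertools.cycle(endpoints)

    def next_endpoint(self) -> str:
        """Return the base URL of the next endpoint in rotation."""
        return next(self._cycle)

router = OllamaEndpointRouter([
    "http://localhost:11434",   # local Ollama instance (default port)
    "http://gpu-box:11434",     # hypothetical remote endpoint
])
# A generate request would then be POSTed to f"{router.next_endpoint()}/api/generate"
# with a JSON body such as {"model": "...", "prompt": "..."} (Ollama's generate API).
```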
Related MCP Servers
zen
Self-hosted notes app. Single Go binary; notes stored as Markdown within SQLite; full-text search; very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
This is a Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.