mcp-ollama
MCP server for connecting Claude Desktop to Ollama LLM server
claude mcp add --transport stdio vincentf305-mcp-server-ollama --env PYTHONPATH="path-to-mcp-server" -- python -m src.mcp_server.server
How to use
This MCP server bridges Claude Desktop with the Ollama LLM server by exposing a Model Context Protocol endpoint that Claude can query. The server runs as a Python module entry point (src.mcp_server.server) and requires an Ollama backend to be available locally. Once started, Claude Desktop can send MCP requests to the configured endpoint to generate text or reason over prompts via Ollama, integrating Claude's UI with the Ollama runtime. The PYTHONPATH environment variable should point to the root of the MCP server repository so that the module import resolves correctly.
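The request path from the server to Ollama can be sketched with the standard library alone. This is a minimal sketch, not the repository's actual code: it assumes Ollama's default address (http://localhost:11434) and its /api/generate endpoint, and the model name "llama3" is a placeholder for whatever model you have pulled.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default port


def build_generate_request(prompt: str, model: str = "llama3") -> dict:
    # Payload shape for Ollama's /api/generate endpoint;
    # "stream": False requests a single JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama3", host: str = OLLAMA_HOST) -> str:
    payload = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the generated text under the "response" key.
        return json.loads(resp.read())["response"]
```

With Ollama running and a model pulled, `generate("Why is the sky blue?")` returns the model's completion as a string.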
To use it, start the server using the provided configuration (with Ollama running locally). In Claude Desktop, configure the MCP connection using the example mcpServers["ollama-server"] entry, then point Claude at the server's address. You can then issue standard MCP requests to send prompts and fetch responses, including streaming or token-by-token results depending on the server implementation.
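A Claude Desktop configuration entry along these lines (in claude_desktop_config.json) would match the mcpServers["ollama-server"] reference above. The structure follows Claude Desktop's MCP config format; the PYTHONPATH value is the same placeholder used elsewhere in this README:

```json
{
  "mcpServers": {
    "ollama-server": {
      "command": "python",
      "args": ["-m", "src.mcp_server.server"],
      "env": {
        "PYTHONPATH": "path-to-mcp-server"
      }
    }
  }
}
```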
How to install
Prerequisites:
- Python 3.8+ installed on your machine
- Git to clone the repository
- Ollama installed and running locally
Installation steps:
- Clone the MCP Ollama server repository:
  git clone <repository-url>
  cd <repository-root>
- Copy the example environment file and configure it as needed:
  cp .env.example .env
  Edit .env to set any required environment values.
- Install Python dependencies:
  python -m pip install --upgrade pip
  pip install -r requirements.txt
- Ensure Ollama is running and that its endpoint is accessible.
- Start the MCP server using the command from the mcp_config example:
  python -m src.mcp_server.server
- In Claude Desktop, configure the MCP connection to point to the running server as described in the README.
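Before starting the server, you can confirm that the Ollama endpoint is reachable with a few lines of stdlib Python. This sketch assumes Ollama's default address and its /api/tags endpoint, and honors the OLLAMA_HOST environment variable that the Ollama CLI itself uses; it is not part of the repository's code.

```python
import os
import urllib.error
import urllib.request
from typing import Optional


def ollama_reachable(host: Optional[str] = None, timeout: float = 2.0) -> bool:
    # Fall back to Ollama's default port 11434 when OLLAMA_HOST is unset.
    host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    try:
        # /api/tags lists installed models; a 200 response means the server is up.
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, start Ollama (or fix the host/port) before launching the MCP server.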
Additional notes
Environment and configuration tips:
- The server relies on PYTHONPATH to locate the module path; set path-to-mcp-server to the root of this repository when configuring Claude Desktop.
- If Ollama requires specific host/port settings, adjust the underlying Ollama connection within the server code or via environment variables as needed.
- Ensure that dependencies in requirements.txt are met; consider using a virtual environment to isolate this MCP server’s Python environment.
- If you encounter import errors, verify that the working directory and PYTHONPATH align with the module structure (src.mcp_server.server).
- For debugging, run the server in a console to observe logs and confirm that MCP requests from Claude Desktop are being received and processed.
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP