ollama
MCP server from m-mehdi14/ollama-mcp-server
claude mcp add --transport stdio --env OLLAMA_BASE_URL="http://localhost:11434" \
  m-mehdi14-ollama-mcp-server \
  -- node /usr/local/lib/node_modules/@muhammadmehdi/ollama-mcp-server/dist/index.js
How to use
This MCP server provides a universal bridge between local Ollama instances and any MCP-compatible IDE or tool. It exposes a standardized set of MCP endpoints that let clients list models, inspect model information, chat with models, generate text, and manage models in your local Ollama installation. The server connects to your local Ollama instance (http://localhost:11434 by default) and presents Ollama's capabilities through the MCP interface, so editors such as Cursor IDE, Claude Desktop, and other MCP-enabled apps can interoperate without custom adapters. To use it, point your MCP client at the Ollama MCP server (the path shown in mcp_config) and make sure Ollama is running locally; most MCP-enabled IDEs will start and stop the server automatically as needed.
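For clients configured through a JSON file rather than a CLI, the same server can be declared declaratively. A minimal sketch of an mcpServers entry (the file name and location depend on the client; Claude Desktop, for instance, reads claude_desktop_config.json, and the /usr/local/lib path is an example that may differ on your system):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": [
        "/usr/local/lib/node_modules/@muhammadmehdi/ollama-mcp-server/dist/index.js"
      ],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}
```

The env block is where you would override OLLAMA_BASE_URL if your Ollama instance is not on the default port.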
How to install
Prerequisites:
- Node.js 18 or newer
- npm 8 or newer (comes with Node.js)
- Ollama installed and running locally (default URL http://localhost:11434)
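The Node.js requirement can be checked up front. A small sketch, assuming node is on your PATH (the sed/cut parsing is just one way to extract the major version):

```shell
# Extract the Node.js major version and compare it against the >= 18 requirement.
NODE_MAJOR="$(node --version 2>/dev/null | sed 's/^v//' | cut -d. -f1)"
if [ -n "$NODE_MAJOR" ] && [ "$NODE_MAJOR" -ge 18 ]; then
  echo "ok: Node.js major version $NODE_MAJOR meets the requirement"
else
  echo "error: need Node.js >= 18 (found: ${NODE_MAJOR:-none})"
fi
```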
Option A: Install globally (recommended for quick start)
- Install the MCP server globally:
npm install -g @muhammadmehdi/ollama-mcp-server
- Locate the installed entry point. The example configuration points at dist/index.js under the global node_modules directory (run npm root -g to print its location); the actual path varies by system.
- In your MCP client configuration, point to the global installation path (as shown in the README examples) and set OLLAMA_BASE_URL to http://localhost:11434.
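The entry-point path for a global install can be built from the global npm root. A sketch, where /usr/local/lib/node_modules is only a common default (verify yours with npm root -g):

```shell
# Build the path to the server's entry point from the global npm module root.
# /usr/local/lib/node_modules is a common default; confirm with: npm root -g
GLOBAL_ROOT="/usr/local/lib/node_modules"
SERVER_ENTRY="$GLOBAL_ROOT/@muhammadmehdi/ollama-mcp-server/dist/index.js"
echo "$SERVER_ENTRY"
```

This is the path to paste into your MCP client configuration as the argument to node.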
Option B: Install locally in a project
- Install locally:
npm install @muhammadmehdi/ollama-mcp-server
- Reference the server executable from your project (path will be inside node_modules/@muhammadmehdi/ollama-mcp-server/dist/index.js).
- Provide the same MCP server configuration in your client, and set the environment variable OLLAMA_BASE_URL as needed.
Additional notes
- Ensure Ollama is running before starting the MCP client configured with this server.
- If Ollama is not at the default URL, update OLLAMA_BASE_URL in the MCP server config accordingly.
- The MCP server relies on Ollama’s HTTP API (default base http://localhost:11434). Verify connectivity with curl http://localhost:11434/api/tags.
- The npm package name is @muhammadmehdi/ollama-mcp-server; you can use either a global or local installation depending on your workflow.
- When upgrading, recheck the dist/index.js path if your installation layout changes; update the mcp_config accordingly.
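The connectivity check from the notes above can be wrapped in a small pre-flight script. A minimal sketch, assuming curl is available; /api/tags is Ollama's model-listing endpoint, so a successful response confirms the instance is up:

```shell
# Pre-flight check: confirm an Ollama instance is reachable before
# starting an MCP client configured with this server.
OLLAMA_BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434}"

if curl -fsS --max-time 5 "$OLLAMA_BASE_URL/api/tags" >/dev/null 2>&1; then
  echo "ok: Ollama reachable at $OLLAMA_BASE_URL"
else
  echo "error: no Ollama at $OLLAMA_BASE_URL (is 'ollama serve' running?)"
fi
```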
Related MCP Servers
zen
Self-hosted notes app. Single Go binary, notes stored as markdown within SQLite, full-text search, very low resource usage
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
A Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.