# ollama-mcp-bridge
Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools
```shell
claude mcp add --transport stdio patruff-ollama-mcp-bridge node path/to/server-flux/dist/index.js \
  --env REPLICATE_API_TOKEN="your_replicate_token_here"
```
## How to use
This MCP bridge connects local LLMs (via Ollama) to multiple MCP servers that provide capabilities such as filesystem manipulation, web search, GitHub interactions, memory storage, image generation, and Gmail/Drive integration. The bridge translates between the LLM's JSON-RPC tool calls and the MCP servers' endpoints, letting an Ollama-hosted model invoke tools in a structured, Claude-like manner. To use it, start the bridge, make sure the MCP servers are running and properly authenticated, and then send prompts to the LLM. The bridge detects tool requests, routes each one to the appropriate MCP server based on prompt context and tool permissions, and returns structured results that the LLM can present to the user.
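When the model decides to use a tool, the bridge forwards the request to the matching MCP server as a JSON-RPC `tools/call` message, per the Model Context Protocol. A minimal sketch of such a request (the tool name and arguments here are illustrative, not taken from this project):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "brave_web_search",
    "arguments": { "query": "latest Node.js LTS release" }
  }
}
```

The server's response carries the tool output back to the bridge, which hands it to the model as structured content.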
## How to install
Prerequisites:
- Node.js v14+ installed
- Ollama installed and a compatible LLM pulled
- npm installed
- Install the MCP servers globally (example shown for each tool):

```shell
npm install -g @modelcontextprotocol/server-filesystem
npm install -g @modelcontextprotocol/server-brave-search
npm install -g @modelcontextprotocol/server-github
npm install -g @modelcontextprotocol/server-memory
npm install -g @patruff/server-flux
npm install -g @patruff/server-gmail-drive
```
- Install and run the Ollama bridge project (if you have access to the repository):

```shell
git clone https://github.com/your-org/ollama-llm-mcp-bridge.git
cd ollama-llm-mcp-bridge
npm install
```
- Configure credentials (as described in the README):
- Brave: set BRAVE_API_KEY
- GitHub: set GITHUB_PERSONAL_ACCESS_TOKEN
- Flux: set REPLICATE_API_TOKEN
- Gmail/Drive: run the auth flow, e.g. `node path/to/gmail-drive/index.js auth` (other servers that need OAuth document a similar step)
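The API credentials above can be exported as environment variables before starting the bridge (values shown are placeholders, not real tokens):

```shell
# Placeholder values -- replace with your real credentials,
# or load them from a .env file / secret manager instead.
export BRAVE_API_KEY="your_brave_api_key"
export GITHUB_PERSONAL_ACCESS_TOKEN="your_github_pat"
export REPLICATE_API_TOKEN="your_replicate_token"
```

Exporting in the shell that launches the bridge ensures the spawned MCP server processes inherit the variables.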
- Create the bridge configuration file (bridge_config.json) with MCP server definitions and LLM settings as shown in the README, then start the bridge:

```shell
npm run start
```
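As a rough sketch, a bridge_config.json might look like the following; the field names follow common MCP server-registry conventions and the model name is an example, so treat the project's README as authoritative:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "/path/to/workspace"]
    },
    "brave-search": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "your_brave_api_key" }
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct",
    "baseUrl": "http://localhost:11434"
  }
}
```

Each entry under `mcpServers` tells the bridge how to spawn one server over stdio, while the `llm` block points at the local Ollama endpoint.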
- Start Ollama with the desired model as described in the project documentation and test tool calls.
## Additional notes
Tips:
- Ensure environment variables for API tokens are kept secure. Consider using a .env file or secret manager where appropriate.
- Validate MCP server permissions to avoid unintended tool access.
- If a tool is not responding, check the corresponding MCP server log and ensure the server is reachable from the bridge.
- For local deployments, ensure file system paths (WORKSPACE_ROOT, etc.) are correctly mounted and accessible by Node processes.
- The bridge supports dynamic tool routing; prompts containing specific keywords or patterns may automatically route to the appropriate MCP (e.g., email-like prompts to Gmail, drive-related prompts to Drive, etc.).