mcp-ollama-agent

A TypeScript example showcasing the integration of Ollama with Model Context Protocol (MCP) servers. This project provides an interactive command-line interface for an AI agent that can use tools from multiple MCP servers.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ausboss-mcp-ollama-agent npx @modelcontextprotocol/server-filesystem ./

How to use

This MCP server integrates Ollama with MCP-enabled tools to provide a unified AI assistant experience. It wires up two built-in tool providers, a filesystem tool and a web research tool, both exposed as MCP servers that compatible LLMs can invoke via function calling. The filesystem tool lets the model inspect directories, list files, read and write content, and perform basic filesystem operations. The web research tool lets the model perform targeted searches and gather information from the web through the MCP interface. The Ollama integration runs a local Ollama model that can call these tools during chat. You can also run a standalone demo mode to test the tools without an LLM, or start the chat interface to interact with an Ollama-backed model in real time.
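The bridge between Ollama and the MCP servers rests on function calling: each tool a server advertises is translated into the JSON-schema tool format that Ollama's chat API accepts. A minimal sketch of that translation, assuming simplified tool shapes (the interfaces and the `read_file` example here are illustrative, not the project's actual code):

```typescript
// Simplified shape of a tool definition as advertised by an MCP server.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>; // JSON schema for the tool's arguments
}

// Simplified shape of a tool entry in Ollama's chat API `tools` array.
interface OllamaTool {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>;
  };
}

// Map an MCP tool definition onto Ollama's function-calling format.
function mcpToolToOllama(tool: McpTool): OllamaTool {
  return {
    type: "function",
    function: {
      name: tool.name,
      description: tool.description ?? "",
      parameters: tool.inputSchema,
    },
  };
}

// Example: a filesystem server tool for reading a file (shape assumed here).
const readFile: McpTool = {
  name: "read_file",
  description: "Read the contents of a file",
  inputSchema: {
    type: "object",
    properties: { path: { type: "string" } },
    required: ["path"],
  },
};

const ollamaTool = mcpToolToOllama(readFile);
```

When the model responds with a tool call, the agent routes it back to the matching MCP server and feeds the result into the conversation, which is what makes dynamic tool usage during chat possible.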

To use it, make sure the MCP servers defined in mcp-config.json are available and that Ollama is running at the configured host. If you want to test the tooling without an LLM, run the demo script to exercise the filesystem and web research tools. For full interaction, launch the chat interface and converse with the model, which can call the configured tools to list directories, read files, and research topics on the web.

How to install

Prerequisites:

  • Node.js 18+ (install from https://nodejs.org)
  • Ollama installed and running (ensure ollama is accessible at the configured host)
  • Global npm tools for the MCP components you plan to use (examples shown below)

Steps:

  1. Clone the repository and install dependencies
git clone https://github.com/ausboss/mcp-ollama-agent.git
cd mcp-ollama-agent
npm install
  2. Install MCP tools globally (filesystem and web research)
# For filesystem operations
npm install -g @modelcontextprotocol/server-filesystem

# For web research
npm install -g @mzxrai/mcp-webresearch
  3. Configure mcp-config.json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "./"]
    },
    "webresearch": {
      "command": "npx",
      "args": ["-y", "@mzxrai/mcp-webresearch"]
    }
  },
  "ollama": {
    "host": "http://localhost:11434",
    "model": "qwen2.5:latest"
  }
}
  4. Run the demo to test the filesystem and webresearch tools without an LLM
npx tsx ./src/demo.ts
  5. Start the chat interface with Ollama
npm start

Note: Adjust the Ollama host/model in mcp-config.json if your setup differs.
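Under the hood, each entry in mcp-config.json's mcpServers section boils down to spawning the given command as a stdio child process. A sketch of how the config above maps to launch commands (the `launchSpec` helper is illustrative, not part of the project):

```typescript
// Shapes mirroring mcp-config.json as shown in step 3.
interface ServerEntry {
  command: string;
  args: string[];
}

interface McpConfig {
  mcpServers: Record<string, ServerEntry>;
  ollama: { host: string; model: string };
}

// The same values as the example config above.
const config: McpConfig = {
  mcpServers: {
    filesystem: {
      command: "npx",
      args: ["@modelcontextprotocol/server-filesystem", "./"],
    },
    webresearch: {
      command: "npx",
      args: ["-y", "@mzxrai/mcp-webresearch"],
    },
  },
  ollama: { host: "http://localhost:11434", model: "qwen2.5:latest" },
};

// Each named server becomes one stdio child process: `command args...`.
function launchSpec(name: string, cfg: McpConfig): string {
  const { command, args } = cfg.mcpServers[name];
  return [command, ...args].join(" ");
}
```

Because the servers are launched via npx, the globally installed packages from step 2 (or an on-demand download with `-y`) are what actually run.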

Additional notes

Tips and common issues:

  • Ensure Ollama is running and accessible at the host URL configured in mcp-config.json; the model must support function calling for tool integration.
  • If you modify the tool set, update mcp-config.json accordingly and restart the MCP server.
  • MCP servers can be launched via either uvx (Python) or npx (Node.js); this example uses npx, but you can mix and match as supported by your environment.
  • The standalone demo mode is useful for verifying tool behavior without involving an LLM.
  • For environment-specific paths or authentication (e.g., restricted web access), set environment variables in a deployment-friendly way and reference them in mcp-config.json if needed.
  • When upgrading tools or models, verify compatibility with the MCP framework and the function calling interface.

Environment variables you might consider:

  • OLLAMA_HOST: Override the Ollama host URL
  • OLLAMA_MODEL: Override the Ollama model name
  • TOOL_TIMEOUT_MS: Adjust per-tool timeouts
  • LOG_LEVEL: Set logging verbosity
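A small sketch of how the Ollama-related overrides above might be applied on top of the values from mcp-config.json (`applyEnvOverrides` is a hypothetical helper, not part of the project):

```typescript
interface OllamaConfig {
  host: string;
  model: string;
}

// Environment variables take precedence; config-file values are the defaults.
function applyEnvOverrides(
  base: OllamaConfig,
  env: Record<string, string | undefined> = process.env,
): OllamaConfig {
  return {
    host: env.OLLAMA_HOST ?? base.host,
    model: env.OLLAMA_MODEL ?? base.model,
  };
}

// The defaults from the example mcp-config.json above.
const fromConfig: OllamaConfig = {
  host: "http://localhost:11434",
  model: "qwen2.5:latest",
};

// Only OLLAMA_MODEL is set, so the host falls back to the config value.
const resolved = applyEnvOverrides(fromConfig, { OLLAMA_MODEL: "llama3.1:8b" });
```

Keeping overrides in environment variables avoids editing mcp-config.json per deployment, which fits the deployment-friendly approach suggested in the tips above.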
