
mcp-server-browser-use-ollama

MCP server from Cam10001110101/mcp-server-browser-use-ollama

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio cam10001110101-mcp-server-browser-use-ollama python /path/to/src/server.py \
  --env OLLAMA_HOST="http://localhost:11434" \
  --env OLLAMA_MODEL="qwen3"

OLLAMA_HOST sets the Ollama API endpoint (default: http://localhost:11434) and OLLAMA_MODEL selects the model to use (default: qwen3).
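The two environment variables above configure the server at startup. A minimal sketch of how server.py might resolve them, assuming it falls back to the documented defaults (the helper name `load_ollama_config` is illustrative, not the server's actual API):

```python
import os

def load_ollama_config() -> dict:
    """Resolve Ollama settings from the environment, with the documented defaults."""
    return {
        "host": os.environ.get("OLLAMA_HOST", "http://localhost:11434"),
        "model": os.environ.get("OLLAMA_MODEL", "qwen3"),
    }

config = load_ollama_config()
print(config["host"], config["model"])
```

Setting either variable before launch overrides the corresponding default, which is all the `--env` flags in the install command do.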

How to use

This MCP server provides a browser automation interface powered by Ollama-hosted local models. It exposes a set of browser control tools via MCP that let AI agents drive Playwright to open pages, click elements, type text, scroll, extract data, take screenshots, and manage multiple sessions. The system supports interactive dialogue with Ollama for dynamic decision-making and can operate in two modes:

  • Direct MCP integration, for Claude Desktop or similar MCP client environments.
  • Ollama-driven automation, where you issue natural-language tasks and the client/server pair translates them into browser actions.

Use the example commands to start a session, then send tasks through the client to control a headless or visible browser session.
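An MCP server routes each incoming tool call by name to a handler. A minimal sketch of that dispatch pattern with stub handlers standing in for the real Playwright-backed tools (the tool names and handler signatures here are illustrative assumptions, not the server's actual API):

```python
from typing import Callable, Dict

# Stub handlers; the real server would drive a Playwright browser here.
def navigate(url: str) -> str:
    return f"navigated to {url}"

def click(selector: str) -> str:
    return f"clicked {selector}"

def type_text(selector: str, text: str) -> str:
    return f"typed {text!r} into {selector}"

# Registry mapping MCP tool names to their handlers.
TOOLS: Dict[str, Callable[..., str]] = {
    "navigate": navigate,
    "click": click,
    "type_text": type_text,
}

def dispatch(tool: str, **kwargs) -> str:
    """Route one tool call by name, as an MCP server does per request."""
    if tool not in TOOLS:
        raise ValueError(f"unknown tool: {tool}")
    return TOOLS[tool](**kwargs)

print(dispatch("navigate", url="https://example.com"))
```

In Ollama-driven mode, the client's job is essentially to turn a natural-language task into a sequence of such dispatch calls.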

How to install

Prerequisites:

  • Python 3.8+
  • Ollama installed and running
  • uv package manager (recommended) or pip

Installation steps:

  1. Clone the repository:
     git clone https://github.com/Cam10001110101/mcp-server-browser-use-ollama
     cd mcp-server-browser-use-ollama

  2. Install in editable mode with uv (recommended), then install the Playwright browsers:
     uv pip install -e .
     playwright install

  3. Start Ollama and pull a model (in separate terminals):
     ollama serve
     ollama pull qwen3

  4. Run tests (optional):
     pytest tests/test_server_mcp.py -v

Notes:

  • Ensure Ollama is running and accessible at the default host/port or set OLLAMA_HOST accordingly.
  • The server.py script is the MCP server entry point for browser automation via Playwright.
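Before launching the server, it can be worth confirming that Ollama is actually reachable at the configured host. A small stdlib-only check, assuming only that Ollama answers HTTP GET on its root endpoint when running (the helper name is illustrative):

```python
import urllib.request
import urllib.error

def ollama_reachable(host: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if a server answers with HTTP 200 at `host`.

    Any connection error means Ollama is down or OLLAMA_HOST points
    at the wrong host/port.
    """
    try:
        with urllib.request.urlopen(host, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())  # False unless Ollama is running locally
```

If this returns False, fix OLLAMA_HOST (or start `ollama serve`) before troubleshooting anything else.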

Additional notes

Tips and common issues:

  • If you encounter connection errors to Ollama, verify OLLAMA_HOST and that Ollama is listening on the specified port.
  • For deterministic outputs, set temperature=0 in your prompt or model configuration as supported by your client integration.
  • Browser sessions are cleaned up automatically on shutdown, but close sessions explicitly when you are done so resources are freed promptly.
  • You can switch between different Ollama models by changing the OLLAMA_MODEL environment variable.
  • Ensure Playwright browsers are installed (playwright install) when running locally.
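The session-cleanup tip above is a standard resource-management pattern. A sketch using a context manager, with a stub class standing in for a real Playwright browser session (both names are hypothetical, not the server's API):

```python
from contextlib import contextmanager

class Session:
    """Stand-in for a browser session; the real one wraps Playwright."""
    def __init__(self):
        self.open = True

    def close(self):
        self.open = False

@contextmanager
def browser_session():
    """Guarantee cleanup even if an automation step raises."""
    session = Session()
    try:
        yield session
    finally:
        session.close()

with browser_session() as s:
    assert s.open  # session usable inside the block
# session.close() has run here, freeing resources promptly
```

Structuring client code this way means a crashed task never leaves a browser process dangling.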
