mcp-browser-use-ollama
MCP server from Cam10001110101/mcp-server-browser-use-ollama
claude mcp add --transport stdio cam10001110101-mcp-server-browser-use-ollama python /path/to/src/server.py \
  --env OLLAMA_HOST=http://localhost:11434 \
  --env OLLAMA_MODEL=qwen3

OLLAMA_HOST sets the Ollama API endpoint (default: http://localhost:11434); OLLAMA_MODEL selects the Ollama model (default: qwen3).
How to use
This MCP server provides a browser automation interface powered by Ollama-hosted local models. It exposes a set of browser control tools via MCP that let AI agents drive Playwright to open pages, click elements, type text, scroll, extract data, take screenshots, and manage multiple sessions. The system supports interactive dialogue with Ollama for dynamic decision-making and can operate in two modes: direct MCP integration (for Claude Desktop or similar environments) or Ollama-driven automation where you issue natural language tasks and the client/server pair translates them into browser actions. Use the example commands to start a session and then send tasks through the client to control a headless or visible browser session.
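In direct MCP mode, the client drives the server with standard MCP JSON-RPC messages over stdio. As an illustration only (the tool name and arguments below are hypothetical — the actual tool set is defined in src/server.py), a `tools/call` request for a navigation action would be shaped like this:

```python
import json

# Build an MCP tools/call request (MCP uses JSON-RPC 2.0 framing).
# NOTE: "navigate" and its arguments are hypothetical examples;
# consult src/server.py for the tools this server actually exposes.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "navigate",
        "arguments": {"url": "https://example.com"},
    },
}
print(json.dumps(request, indent=2))
```

In practice an MCP client library (or Claude Desktop) constructs and frames these messages for you; this only shows what travels over the wire.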
How to install
Prerequisites:
- Python 3.8+
- Ollama installed and running
- uv package manager (recommended) or pip
Installation steps:
1. Clone the repository:
   git clone https://github.com/Cam10001110101/mcp-server-browser-use-ollama
   cd mcp-server-browser-use-ollama
2. Install in editable mode with uv (recommended) and install the Playwright browsers:
   uv pip install -e .
   playwright install
3. Start Ollama and pull a model (in separate terminals):
   ollama serve
   ollama pull qwen3
4. Run tests (optional):
   pytest tests/test_server_mcp.py -v
Notes:
- Ensure Ollama is running and accessible at the default host/port or set OLLAMA_HOST accordingly.
- The server.py script is the MCP server entry point for browser automation via Playwright.
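To verify that Ollama is reachable before starting the server, a stdlib-only check against the configured endpoint can help. This is a sketch, not part of the repository; it assumes the documented OLLAMA_HOST default and uses Ollama's /api/version endpoint:

```python
import os
import urllib.error
import urllib.request

def ollama_reachable(host=None, timeout=2.0):
    """Return True if the Ollama HTTP API answers at the given host."""
    host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    try:
        # Ollama exposes its version at GET /api/version.
        with urllib.request.urlopen(f"{host}/api/version", timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

print(ollama_reachable())
```

If this prints False, fix OLLAMA_HOST (or start `ollama serve`) before launching server.py.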
Additional notes
Tips and common issues:
- If you encounter connection errors to Ollama, verify OLLAMA_HOST and that Ollama is listening on the specified port.
- For deterministic outputs, set temperature=0 in your prompt or model configuration as supported by your client integration.
- Browser sessions are automatically cleaned up; ensure you close sessions explicitly when done to free resources.
- You can switch between different Ollama models by changing the OLLAMA_MODEL environment variable.
- Ensure Playwright browsers are installed (playwright install) when running locally.
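Switching models via OLLAMA_MODEL works because the server reads its configuration from the environment. A minimal sketch of that pattern (variable names match the README; defaults are the documented ones):

```python
import os

# Defaults mirror those documented above; override via the environment,
# e.g. OLLAMA_MODEL=llama3 python src/server.py
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
OLLAMA_MODEL = os.environ.get("OLLAMA_MODEL", "qwen3")

print(f"Using model {OLLAMA_MODEL!r} at {OLLAMA_HOST}")
```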