mcp-aoai-web-browsing
A minimal Model Context Protocol 🖥️ server/client 🧑‍💻 with Azure OpenAI and 🌐 web browser control via Playwright.
claude mcp add --transport stdio kimtth-mcp-aoai-web-browsing uv run chatgui.py \
  --env AZURE_OPEN_AI_API_KEY="Azure OpenAI API key" \
  --env AZURE_OPEN_AI_ENDPOINT="Azure OpenAI endpoint URL" \
  --env AZURE_OPEN_AI_API_VERSION="API version (optional)" \
  --env AZURE_OPEN_AI_DEPLOYMENT_MODEL="Deployment model name"
How to use
This MCP server provides a bridge to Azure OpenAI, exposing a web-browsing-capable toolset via the MCP protocol. It uses Playwright under the hood to navigate web pages and interfaces with Azure OpenAI to produce function-call-like outputs compatible with OpenAI's function calling format. The server is implemented with FastMCP and integrates an MCP-LLM bridge so that MCP responses describing tools can be consumed by an OpenAI-compatible LLM. To use it, run the server via UV in Python, configure your Azure OpenAI credentials, and invoke the exposed MCP tools (for example, navigating to a URL through the provided Playwright-based tool). This setup enables secure, local browsing interactions driven by an LLM.
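The bridge's core job, converting an MCP tool description into OpenAI's function-calling tool format, can be sketched as follows. This is a minimal illustration, not the repository's actual code; the helper name and the `navigate_to_url` tool are hypothetical stand-ins for the Playwright-based tool the server exposes.

```python
def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Map an MCP tool description to OpenAI's function-calling format.

    MCP tools carry `name`, `description`, and a JSON Schema `inputSchema`;
    OpenAI's tool format nests the same fields under `function`, with the
    schema renamed to `parameters`.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# A hypothetical Playwright-backed navigation tool, as MCP would describe it
mcp_tool = {
    "name": "navigate_to_url",
    "description": "Open a web page and return its title.",
    "inputSchema": {
        "type": "object",
        "properties": {"url": {"type": "string"}},
        "required": ["url"],
    },
}

openai_tool = mcp_tool_to_openai_function(mcp_tool)
```

The resulting dictionary can be passed in the `tools` list of a chat-completions request, so the LLM sees the browser tool in the structure it expects.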
How to install
Prerequisites:
- Python installed on your system
- UV installed for Python dependency management (pip install uv)
- Azure OpenAI resources (endpoint, API key, deployment model) prior to running
Step-by-step:
- Clone or download the MCP server repository.
- Rename the example environment template if present and create a real .env file, or directly set the environment variables (see env placeholders in this guide):
  AZURE_OPEN_AI_ENDPOINT=
  AZURE_OPEN_AI_API_KEY=
  AZURE_OPEN_AI_DEPLOYMENT_MODEL=
  AZURE_OPEN_AI_API_VERSION=
- Install UV and sync dependencies:
  pip install uv
  uv sync
- Start the MCP server using UV (this runs the specified Python script):
  uv run chatgui.py
- Ensure the server starts successfully and the client UI is accessible. If you see browser-related errors, you may need to install or configure Playwright dependencies as prompted by UV/Playwright integration.
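A quick way to catch misconfiguration before launching is to validate the four settings in Python. The helper below is an illustrative sketch, not part of the repository; it fails fast when a required variable is unset, treating the API version as optional per the steps above.

```python
import os

# The three settings this setup treats as mandatory
REQUIRED = (
    "AZURE_OPEN_AI_ENDPOINT",
    "AZURE_OPEN_AI_API_KEY",
    "AZURE_OPEN_AI_DEPLOYMENT_MODEL",
)


def load_azure_config() -> dict:
    """Collect Azure OpenAI settings from the environment, erroring on gaps."""
    missing = [name for name in REQUIRED if not os.environ.get(name)]
    if missing:
        raise RuntimeError("Missing required settings: " + ", ".join(missing))
    config = {name: os.environ[name] for name in REQUIRED}
    # API version is optional here; an empty value means "use the default"
    config["AZURE_OPEN_AI_API_VERSION"] = os.environ.get(
        "AZURE_OPEN_AI_API_VERSION", ""
    )
    return config
```

Calling `load_azure_config()` with any required variable unset raises immediately, which is easier to diagnose than an authentication failure deep inside a request.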
Notes:
- The README describes running via uv run chatgui.py; adjust the script name if your entrypoint differs.
- Keep Azure credentials secure and do not commit them to source control.
Additional notes
Tips and common notes:
- Make sure to rename or create the .env file from the template and fill in Azure OpenAI details before starting the server.
- The MCP bridge converts MCP tool descriptions to OpenAI function calling formats, so ensure your LLM prompts expect this structure.
- If the Python process needs to be restarted, you can use the provided taskkill command for Python on Windows: taskkill /IM python.exe /F
- Playwright is used for browser automation; ensure your environment supports launching a browser (headless or headed) depending on your debugging needs.
- If you modify dependencies, run uv sync to install any new requirements.
- The server relies on the MCP protocol; verify your client configuration aligns with MCP tool outputs (names, descriptions, and input schemas).
- If you encounter environment issues, check that the Azure API version and deployment model match what your Azure OpenAI resource expects.
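To check that client configuration aligns with the server's tool outputs, one can verify that a model's emitted function call names a declared tool and supplies its required arguments. This is a stdlib-only sketch with a hypothetical tool registry; real clients would run full JSON Schema validation instead.

```python
import json


def check_call(tools: dict, name: str, arguments_json: str) -> list:
    """Return a list of problems with a function call against declared tools."""
    tool = tools.get(name)
    if tool is None:
        return [f"unknown tool: {name}"]
    args = json.loads(arguments_json)
    schema = tool.get("inputSchema", {})
    # Only the `required` list is checked here; types are left to real validation
    return [
        f"missing required argument: {field}"
        for field in schema.get("required", [])
        if field not in args
    ]


# Hypothetical registry with one Playwright-style navigation tool
tools = {
    "navigate_to_url": {
        "description": "Open a web page.",
        "inputSchema": {
            "type": "object",
            "properties": {"url": {"type": "string"}},
            "required": ["url"],
        },
    }
}
```

An empty result means the call is well-formed; any mismatch between the names or schemas the client expects and what the server declares shows up as an explicit problem string.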
Related MCP Servers
pluggedin-app
The Crossroads for AI Data Exchanges. A unified, self-hostable web interface for discovering, configuring, and managing Model Context Protocol (MCP) servers—bringing together AI tools, workspaces, prompts, and logs from multiple MCP sources (Claude, Cursor, etc.) under one roof.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Context Protocol (MCP).
MCPSecBench
MCPSecBench: A Systematic Security Benchmark and Playground for Testing Model Context Protocols
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients
mcp-community
Easily run, deploy, and connect to MCP servers