open-webui
Simple MCP server to access your knowledge in Open WebUI
claude mcp add --transport stdio ronasit-open-webui-mcp-server python mcp_server.py \
  --env MCP_HTTP_PORT="8001" \
  --env MCP_TRANSPORT="stdio" \
  --env MCP_CORS_ORIGINS="" \
  --env OPEN_WEBUI_API_URL="https://your-open-webui-instance.com/api/v1" \
  --env OPEN_WEBUI_API_TOKEN="sk-your-token-here" \
  --env MCP_RATE_LIMIT_HEALTH="10/minute" \
  --env MCP_RATE_LIMIT_PER_IP="1000/minute" \
  --env MCP_RATE_LIMIT_PER_TOKEN="1000/minute"
How to use
This MCP server exposes Open WebUI Knowledge Bases as MCP tools, enabling clients like Cursor and Claude Desktop to search and access knowledge bases through a standard MCP interface. It supports two transport modes: stdio for local usage and HTTP for remote access.
Available tools:
- list_knowledge_bases: enumerate available knowledge bases
- search_knowledge_base: perform a semantic search within a knowledge base
- get_knowledge_base_info: fetch metadata about a knowledge base
Configure the server to point at your Open WebUI instance using the required API URL and API token, then run it in your preferred transport mode. In stdio mode you can connect from Cursor using the provided mcp.json configuration; in HTTP mode the server exposes an MCP endpoint at /mcp and a health check at /health. Set the environment variables so the server can authenticate to Open WebUI and apply rate limits and CORS as needed.
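As a sketch, an MCP client invokes these tools through the standard MCP tools/call request. The argument names below (knowledge_base_id, query) are illustrative assumptions; check the server's tool schemas for the exact parameters:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_knowledge_base",
    "arguments": {
      "knowledge_base_id": "kb-123",
      "query": "deployment checklist"
    }
  }
}
```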
How to install
Prerequisites:
- Python 3.8+ (or Docker as an alternative deployment)
- Access to an Open WebUI instance with API access
- A valid Open WebUI API token
Installation steps (local Python):
# Install Python dependencies
pip install -r requirements.txt
# Optional: install uv (for uvx usage as shown in docs)
pip install uv # or: brew install uv
Run in stdio mode (local):
export OPEN_WEBUI_API_URL="https://your-open-webui-instance.com/api/v1"
export OPEN_WEBUI_API_TOKEN="sk-your-token-here"
export MCP_TRANSPORT="stdio"
python mcp_server.py
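Before wiring the server into a client, it can help to sanity-check that your API URL and token actually reach Open WebUI. A minimal sketch, assuming a knowledge-listing endpoint under the API URL (the exact path may differ between Open WebUI versions):

```python
import os
import urllib.request


def build_headers(token: str) -> dict:
    """Open WebUI authenticates with a standard Bearer token."""
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}


def check_connection(api_url: str, token: str) -> int:
    """Return the HTTP status of a simple authenticated request.

    The /knowledge/ path is an assumption; verify against your instance.
    """
    req = urllib.request.Request(
        f"{api_url.rstrip('/')}/knowledge/", headers=build_headers(token)
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status


if __name__ == "__main__":
    status = check_connection(
        os.environ["OPEN_WEBUI_API_URL"], os.environ["OPEN_WEBUI_API_TOKEN"]
    )
    print("OK" if status == 200 else f"Unexpected status: {status}")
```

If this fails, fix connectivity or credentials before debugging the MCP layer itself.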
Run in HTTP mode (production) using Docker Compose as in the docs:
# Create .env file with your Open WebUI credentials
echo "OPEN_WEBUI_API_URL=https://your-open-webui-instance.com/api/v1" > .env
echo "OPEN_WEBUI_API_TOKEN=sk-your-token-here" >> .env
# Start the server (the Docker setup uses the HTTP transport by default)
docker-compose up -d
# View logs
docker-compose logs -f
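For reference, a Compose setup along these lines is what the commands above expect. This is only a sketch; the repository ships its own docker-compose.yml, and the service name and port mapping here are assumptions:

```yaml
# Illustrative docker-compose.yml; defer to the file in the repository
services:
  mcp-server:
    build: .
    env_file: .env            # supplies OPEN_WEBUI_API_URL / OPEN_WEBUI_API_TOKEN
    environment:
      MCP_TRANSPORT: "http"
      MCP_HTTP_PORT: "8001"
    ports:
      - "8001:8001"
    restart: unless-stopped
```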
Alternatively, run directly with HTTP transport:
export OPEN_WEBUI_API_URL="https://your-open-webui-instance.com/api/v1"
export OPEN_WEBUI_API_TOKEN="sk-your-token-here"
export MCP_TRANSPORT="http"
export MCP_HTTP_PORT="8001"
python mcp_server.py
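Once the HTTP server is running, you can verify it from the host via the /health endpoint mentioned above. A small sketch that derives the URLs from MCP_HTTP_PORT (the shape of the health response body is an assumption):

```python
import os
import urllib.request


def endpoints(port: str) -> dict:
    """URLs exposed by the HTTP transport: the MCP endpoint and the health check."""
    base = f"http://localhost:{port}"
    return {"mcp": f"{base}/mcp", "health": f"{base}/health"}


if __name__ == "__main__":
    urls = endpoints(os.environ.get("MCP_HTTP_PORT", "8001"))
    # Raises on connection failure or non-2xx status
    with urllib.request.urlopen(urls["health"], timeout=5) as resp:
        print(f"health check returned {resp.status}")
```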
Additional notes
- The server requires access to your Open WebUI API and should be run in an environment that can reach that API endpoint.
- When using HTTP transport, ensure network security (TLS termination, API key protection) and update MCP_HTTP_PORT if necessary.
- The provided environment variables control rate limiting and CORS. Adjust MCP_RATE_LIMIT_* values to suit your load.
- If you encounter connectivity issues, verify OPEN_WEBUI_API_URL and OPEN_WEBUI_API_TOKEN are correct and that the Open WebUI instance is reachable from the MCP server host.
- For Cursor integration in stdio mode, you can provide an mcp.json file with the exact command/args and env variables as shown in the README to launch the server from within Cursor's environment.
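For the stdio integration above, an mcp.json along these lines is typical. This is a hedged sketch: the server name and the absolute path to mcp_server.py are placeholders, and the exact schema should be checked against the README and Cursor's MCP documentation:

```json
{
  "mcpServers": {
    "open-webui": {
      "command": "python",
      "args": ["/path/to/mcp_server.py"],
      "env": {
        "MCP_TRANSPORT": "stdio",
        "OPEN_WEBUI_API_URL": "https://your-open-webui-instance.com/api/v1",
        "OPEN_WEBUI_API_TOKEN": "sk-your-token-here"
      }
    }
  }
}
```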
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP