mcp
Provides tools to clients over the Model Context Protocol
```sh
claude mcp add --transport stdio tcpipuk-mcp-server -- \
  docker run -i \
    --env SSE_HOST="<optional: SSE listening address; stdio is used if unset>" \
    --env SSE_PORT="<optional: SSE listening port; stdio is used if unset>" \
    --env USER_AGENT="<optional: custom User-Agent string, e.g. MCP-Server/1.0>" \
    --env SEARXNG_QUERY_URL="<URL of your SearXNG instance's search API, e.g. http://searxng:8080/ or http://searxng.example.org/search>" \
    ghcr.io/tcpipuk/mcp-server/server:latest
```
How to use
This MCP server exposes two main capabilities to AI assistants. First, the Search tool lets the assistant query a SearXNG-based search engine to obtain current information, look up specific resources, or perform lightweight calculations via web results. Second, the Web tool enables the assistant to fetch and process content from individual websites, convert pages to markdown for easy reading, retrieve raw page content, or extract links for further exploration. These tools provide clear feedback about what the server did and any errors that occurred, helping users reason about results.
To run and connect with the server, you can use either a Docker-based deployment or run locally via uv. In Docker, you typically expose an SSE (Server-Sent Events) endpoint for networked clients, such as LibreChat. When running locally with uv, you’ll start the server in either networked SSE mode or standard stdio mode for direct 1:1 interactions. Depending on your setup, you’ll configure environment variables like the SearXNG query URL and optional User-Agent, and then connect clients either over SSE or via stdio. The documentation also covers how to wire the MCP server into clients like LibreChat or Claude Desktop.
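For a stdio-based client such as Claude Desktop, the wiring is typically a `claude_desktop_config.json` entry that launches the container on demand. The sketch below is illustrative only: the server name and SearXNG URL are placeholders you would replace with your own values.

```json
{
  "mcpServers": {
    "tcpipuk-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "SEARXNG_QUERY_URL=http://searxng:8080/search",
        "ghcr.io/tcpipuk/mcp-server/server:latest"
      ]
    }
  }
}
```

Because no SSE_HOST/SSE_PORT variables are set here, the container speaks stdio directly to the client over the `-i` pipe.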
How to install
Prerequisites:
- Docker installed (for the recommended Docker deployment) or Python with uv installed for local development.
- Access to a SearXNG instance (or compatible search API) for SEARXNG_QUERY_URL.
- Optional: SSE-capable clients (e.g., LibreChat) if you plan to use networked SSE mode.
Install and run with Docker (recommended):
- Ensure Docker is installed and running.
- Create a docker-compose.yml or run the container directly:

```sh
docker run -i \
  -e SEARXNG_QUERY_URL=http://searxng:8080 \
  -e SSE_HOST=0.0.0.0 \
  -e SSE_PORT=8080 \
  -e USER_AGENT="MCP-Server/1.0" \
  ghcr.io/tcpipuk/mcp-server/server:latest
```

- If using docker-compose, define a service named mcp-server similar to the README example, including the environment variables above, then start it with `docker compose up -d`.
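A minimal compose sketch might look like the following. The service names, port mapping, and SearXNG image are assumptions to illustrate the shape; adapt them to your stack.

```yaml
services:
  mcp-server:
    image: ghcr.io/tcpipuk/mcp-server/server:latest
    environment:
      SEARXNG_QUERY_URL: http://searxng:8080
      SSE_HOST: 0.0.0.0
      SSE_PORT: "8080"
    ports:
      - "8080:8080"   # expose the SSE endpoint to the host
    depends_on:
      - searxng

  searxng:
    image: searxng/searxng:latest
```

Putting both services in one compose file gives them a shared network, so the MCP container can reach SearXNG by its service name (`searxng`).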
Install and run locally with uv (Python):
- Install uv (Python 3.13+):

```sh
curl -LsSf https://astral.sh/uv/install.sh | sh
```

- Create and activate a virtual environment, then install dependencies from the lockfile:

```sh
uv venv
source .venv/bin/activate    # Linux/macOS
# .venv\Scripts\activate     # Windows
uv sync
```

- Set the required environment variable for the SearXNG query URL (the User-Agent is optional):

```sh
export SEARXNG_QUERY_URL="http://your-searxng-instance.local:8080"
export USER_AGENT="CustomAgent/1.0"  # optional
```

- Run the server in SSE mode:

```sh
mcp-server --sse-host 0.0.0.0 --sse-port 3001
```

  For stdio mode (no SSE), run it with no arguments:

```sh
mcp-server
```
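With the server listening in SSE mode, a networked client such as LibreChat can point at the endpoint. As a sketch only (check the exact schema and endpoint path against LibreChat's documentation; the `/sse` path here is an assumption), a `librechat.yaml` entry might look like:

```yaml
mcpServers:
  mcp-server:
    type: sse
    url: http://localhost:3001/sse
```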
Additional notes
- The SEARXNG_QUERY_URL must point to the Search API endpoint of your SearXNG instance.
- If you enable SSE (by setting the SSE_HOST and SSE_PORT environment variables), clients like LibreChat can connect over the network. Without them, the server defaults to stdio mode.
- The USER_AGENT can help identify requests from the MCP server; customize as needed.
- When using Docker, ensure port mappings align with your SSE_PORT if you intend to expose the SSE endpoint to the host.
- In multi-container environments, ensure the SearXNG service is reachable from the MCP container (networking setup may require a shared network).
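The SSE-versus-stdio fallback described above can be pictured with a small shell sketch. This is an illustration of the selection logic, not the server's actual code:

```shell
# Choose the transport: SSE only when both SSE_HOST and SSE_PORT are
# set; otherwise fall back to stdio (mirrors the documented behaviour).
if [ -n "${SSE_HOST:-}" ] && [ -n "${SSE_PORT:-}" ]; then
  MODE="sse"
  echo "Serving over SSE on ${SSE_HOST}:${SSE_PORT}"
else
  MODE="stdio"
  echo "Serving over stdio"
fi
```

With neither variable set, the sketch falls back to stdio, which matches the server's default.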
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP