kindly-web-search
Kindly Web Search MCP Server: Web search + robust content retrieval for AI coding tools (Claude Code, Codex, Cursor, GitHub Copilot, Gemini, etc.) and AI agents (Claude Desktop, OpenClaw, etc.). Supports Serper, Tavily, and SearXNG.
claude mcp add --transport stdio shelpuk-ai-technology-consulting-kindly-web-search-mcp-server uvx kindly-web-search-mcp-server \
  --env GITHUB_TOKEN="token with public repo access (optional)" \
  --env SERPER_API_KEY="your Serper API key (optional)" \
  --env TAVILY_API_KEY="your Tavily API key (optional)" \
  --env SEARXNG_BASE_URL="base URL for self-hosted SearXNG (optional)"
How to use
Once running, you can query the Kindly Web Search MCP server through your MCP client or orchestrator using the provided tools. Use web_search(query, num_results) to fetch top results with structured fields such as title, link, snippet, and page_content (Markdown, best-effort). Use get_content(url) to retrieve the full page content (Markdown, best-effort) for deeper context. The server uses Serper as the primary search provider when its API key is configured, falls back to Tavily, and can fall back further to a self-hosted SearXNG instance via SEARXNG_BASE_URL. For page extraction, a Chromium-based browser must be installed on the same machine to enable universal page_content extraction, though specialized sources (such as StackExchange, GitHub Issues, Wikipedia, and arXiv) still work without the browser. An optional GitHub token (GITHUB_TOKEN) improves the formatting and depth of GitHub Issues results by rendering questions, answers, comments, reactions, and metadata in a more LLM-friendly way.
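In practice your MCP client handles the wire protocol for you, but as a minimal sketch, the two tools above map to standard MCP `tools/call` JSON-RPC requests. The tool and argument names (`web_search`, `query`, `num_results`, `get_content`, `url`) come from the tool signatures above; the URL and query values are placeholders:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP `tools/call` JSON-RPC 2.0 request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Search for the top 3 results, then fetch one page in full.
search_req = make_tool_call(1, "web_search", {"query": "uv tool install", "num_results": 3})
page_req = make_tool_call(2, "get_content", {"url": "https://example.com/docs"})

print(search_req)
print(page_req)
```

The client writes these messages to the server's stdin (stdio transport) and reads the structured results (title, link, snippet, page_content) back from stdout.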
How to install
Prerequisites
- A machine with Python 3.13+ (3.14 supported; optional extras may be disabled on 3.14)
- A Chromium-based browser installed on the same machine running the MCP client (Chrome/Chromium/Edge/Brave)
- At least one search backend: a Serper API key (SERPER_API_KEY), a Tavily API key (TAVILY_API_KEY), or the base URL of a self-hosted SearXNG instance (SEARXNG_BASE_URL)
- Optional GITHUB_TOKEN for improved GitHub content rendering
Step 1: Install uv (which provides the uvx runner)
- macOS / Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows (PowerShell):
irm https://astral.sh/uv/install.ps1 | iex
- Verify installation:
uvx --version
Step 2: Install a Chromium-based browser (required for full page_content extraction)
- macOS:
brew install --cask chromium
- Windows: Install Chrome or Edge (and note the path if needed by your setup)
- Linux (Ubuntu/Debian): ensure Chrome/Chromium is installed via your package manager or from the official sources
Step 3: Install and run the Kindly Web Search MCP server
uvx kindly-web-search-mcp-server
- If you prefer to pre-install the package and then run it explicitly, you can do:
uv tool install kindly-web-search-mcp-server
kindly-web-search-mcp-server
Step 4: Configure environment variables (optional but recommended)
- Set API keys and base URL as needed (see mcp_config for details) and restart the server if you change them.
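For clients configured via a JSON file rather than the `claude mcp add` command, the entry typically looks like the following. This is a sketch in the common Claude Desktop-style `mcpServers` format; the server name is arbitrary, the placeholder values must be replaced with your real keys, and unused variables can be omitted:

```json
{
  "mcpServers": {
    "kindly-web-search": {
      "command": "uvx",
      "args": ["kindly-web-search-mcp-server"],
      "env": {
        "SERPER_API_KEY": "your Serper API key (optional)",
        "TAVILY_API_KEY": "your Tavily API key (optional)",
        "SEARXNG_BASE_URL": "base URL for self-hosted SearXNG (optional)",
        "GITHUB_TOKEN": "token with public repo access (optional)"
      }
    }
  }
}
```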
Step 5: Verify the server is running
- With the stdio transport there is no listening address or port; confirm the server is registered in your MCP client (for example, via `claude mcp list` in Claude Code), then try a simple web_search query from the client.
Additional notes
Tips and considerations:
- Provide SERPER_API_KEY for best results; Tavily is a solid fallback. If neither is configured, SEARXNG_BASE_URL can be used for a self-hosted search setup.
- Enabling GITHUB_TOKEN often yields richer GitHub Issue results (questions, answers, comments, reactions, metadata) and can improve usefulness in coding contexts.
- For universal page_content extraction, ensure a compatible Chromium-based browser is accessible on the same host as the MCP client.
- If you encounter rate limits, rotate API keys or add a GitHub token to increase quota. Monitor logs for browser automation hints if page_content extraction fails on certain sites.
- The server exposes high-quality content in a single call (no need for a separate scraping step), which helps reduce token usage when driving AI coding assistants.
Related MCP Servers
gemini-cli
An open-source AI agent that brings the power of Gemini directly into your terminal.
ai-agent-team
AI Agent Team: a 24/7 professional AI development team — product manager, frontend developer, backend developer, test engineer, DevOps engineer, and tech lead. One-command installation, supports commands in both Chinese and English, and greatly boosts development efficiency!
mcpcat-python-sdk
MCPcat is an analytics platform for MCP server owners 🐱.
gopls
MCP server for Go project development: extends AI code agents with semantic understanding of and deterministic information about Go projects.
lad_mcp_server
Lad MCP Server: Autonomous code & system design review for AI coding agents (Claude Code, Cursor, Codex, etc.). Features multi-model consensus via OpenRouter and context-aware reviews via Serena.
HydraMCP
Connect agents to agents. MCP server for querying any LLM through your existing subscriptions: compare, vote, and synthesize across GPT, Gemini, Claude, and local models from one terminal.