ddg_mcp_server
A web search tool and API powered by DuckDuckGo, Gradio, and MCP, providing both a user-friendly web interface and Claude Desktop tool integration. It fetches web search results, extracts summaries, and retrieves the full content of web pages in markdown format.
claude mcp add --transport stdio shgsousa-ddg_mcp_server python main.py \
  --env ACCESS_TOKEN="your_api_key_here" \
  --env OPENAI_API_URL="https://api.openai.com/v1"
How to use
This MCP server exposes a web-based DuckDuckGo search interface powered by a Python backend and Gradio. It uses DuckDuckGo's search API to fetch real-time results, formats them as Markdown, and offers AI-powered content summarization via an OpenAI-compatible API. The interface is served through a Gradio web app, making it easy to search, view full results, and read concise summaries generated by the summarization feature. The number of results is configurable, and summarization requires credentials for the OpenAI-compatible service to be supplied through the environment.
To use the server, start it with the provided command (for example, running main.py in a Python environment or via Docker). Open your browser to the app URL (typically http://localhost:7860) to perform searches. The API credentials for summarization are provided through environment variables (OPENAI_API_URL and ACCESS_TOKEN). If you have a compatible API endpoint, you can customize the model and endpoint URL via the config file to tailor the summarization behavior. See the SUMMARIZATION.md document referenced in the repo for details on how summaries are generated and which models can be configured.
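As a rough illustration of how a summarization call to an OpenAI-compatible endpoint is typically assembled from those two environment variables, here is a minimal sketch. The model name, prompt, and helper function are assumptions for illustration, not the server's actual implementation (see SUMMARIZATION.md for the real behavior):

```python
import os

def build_summary_request(page_markdown: str,
                          model: str = "gpt-4o-mini") -> tuple[str, dict, dict]:
    """Assemble the URL, headers, and JSON payload for a generic
    OpenAI-compatible chat-completions request (hypothetical sketch)."""
    base_url = os.environ.get("OPENAI_API_URL", "https://api.openai.com/v1")
    token = os.environ.get("ACCESS_TOKEN", "")
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,  # assumed model name; the server may use another
        "messages": [
            {"role": "system", "content": "Summarize the page concisely."},
            {"role": "user", "content": page_markdown},
        ],
    }
    return url, headers, payload

url, headers, payload = build_summary_request("# Example page\nSome text.")
print(url)
```

The request itself would then be sent with any HTTP client (e.g. requests); only the payload shape is sketched here.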
How to install
Prerequisites:
- Python 3.10+ or Docker installation
- Git (optional, for cloning)
- Network access to DuckDuckGo (search is handled by the included integration)
- Optional: OpenAI-compatible API endpoint and credentials for summarization
Option 1: Run with Docker
- Ensure Docker is installed and running.
- Build the image (from the project folder):
docker build -t ddg-mcp-server .
- Run the container exposing port 7860:
docker run -p 7860:7860 ddg-mcp-server
- Open http://localhost:7860 to access the app.
Option 2: Run locally with Python
- Clone the repository and navigate to the project folder:
git clone <repository-url>
cd ddg_mcp_server
- Create and activate a virtual environment (optional but recommended):
python -m venv venv
source venv/bin/activate # on macOS/Linux
venv\Scripts\activate # on Windows
- Install dependencies (assumes a requirements.txt is present):
pip install -r requirements.txt
- Copy the example env file and set credentials:
cp .env.example .env
Edit .env to include OPENAI_API_URL and ACCESS_TOKEN as needed. Then run the app:
python main.py
- The app will be available at http://localhost:7860 (adjust if you change the port).
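The running app returns search results rendered as Markdown. As a rough illustration of that formatting step (the field names here are hypothetical, not the server's actual schema), it might look like:

```python
def results_to_markdown(results: list[dict]) -> str:
    """Render a list of search hits as a Markdown bullet list.

    Each hit is assumed to carry 'title', 'href', and 'body' keys
    (hypothetical names; the real server's schema may differ).
    """
    lines = []
    for hit in results:
        lines.append(f"- [{hit['title']}]({hit['href']})")
        lines.append(f"  {hit['body']}")
    return "\n".join(lines)

sample = [{"title": "DuckDuckGo",
           "href": "https://duckduckgo.com",
           "body": "Privacy-focused search."}]
print(results_to_markdown(sample))
```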
Prerequisites recap: ensure you have Python 3.10+, pip, and network access to DuckDuckGo and the optional OpenAI-compatible API for summarization.
Additional notes
Tips and common issues:
- If you rely on the AI summarization feature, ensure OPENAI_API_URL and ACCESS_TOKEN are correctly set in the environment or in .env. If OPENAI_API_URL is not set, the official OpenAI endpoint is typically used as the default.
- When running in Docker, pass environment variables through -e OPENAI_API_URL=... -e ACCESS_TOKEN=... to enable summarization.
- The server listens on port 7860 by default; adjust port mappings if you need to expose it elsewhere.
- If you encounter connectivity issues, verify that the container or local process has network access to DuckDuckGo and the OpenAI API and that API keys are valid.
- The summarization model can be changed in the config.py file; refer to SUMMARIZATION.md for details on supported models and settings.
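When troubleshooting the tips above, a quick first step is confirming that the summarization variables are actually visible to the process. This standalone check (plain standard-library Python, not part of the server) can be run in the same shell or container:

```python
import os

def check_summarization_env() -> list[str]:
    """Return the names of required summarization variables that are unset."""
    required = ("OPENAI_API_URL", "ACCESS_TOKEN")
    return [name for name in required if not os.environ.get(name)]

missing = check_summarization_env()
if missing:
    print("Missing:", ", ".join(missing))
else:
    print("Summarization environment looks configured.")
```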
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.