SearXNG Self-Host
Verified · @saikatkumardey
npx machina-cli add skill @saikatkumardey/searxng-selfhost --openclaw
SearXNG is a self-hosted search aggregator that queries Google, Bing, Brave, Startpage, DuckDuckGo, and Wikipedia simultaneously. No API keys required. Results returned as JSON.
Quick start (already installed)
python3 scripts/search.py "your query" # human-readable
python3 scripts/search.py "your query" --json # JSON (for parsing)
python3 scripts/search.py "query" --count 5 --json # limit + JSON
Place search.py anywhere convenient — typically tools/search.py in the workspace.
For detailed usage patterns and service management: see references/usage.md.
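To consume the `--json` output from another program, you can wrap the CLI with the stdlib. This is a sketch: the `scripts/search.py` path comes from the quick start above, but the exact output schema (a top-level `results` list with `title`/`url` keys) is an assumption — check search.py's actual output before relying on these keys.

```python
import json
import subprocess

def run_search(query, count=5):
    """Invoke the search.py CLI with --json and return its raw stdout.

    The script path is the conventional location from the quick start;
    adjust if you placed search.py elsewhere.
    """
    proc = subprocess.run(
        ["python3", "scripts/search.py", query, "--count", str(count), "--json"],
        capture_output=True, text=True, check=True,
    )
    return proc.stdout

def parse_results(raw):
    """Extract (title, url) pairs from the CLI's JSON output.

    The 'results', 'title', and 'url' keys are assumptions about the
    schema; missing keys degrade to empty strings rather than raising.
    """
    data = json.loads(raw)
    return [(r.get("title", ""), r.get("url", "")) for r in data.get("results", [])]
```

Separating the subprocess call from the parsing keeps the parser testable without a running instance.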
Installation (new instance)
Run as root on Ubuntu 22.04/24.04:
bash scripts/install_searxng.sh
This installs SearXNG, creates a searxng system user, writes /etc/searxng/settings.yml, and starts a systemd service on http://127.0.0.1:8888.
Verify:
curl 'http://127.0.0.1:8888/search?q=test&format=json' | python3 -m json.tool | head -20
systemctl status searxng
Connecting search.py to the instance
search.py targets http://127.0.0.1:8888 by default. If the port differs, update SEARXNG_URL at the top of the script.
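The request search.py makes can be reproduced in a few lines of stdlib Python. The `q` and `format=json` query parameters match the verification curl command above; the response handling is illustrative, not search.py's actual implementation.

```python
import json
import urllib.parse
import urllib.request

# Keep in sync with the port in /etc/searxng/settings.yml.
SEARXNG_URL = "http://127.0.0.1:8888"

def build_search_url(query, base=SEARXNG_URL):
    """Build the /search endpoint URL requesting JSON output."""
    params = urllib.parse.urlencode({"q": query, "format": "json"})
    return f"{base}/search?{params}"

def query_searxng(query, timeout=10):
    """Fetch and decode results; raises urllib.error.URLError if the
    service is down (connection refused, see Troubleshooting)."""
    with urllib.request.urlopen(build_search_url(query), timeout=timeout) as resp:
        return json.load(resp)
```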
Fallback
If SearXNG is down, search.py falls back to Wikipedia + GitHub APIs automatically. No action needed — results still return, just from narrower sources.
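The actual fallback logic lives inside search.py; the sketch below shows the general try-then-fall-back pattern, using Wikipedia's public OpenSearch API (a real keyless endpoint) as the illustrative secondary source.

```python
import json
import urllib.error
import urllib.parse
import urllib.request

def wikipedia_fallback(query, limit=5):
    """Query Wikipedia's OpenSearch API; returns (title, link) pairs."""
    params = urllib.parse.urlencode(
        {"action": "opensearch", "search": query, "limit": limit, "format": "json"}
    )
    url = f"https://en.wikipedia.org/w/api.php?{params}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        # OpenSearch returns [query, titles, descriptions, links].
        _, titles, _, links = json.load(resp)
    return list(zip(titles, links))

def search_with_fallback(query, primary):
    """Try the primary SearXNG callable; on connection errors, fall
    back to the narrower Wikipedia source instead of failing."""
    try:
        return primary(query)
    except (urllib.error.URLError, OSError):
        return wikipedia_fallback(query)
```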
Troubleshooting
| Symptom | Fix |
|---|---|
| [SearXNG unavailable] in stderr | systemctl restart searxng |
| Port conflict on 8888 | Change the port: value in /etc/searxng/settings.yml and update SEARXNG_URL in search.py |
| Empty results from all engines | Check /etc/searxng/settings.yml engines block; restart service |
| Connection refused | Service not running — systemctl start searxng |
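The "connection refused" row can be checked programmatically. A sketch, using the default host and port from the install script:

```python
import socket

def port_open(host="127.0.0.1", port=8888, timeout=2.0):
    """Return True if something accepts TCP connections on host:port.

    Maps the table's symptoms: a refused connection means the searxng
    service is not running; a timeout suggests a firewall or wrong host.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```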
Overview
SearXNG aggregates Google, Bing, Brave, Startpage, DuckDuckGo, and Wikipedia into a single self-hosted endpoint, returning JSON with no API keys. This skill covers installing on a VPS, configuring the instance, using the search.py CLI, and handling fallback behavior so OpenClaw agents can search the web privately and reliably.
How This Skill Works
Install on Ubuntu 22.04/24.04 using the provided script, which creates a searxng system user, writes /etc/searxng/settings.yml, and starts a systemd service on http://127.0.0.1:8888. The agent's search.py client targets this local instance by default; keep SEARXNG_URL in sync if you run on a different port. If SearXNG is down, search.py automatically falls back to the Wikipedia and GitHub APIs.
When to Use It
- When the agent needs web search capability without external API keys
- When setting up a new OpenClaw instance that requires a private search backend
- When you want to run web searches using an existing local SearXNG instance
- When diagnosing search failures or connectivity issues (port, settings, service status)
- When you want a graceful fallback if SearXNG is unavailable
Quick Start
- Step 1: Run as root on Ubuntu 22.04/24.04: bash scripts/install_searxng.sh
- Step 2: Verify service is running and accessible at http://127.0.0.1:8888 using curl and systemctl status searxng
- Step 3: Ensure search.py connects to the local instance by keeping SEARXNG_URL in sync (or update it if you change the port)
Best Practices
- Install on Ubuntu 22.04/24.04 as root to ensure systemd integration
- Secure and version-control /etc/searxng/settings.yml, especially engines and proxies
- Keep SEARXNG_URL in search.py in sync with the actual local port
- Test JSON output with python3 -m json.tool for reliable parsing
- Verify downtime fallback by confirming results still return from Wikipedia + GitHub APIs when SearXNG is down
Example Use Cases
- A VPS-based OpenClaw agent uses SearXNG to fetch live web results without API keys
- An OpenClaw instance uses an existing local SearXNG for searches
- Troubleshooting: restarting searxng with systemctl after a failure
- Querying the instance with curl to verify /search JSON output
- Reconfiguring the port (e.g., 8889) and updating SEARXNG_URL accordingly