mcp-deep-search
Runs a search across all major providers simultaneously
claude mcp add --transport stdio --env ENV_FILE="/path/to/.llm.env" just-every-mcp-deep-search -- npx -y @just-every/mcp-deep-search
How to use
MCP Deep Search is an MCP server, backed by the @just-every/mcp-deep-search package, that performs advanced web searches across multiple providers through a single interface. It integrates with your MCP framework to handle search requests and return results from providers such as Brave, Google, Anthropic, and OpenAI. The server exposes tools like deep_search for provider-specific web queries and comprehensive_research for AI-assisted, multi-engine investigations. You query with a search string and can optionally specify a provider or request more results. The CLI component (mcp-deep-search) is designed to be invoked via npx from your MCP configuration or MCP client, so you can fetch results directly within your workflow.
To use it in your MCP setup, configure the server in your MCP config (see mcp_config) and then issue commands through your MCP client. For example, you can perform a deep_search across a chosen provider, or run comprehensive_research to leverage AI agents that coordinate across multiple search engines for deeper insights. Ensure you supply the necessary API keys via environment variables for the providers you enable (e.g., BRAVE_API_KEY, OPENAI_API_KEY, GOOGLE_API_KEY, etc.).
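As a sketch of what a call looks like on the wire, an MCP client issues a standard JSON-RPC tools/call request. The tool name (deep_search) and the query/provider argument names follow the description above, but the exact argument schema may differ in your installed version, and the provider value here is illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "deep_search",
    "arguments": {
      "query": "latest WebGPU browser support",
      "provider": "brave"
    }
  }
}
```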
How to install
Prerequisites:
- Node.js and npm installed on your system
- Access to an environment file (.llm.env) containing API keys for the selected providers
Step 1: Install Node.js and npm
- Mac/Linux: use your package manager or download from nodejs.org
- Windows: install via the Node.js installer
Step 2: Acquire an environment file with API keys
- Create or download your .llm.env (example: ~/.llm.env) and populate keys for the providers you plan to use (BRAVE_API_KEY, ANTHROPIC_API_KEY, OPENAI_API_KEY, GOOGLE_API_KEY, OPENROUTER_API_KEY, XAI_API_KEY, etc.).
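A minimal .llm.env might look like the following; the key names mirror the list above and the values are placeholders to replace with your own:

```shell
# ~/.llm.env — populate only the providers you plan to use
BRAVE_API_KEY=your-brave-key
OPENAI_API_KEY=your-openai-key
GOOGLE_API_KEY=your-google-key
```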
Step 3: Run the MCP Deep Search server
- Run it via npx (the intended invocation for this server):
# Using a local environment file path
ENV_FILE=/path/to/.llm.env npx -y @just-every/mcp-deep-search
Step 4: Integrate into your MCP configuration
- Add the server to your MCP config (see mcp_config) and restart your MCP system. If you are using a specific environment file, ensure the path is correct and accessible by the MCP runtime.
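As a sketch, a typical MCP client configuration entry looks like this; the mcpServers shape is the convention used by Claude Desktop and similar clients, and the server name "deep-search" is an arbitrary label, so check your client's documentation for the exact schema:

```json
{
  "mcpServers": {
    "deep-search": {
      "command": "npx",
      "args": ["-y", "@just-every/mcp-deep-search"],
      "env": {
        "ENV_FILE": "/path/to/.llm.env"
      }
    }
  }
}
```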
Optional Step: Debugging
- To test the server standalone, run:
ENV_FILE=/path/to/.llm.env npx @just-every/mcp-deep-search
- Watch logs for API key issues or provider errors and verify ENV_FILE contents.
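A quick way to verify ENV_FILE contents before suspecting a provider is to scan the file for the keys you expect. This sketch uses an illustrative key list and a temporary sample file standing in for your real .llm.env:

```shell
# Sketch: report which expected provider keys an env file defines.
# The key list and the generated sample file are illustrative only;
# point ENV_FILE at your real .llm.env in practice.
ENV_FILE=$(mktemp)
printf 'BRAVE_API_KEY=example\nOPENAI_API_KEY=example\n' > "$ENV_FILE"

missing=""
for key in BRAVE_API_KEY OPENAI_API_KEY GOOGLE_API_KEY; do
  grep -q "^${key}=" "$ENV_FILE" || missing="$missing $key"
done

if [ -n "$missing" ]; then
  echo "missing keys:$missing"
else
  echo "all expected keys present"
fi
rm -f "$ENV_FILE"
```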
Additional notes
- Ensure API keys are correctly set in the file referenced by ENV_FILE, and pass an absolute path for ENV_FILE so MCP clients can resolve it regardless of their working directory.
- The server supports automatic restart by default for reliability; see your MCP framework settings for restart behavior.
- If you encounter 'missing API keys' errors, verify that the corresponding *_API_KEY variables exist in ENV_FILE and that the provider you requested is one you have enabled.
- For debugging, you can run the underlying mcp-deep-search command directly to isolate issues: ENV_FILE=/path/to/.env npx @just-every/mcp-deep-search
- When using claude or other MCP clients, pass environment variables with the -e/--env flag before the -- separator, as demonstrated in the README, so keys are set in the server's environment rather than exposed as command-line arguments.
Related MCP Servers
PPTAgent
An Agentic Framework for Reflective PowerPoint Generation
mem0
✨ mem0 MCP Server: A memory system using mem0 for AI applications with Model Context Protocol (MCP) integration. Enables long-term memory for AI agents as a drop-in MCP server.
time
⏰ Time MCP Server: Giving LLMs Time Awareness Capabilities
rod
Model Context Protocol Server of Rod
mcp-human
Human Assistance for AI Assistants
website-publisher
AI Website builder and publisher MCP. Quickly publish and deploy your AI generated code as real website URL. Support html, css, js, python etc.