
Mastervolt-Deep-Research

Mastervolt Deep Research is a sophisticated multi-agent orchestration system built on VoltAgent that automates complex research workflows. It combines specialized AI agents, semantic memory, intelligent tooling, and custom web scraping to conduct comprehensive research, verify facts, analyze data, and generate publication-ready reports.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ssdeanx-mastervolt-deep-research node server.js \
  --env TURSO_URL="optional" \
  --env SUPABASE_KEY="optional" \
  --env SUPABASE_URL="optional" \
  --env TURSO_AUTH_TOKEN="optional" \
  --env GOOGLE_GENERATIVE_AI_API_KEY="your_google_generative_ai_api_key_here"
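Claude Code persists this registration in its MCP configuration. Assuming the default project-scoped `.mcp.json` layout, the command above produces an entry roughly like the following (the optional Supabase/Turso variables can be added to `env` the same way):

```json
{
  "mcpServers": {
    "ssdeanx-mastervolt-deep-research": {
      "type": "stdio",
      "command": "node",
      "args": ["server.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your_google_generative_ai_api_key_here"
      }
    }
  }
}
```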

How to use

Deployed as an MCP server, Mastervolt Deep Research coordinates a network of specialized agents to automate complex research workflows. The Plan Agent supervises a suite of 14+ specialized agents (such as Assistant, Writer, Scrapper, Data Analyzer, Fact Checker, Synthesizer, Coding, and Data Scientist) to gather data, run analysis, verify facts, and generate reports. The system also includes a memory layer built on LibSQL-backed vector storage with Google embeddings, a dedicated web-scraper toolkit, and a UI of 49 specialized components for viewing outputs. The API layer exposes chat and messaging endpoints, providing an interactive, streaming chat interface that feeds the agents and presents structured results.

How to install

Prerequisites:

  • Node.js 18+ (and npm or pnpm/yarn)
  • Git
  • Google Generative AI API key (for AI capabilities)
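A quick way to confirm the prerequisites are in place (this assumes node, npm, and git are on your PATH):

```shell
# Print tool versions; node should report v18 or later
node --version
npm --version
git --version

# Exit non-zero if the Node.js major version is below 18
node -e 'process.exit(parseInt(process.versions.node, 10) >= 18 ? 0 : 1)'
```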

Installation steps:

  1. Clone the repository
  2. Install dependencies
  3. Configure environment variables


Clone the repository

git clone https://github.com/ssdeanx/Mastervolt-Deep-Research.git
cd Mastervolt-Deep-Research

Install dependencies

npm install

Set up environment variables

cp .env.example .env

Edit .env with your keys (example placeholders shown below)

GOOGLE_GENERATIVE_AI_API_KEY='your_google_generative_ai_api_key_here'

Optional integrations

SUPABASE_URL='your_supabase_url'

SUPABASE_KEY='your_supabase_key'

TURSO_URL='your_turso_url'

TURSO_AUTH_TOKEN='your_turso_token'
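The variables above can be written into .env in one step. A minimal sketch with placeholder values (only the Google key is required; replace the placeholders before running the server):

```shell
# Write a minimal .env with placeholder values
cat > .env <<'EOF'
GOOGLE_GENERATIVE_AI_API_KEY='your_google_generative_ai_api_key_here'
# Optional integrations: uncomment and fill in if used
# SUPABASE_URL='your_supabase_url'
# SUPABASE_KEY='your_supabase_key'
# TURSO_URL='your_turso_url'
# TURSO_AUTH_TOKEN='your_turso_token'
EOF

# Fail fast if the required key line is missing
grep -q '^GOOGLE_GENERATIVE_AI_API_KEY=' .env
```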

Configuration (example for local dev):

When running via MCP, you typically start the server through the provided npm scripts, e.g.:

Development mode with auto-reload (Agents)

npm run dev

Start Next.js UI (Chat Interface)

npm run next

Run both concurrently

npm run dev:test

Build for production

npm run build

Start production server

npm start

Additional notes

Tips and common notes:

  • Ensure you have a valid Google Generative AI API key added to the environment for AI capabilities.
  • If you use optional integrations (Supabase, Turso), populate their URLs/keys in the environment or in the .env file.
  • The MCP server expects Node.js 18+ and should be run with the provided npm scripts (dev, next, dev:test, build, start).
  • When debugging, check OpenTelemetry/VoltOps traces to diagnose agent coordination issues and memory bottlenecks in the LibSQL vector store.
  • If you encounter port conflicts for the Next.js UI (default 3000), adjust the PORT setting in the environment or scripts.
  • Ensure the MCP server’s command path (server.js) matches your deployment layout; if your entry point differs, update the server entry in your MCP configuration accordingly.
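The last point can be checked from the repository root before registering the server; `server.js` below mirrors the path used in the install command and is an assumption to adjust if your entry point differs:

```shell
# Verify the MCP entry point exists relative to the current directory
ENTRY="server.js"
if [ -f "$ENTRY" ]; then
  echo "entry point found: $ENTRY"
else
  echo "entry point missing: $ENTRY -- update the path in your MCP config"
fi
```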

Related MCP Servers

casibase

4.5k

⚡️AI Cloud OS: Open-source enterprise-level AI knowledge base and MCP (model-context-protocol)/A2A (agent-to-agent) management platform with admin UI, user management and Single-Sign-On⚡️, supports ChatGPT, Claude, Llama, Ollama, HuggingFace, etc., chat bot demo: https://ai.casibase.com, admin UI demo: https://ai-admin.casibase.com

solace-agent-mesh

2.1k

An event-driven framework designed to build and orchestrate multi-agent AI systems. It enables seamless integration of AI agents with real-world data sources and systems, facilitating complex, multi-step workflows.

a2a-x402

460

The A2A x402 Extension brings cryptocurrency payments to the Agent-to-Agent (A2A) protocol, enabling agents to monetize their services through on-chain payments. This extension revives the spirit of HTTP 402 "Payment Required" for the decentralized agent ecosystem.

eion

144

Shared Memory Storage for Multi-Agent Systems

Unified-Tool-Graph

27

Instead of dumping 1000+ tools into a model’s prompt and expecting it to choose wisely, the Unified MCP Tool Graph equips your LLM with structure, clarity, and relevance. It fixes tool confusion, prevents infinite loops, and enables modular, intelligent agent workflows.

AgentStack

18

AgentStack is a production-grade multi-agent framework built on Mastra, delivering 50+ enterprise tools, 25+ specialized agents, and A2A/MCP orchestration for scalable AI systems. It focuses on financial intelligence, RAG pipelines, observability, and secure governance, with integrations including ACP Openclaw, Gemini CLI, and Opencode.
