researcher_agent
An application built on the Model Context Protocol (MCP) that transforms any website into highly relevant content based on your queries. The app integrates seamlessly with platforms such as X and Slack, among others.
claude mcp add --transport stdio lgesuellip-researcher_agent node path/to/server.js \
  --env LOG_LEVEL="info" \
  --env ARCADE_API_KEY="your-arcade-api-key" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env LANGCHAIN_API_KEY="your-langchain-api-key" \
  --env FIRECRAWL_API_KEY="your-firecrawl-api-key"
How to use
researcher_agent is an MCP server designed to transform any website into highly relevant content driven by your queries. It leverages Firecrawl for site mapping, intelligent selection, and scraping, and uses LangGraph as the MCP client to coordinate tasks. The Arcade integration enables seamless posting to platforms like X and Slack, while LangChain's LangSmith provides tracing and structured, reliable outputs from OpenAI. Typical use cases include creating LLM-ready text files, indexing and organizing documentation, and automating research tasks. To get started, deploy the server in your environment, ensure your API keys are configured, and connect your preferred platforms through Arcade. Once running, you can initiate web research, generate structured outputs, and export results in formats tailored for downstream LLM consumption.
How to install
Prerequisites:
- Node.js 18+ and npm installed on your system
- Access keys for OpenAI, LangChain/LangSmith, Arcade, and Firecrawl (as applicable)
- A Git-based source for the MCP server repository
Installation steps:
1. Clone the MCP server repository:
   git clone https://example.com/your-repo/researcher_agent.git
   cd researcher_agent
2. Install dependencies:
   npm install
3. Configure environment variables. Create a .env file, or set the variables in your deployment environment:
   OPENAI_API_KEY=your-openai-api-key
   ARCADE_API_KEY=your-arcade-api-key
   LANGCHAIN_API_KEY=your-langchain-api-key
   FIRECRAWL_API_KEY=your-firecrawl-api-key
   LOG_LEVEL=info
4. Run the server:
   node path/to/server.js
   Or use your preferred process manager (e.g., pm2, systemd) in production.
5. Verify the server is running by checking the logs and confirming that health endpoints (if provided) respond successfully.
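Since the server depends on several API keys, it can help to fail fast at startup when one is missing. The sketch below is a hypothetical helper (not part of the actual repo); the variable names mirror those listed in the configuration step, but confirm them against your server's code.

```javascript
// Hypothetical startup check: report any required env vars that are unset.
const REQUIRED_VARS = [
  "OPENAI_API_KEY",
  "ARCADE_API_KEY",
  "LANGCHAIN_API_KEY",
  "FIRECRAWL_API_KEY",
];

function missingEnvVars(env, required = REQUIRED_VARS) {
  // Return the names of required variables that are unset or empty.
  return required.filter((name) => !env[name]);
}

// Example: exit with a clear error before the server starts half-configured.
const missing = missingEnvVars(process.env);
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(", ")}`);
  // process.exit(1); // uncomment in a real entry point
}
```

A check like this turns a confusing mid-request authentication failure into an immediate, actionable startup error.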
Additional notes
Notes and tips:
- Environment variables are critical for authentication and feature toggles; keep them secure and do not commit them to version control.
- Firecrawl handles site mapping and scraping; adjust crawl depth and rate limits to balance thoroughness with politeness.
- Arcade integration allows posting results to platforms like X or Slack; ensure permissions are configured for the target platforms.
- If you encounter OpenAI rate limits, consider implementing exponential backoff and/or batching requests.
- LangSmith tracing helps debug and monitor flows; enable tracing in development to capture structured outputs and errors.
- The server is designed for async processing; expect non-blocking behavior and potential retries on transient failures.
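The exponential-backoff suggestion above can be sketched as a small async wrapper. This is an illustrative helper, not code from the repo; `callFn` and the retry parameters are assumptions.

```javascript
// Sketch: retry an async call with exponential backoff plus random jitter,
// e.g. for transient OpenAI rate-limit errors.
async function withBackoff(callFn, { retries = 5, baseMs = 500 } = {}) {
  let attempt = 0;
  for (;;) {
    try {
      return await callFn();
    } catch (err) {
      attempt += 1;
      // Give up after the configured number of retries.
      if (attempt > retries) throw err;
      // Double the delay each attempt; jitter avoids synchronized retries.
      const delay = baseMs * 2 ** (attempt - 1) + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Usage would look like `await withBackoff(() => makeOpenAIRequest())`, where `makeOpenAIRequest` is whatever function issues the rate-limited call.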
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.