
requesthunt

Install:

npx machina-cli add skill ReScienceLab/opc-skills/requesthunt --openclaw

RequestHunt Skill

Generate user demand research reports by collecting and analyzing real user feedback from Reddit, X (Twitter), and GitHub.

Prerequisites

Set your API key in ~/.zshrc (or your shell's startup file):

export REQUESTHUNT_API_KEY="your_api_key"

Get your key from: https://requesthunt.com/settings/api

Quick Check:

cd <skill_directory>
python3 scripts/get_usage.py

Research Workflow

This skill helps you generate comprehensive user demand research reports. Follow this workflow:

Step 1: Define Scope

Before collecting data, clarify with the user:

  1. Research Goal: What domain/area to investigate? (e.g., AI coding assistants, project management tools)
  2. Specific Products: Any products/competitors to focus on? (e.g., Cursor, GitHub Copilot)
  3. Platform Preference: Which platforms to prioritize? (reddit, x, github)
  4. Time Range: How recent should the feedback be?
  5. Report Purpose: Product planning / competitive analysis / market research?

Step 2: Collect Data

# 1. Trigger realtime scrape for the topic
python3 scripts/scrape_topic.py "ai-coding-assistant" --platforms reddit,x,github

# 2. Search with expansion for more data
python3 scripts/search_requests.py "code completion" --expand --limit 50

# 3. List requests filtered by topic
python3 scripts/list_requests.py --topic "ai-tools" --limit 100
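The three commands above can return overlapping items, so it is worth merging and deduplicating before analysis. A minimal sketch, assuming each script's output has been parsed into dicts with `title`, `platform`, and `votes` fields (an assumption — check the scripts' actual output format):

```python
def aggregate(items: list[dict]) -> list[dict]:
    """Group feedback items by normalized title and rank by total votes."""
    groups: dict[str, dict] = {}
    for item in items:
        key = item["title"].strip().lower()
        g = groups.setdefault(key, {"title": item["title"], "platforms": set(), "votes": 0})
        g["platforms"].add(item["platform"])
        g["votes"] += item.get("votes", 0)
    # Rank by vote total, then by breadth of platform coverage
    return sorted(groups.values(), key=lambda g: (g["votes"], len(g["platforms"])), reverse=True)

# Example with made-up feedback items
sample = [
    {"title": "Inline code completion", "platform": "reddit", "votes": 42},
    {"title": "inline code completion", "platform": "github", "votes": 17},
    {"title": "Offline mode", "platform": "x", "votes": 9},
]
ranked = aggregate(sample)
```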

Step 3: Generate Report

Analyze collected data and generate a structured Markdown report:

# [Topic] User Demand Research Report

## Overview
- Scope: ...
- Data Sources: Reddit (count), X (count), GitHub (count)
- Time Range: ...

## Key Findings

### 1. Top Feature Requests
| Rank | Request | Sources | Representative Quote |
|------|---------|---------|---------------------|

### 2. Pain Points Analysis
- **Pain Point A**: ...

### 3. Competitive Comparison (if specified)
| Feature | Product A | Product B | User Expectations |

### 4. Opportunities
- ...

## Methodology
Based on N real user feedback items collected via RequestHunt...
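The "Top Feature Requests" table in the template above can be filled in programmatically from ranked items. A sketch, assuming items carry `title` and `platforms` fields (an assumption carried over from however you parse the collection scripts' output):

```python
def render_table(ranked: list[dict], quotes: dict[str, str]) -> str:
    """Render ranked requests as a Markdown table matching the report template."""
    lines = [
        "| Rank | Request | Sources | Representative Quote |",
        "|------|---------|---------|---------------------|",
    ]
    for rank, item in enumerate(ranked, start=1):
        sources = ", ".join(sorted(item["platforms"]))
        quote = quotes.get(item["title"], "")
        lines.append(f"| {rank} | {item['title']} | {sources} | {quote} |")
    return "\n".join(lines)

# Example with one made-up request and quote
table = render_table(
    [{"title": "Inline code completion", "platforms": {"reddit", "github"}}],
    {"Inline code completion": "I want suggestions as I type."},
)
```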

Commands

All commands run from the skill directory.

List Requests

python3 scripts/list_requests.py --limit 20                    # Recent requests
python3 scripts/list_requests.py --topic "ai-tools" --limit 10 # By topic
python3 scripts/list_requests.py --platforms reddit,github     # By platform
python3 scripts/list_requests.py --category "Developer Tools"  # By category
python3 scripts/list_requests.py --sortBy top --limit 20       # Top voted

Search Requests

python3 scripts/search_requests.py "authentication" --limit 20
python3 scripts/search_requests.py "oauth" --expand            # With realtime expansion
python3 scripts/search_requests.py "API rate limit" --expand --platforms reddit,x

Get Topics

python3 scripts/get_topics.py                                  # List all topics by category

Check Usage

python3 scripts/get_usage.py                                   # View API usage stats

Scrape Topic (Realtime)

python3 scripts/scrape_topic.py "developer-tools"              # Default: reddit,x
python3 scripts/scrape_topic.py "ai-assistant" --platforms reddit,x,github

Source

git clone https://github.com/ReScienceLab/opc-skills.git

View on GitHub: https://github.com/ReScienceLab/opc-skills/blob/main/skills/requesthunt/SKILL.md

Overview

RequestHunt generates comprehensive user demand research reports by collecting and analyzing feedback from Reddit, X (Twitter), and GitHub. It helps teams identify top feature requests, pain points, and market signals for product planning and competitive analysis.

How This Skill Works

The workflow follows three steps: define the scope (goal, products, platforms, time range, report purpose); collect data with the provided scripts that scrape topics, search requests, and list requests; then generate a structured Markdown report with findings and opportunities.
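The collection steps can be scripted end to end. A minimal sketch (the script names match this skill; the topic, query, and `dry_run` wrapper are illustrative):

```python
import subprocess

# Ordered collection commands for one research run (topic and query are examples)
COMMANDS = [
    ["python3", "scripts/scrape_topic.py", "ai-coding-assistant", "--platforms", "reddit,x,github"],
    ["python3", "scripts/search_requests.py", "code completion", "--expand", "--limit", "50"],
    ["python3", "scripts/list_requests.py", "--topic", "ai-tools", "--limit", "100"],
]

def run_pipeline(skill_dir: str, dry_run: bool = False) -> list[str]:
    """Run each collection step from the skill directory; return the commands as strings."""
    executed = []
    for cmd in COMMANDS:
        executed.append(" ".join(cmd))
        if not dry_run:
            subprocess.run(cmd, cwd=skill_dir, check=True)
    return executed
```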

When to Use It

  • You want to conduct demand research or analyze user demand from real feedback
  • You need to find feature requests from Reddit, X, and GitHub for a product area
  • You’re performing competitive analysis and need user expectations relative to rivals
  • You want a structured demand report to share with stakeholders or for roadmapping
  • You plan to monitor and update demand trends by running RequestHunt queries over time

Quick Start

  1. Step 1: Define scope (goal, products, platforms, time range, report purpose)
  2. Step 2: Collect data using the provided scripts to scrape_topic, search_requests, and list_requests
  3. Step 3: Generate the report and deliver a structured Markdown document with findings and opportunities

Best Practices

  • Clarify research goals, scope, and target products before collecting data
  • Prioritize platforms based on where your target users engage (Reddit, X, GitHub)
  • Use topic filters and a defined time range to keep data relevant
  • Use expansion searches sparingly to balance breadth with signal
  • Document methodology and cite sources in the final report for credibility

Example Use Cases

  • Example: AI coding assistants—aggregate top feature requests from Reddit, X, and GitHub to inform roadmap
  • Example: Developer tools—identify recurring pain points and unmet needs across platforms
  • Example: Competitive analysis—compare user expectations of Copilot vs. Cursor with source-backed findings
  • Example: Market demand—spot high-priority capabilities to seed upcoming releases
  • Example: Real-time demand monitoring—regularly scrape topics to track shifting user interest

