
fact-check

npx machina-cli add skill damionrashford/RivalSearch-Plugin/fact-check --openclaw

Fact Check

Verify the following claim: $ARGUMENTS

Follow these steps precisely using RivalSearchMCP tools. Report progress after each step.

Step 1: Claim Decomposition

Parse the claim into individually verifiable components. List each sub-claim and what evidence would confirm or refute it.

Step 2: Primary Source Search

Use web_search to find the origin of the claim:

  • query: "$ARGUMENTS"
  • num_results: 15
  • extract_content: true

Identify where this claim first appeared. Use content_operations to retrieve the original source:

  • operation: "retrieve", url: <source_url>, extraction_method: "markdown"
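
The Step 2 parameters above can be sketched as the argument payloads an MCP client would send. `call_tool` is a hypothetical stand-in for your client's invocation method, and the URL is a placeholder for the discovered `<source_url>`; only the argument names and values come from the step itself.

```python
claim = "Example claim text"  # stands in for $ARGUMENTS

# Arguments for the web_search call, mirroring the bullets above.
search_args = {
    "query": claim,
    "num_results": 15,
    "extract_content": True,
}

# Arguments for the content_operations retrieval of the original source.
retrieve_args = {
    "operation": "retrieve",
    "url": "https://example.com/original-article",  # placeholder for <source_url>
    "extraction_method": "markdown",
}

def call_tool(name, arguments):
    # Hypothetical: forward to your MCP session, e.g. session.call_tool(...)
    return {"tool": name, "arguments": arguments}

primary_hits = call_tool("web_search", search_args)
original = call_tool("content_operations", retrieve_args)
```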

Step 3: Corroboration Search

Search for independent confirmation:

  • web_search with query: "$ARGUMENTS", num_results: 10
  • web_search with alternative phrasing of the claim, num_results: 10
  • news_aggregation with query: "$ARGUMENTS", max_results: 10, time_range: "month"

Count how many independent sources confirm the claim.
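"Independent" is best measured by distinct publishers rather than distinct URLs. A minimal sketch of that counting step, using the hostname as a rough proxy for publisher identity (a simplification: it collapses subdomains of one outlet but does not recognize multi-domain publishers):

```python
from urllib.parse import urlparse

def count_independent_sources(confirming_urls):
    """Count distinct publishers among URLs that confirm the claim.

    Uses the hostname (minus a leading 'www.') as a rough proxy for
    publisher identity; two articles on the same site count once.
    """
    domains = set()
    for url in confirming_urls:
        host = urlparse(url).hostname or ""
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return len(domains)

urls = [
    "https://www.example-news.com/story-a",
    "https://example-news.com/story-b",   # same outlet, counted once
    "https://journal.example.org/paper",
]
print(count_independent_sources(urls))  # → 2
```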

Step 4: Counter-Evidence Search

Actively search for contradicting evidence:

  • web_search with query: "$ARGUMENTS false OR debunked OR incorrect OR misleading", num_results: 10
  • social_search with query: "$ARGUMENTS", platforms: ["reddit", "hackernews"], max_results_per_platform: 10

Look for rebuttals, corrections, retractions, or alternative explanations.
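The counter-evidence query in Step 4 can be built mechanically from the claim. A small sketch; the rebuttal keyword list mirrors the query above but is illustrative, not fixed:

```python
def counter_evidence_query(claim):
    """Append OR-joined rebuttal keywords to the claim, matching the
    query pattern in Step 4. Extend the keyword list as needed."""
    keywords = ["false", "debunked", "incorrect", "misleading"]
    return f"{claim} {' OR '.join(keywords)}"

print(counter_evidence_query("Policy Y will save $1B annually"))
# → Policy Y will save $1B annually false OR debunked OR incorrect OR misleading
```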

Step 5: Academic Verification

If the claim involves data, statistics, or technical facts:

  • scientific_research with operation: "academic_search", query: "$ARGUMENTS", max_results: 5, sources: ["semantic_scholar", "arxiv"]

Check if peer-reviewed research supports or contradicts the claim.
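Step 5 is conditional on the claim "involving data, statistics, or technical facts." One rough, purely illustrative heuristic for that trigger is to look for digits, percent signs, or common measurement words; a real deployment would tune or replace this pattern:

```python
import re

# Illustrative trigger for "involves data, statistics, or technical facts":
# fire the academic search when the claim contains numbers, percentages,
# or common measurement/study vocabulary.
DATA_PATTERN = re.compile(
    r"\d|%|\b(percent|rate|trial|study|average|median|increase|decrease)\b",
    re.IGNORECASE,
)

def needs_academic_check(claim):
    return bool(DATA_PATTERN.search(claim))

print(needs_academic_check("Fertilizer A boosts crop yield by 25%"))  # → True
print(needs_academic_check("The minister visited the capital"))       # → False
```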

Step 6: Deep Source Analysis

For the 2-3 most authoritative sources (for and against), use content_operations:

  • operation: "retrieve", url: <source_url>, extraction_method: "markdown"
  • operation: "analyze", content: <retrieved>, analysis_type: "general", extract_key_points: true

Read and assess the quality of each source.

Step 7: Compile Verdict

  1. Claim Under Review — Quote the exact claim
  2. Verdict — One of: Verified / Likely True / Unverified / Disputed / Likely False / False
  3. Confidence Score — High / Medium-High / Medium / Medium-Low / Low
  4. Evidence For — Sources supporting the claim with inline citations
  5. Evidence Against — Sources contradicting the claim with inline citations
  6. Primary Source Analysis — What the original source actually says
  7. Context & Nuance — Important context that affects interpretation
  8. Component Verdicts — If multiple sub-claims, verdict on each
  9. Sources — Complete list of all URLs consulted

Use clean markdown. Every factual statement must cite its source in [Source Name](URL) format.
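One hedged way to think about mapping the evidence tallies to the verdict scale above is a simple lookup over confirming and contradicting source counts. The thresholds here are invented for illustration; a real verdict should also weigh source quality and what the primary source says, not counts alone:

```python
def draft_verdict(confirming, contradicting):
    """Map counts of independent confirming / contradicting sources to a
    draft verdict on the Step 7 scale. Thresholds are illustrative only."""
    if confirming >= 3 and contradicting == 0:
        return "Verified", "High"
    if confirming >= 2 and contradicting == 0:
        return "Likely True", "Medium-High"
    if confirming and contradicting:
        return "Disputed", "Medium"
    if contradicting >= 3 and confirming == 0:
        return "False", "High"
    if contradicting and not confirming:
        return "Likely False", "Medium-High"
    return "Unverified", "Low"

print(draft_verdict(4, 0))  # → ('Verified', 'High')
print(draft_verdict(1, 2))  # → ('Disputed', 'Medium')
```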

Source

git clone https://github.com/damionrashford/RivalSearch-Plugin
View on GitHub: https://github.com/damionrashford/RivalSearch-Plugin/blob/main/skills/fact-check/SKILL.md

Overview

Fact-check verifies claims by cross-referencing web, news, academic, and social sources. It then produces a confidence-scored verdict with a full evidence chain to help validate statements before sharing or acting on them.

How This Skill Works

The process begins with decomposing the claim into verifiable sub-claims. RivalSearchMCP tools are used to locate primary sources, retrieve and analyze content, and gather corroborating and counter-evidence. Finally, a structured verdict with context, sources, and a confidence score is produced.

When to Use It

  • Verifying factual statements in journalism or content creation before publication
  • Debunking or confirming political claims or policy proposals
  • Validating statistics and data in reports, briefs, or research
  • Fact-checking viral posts or rumors on social media
  • Assessing technical or scientific claims that require sources and evidence

Quick Start

  1. Decompose the claim into individually verifiable sub-claims and note what evidence would confirm or refute each.
  2. Use web_search to locate the claim's origin, then retrieve the original source with content_operations.
  3. Gather corroboration and counter-evidence, then issue a structured verdict with a confidence score, context, and sources.

Best Practices

  • Decompose the claim into sub-claims and specify what evidence would confirm or refute each
  • Prioritize primary sources and origin documents to establish provenance
  • Seek independent corroboration from multiple, diverse sources
  • Actively search for counter-evidence, corrections, or retractions
  • Document evidence with clear citations and report a transparent confidence level

Example Use Cases

  • Claim: A new drug X cures Condition Y in mice. Fact-checking reveals no replicated human trials; verdict: Unverified.
  • Claim: App Z reduces smartphone screen time by 60% in a week. Marketing study lacks independent replication; verdict: Disputed.
  • Claim: Policy Y will save $1B annually. Government report source verified; verdict: Verified.
  • Claim: 5G causes illness. Widespread social-media posts debunked by multiple health authorities; verdict: False.
  • Claim: Fertilizer A boosts crop yield by 25% in field trials. Peer-reviewed study supports; verdict: Verified.

