
competitive-analysis

npx machina-cli add skill liqiongyu/lenny_skills_plus/competitive-analysis --openclaw
Files (1): SKILL.md (7.3 KB)

Competitive Analysis

Scope

Covers

  • Mapping competitive alternatives (status quo, workarounds, analog/non-consumption, direct + indirect competitors)
  • Building a competitor landscape grounded in customer decision criteria
  • Turning analysis into actionable artifacts: positioning hypotheses, win themes, battlecards, and a monitoring plan

When to use

  • “Do a competitive analysis / competitor landscape for our product.”
  • “Why are we losing deals to <competitor>?”
  • “What are the real alternatives if we didn’t exist?”
  • “Help us differentiate and position vs competitors.”
  • “Create sales battlecards and win/loss takeaways.”

When NOT to use

  • You need market sizing / TAM/SAM/SOM as the primary output (different workflow)
  • You don’t know the target customer, core use case, or the decision this analysis should support
  • You only need a quick list of competitors (no synthesis, no artifacts)
  • You’re seeking confidential or non-public competitor information (do not attempt)

Inputs

Minimum required

  • Product + target customer segment + core use case (what job is being done)
  • The decision to support (e.g., positioning, sales enablement, roadmap bets, pricing, market entry)
  • 3–10 known competitors/alternatives (or “unknown—please map them”)
  • Any available evidence (links, win/loss notes, call transcripts, customer quotes, pricing pages, reviews)
  • Constraints: geography, ICP, price band, compliance/regulation (if relevant), time box

Missing-info strategy

  • Ask up to 5 questions from references/INTAKE.md.
  • If answers aren’t available, proceed with explicit assumptions and label unknowns. Provide 2–3 plausible alternative scopes (narrow vs broad).

Outputs (deliverables)

Produce a Competitive Analysis Pack in Markdown (in-chat; or as files if requested):

  1. Context snapshot (decision, ICP, use case, constraints, time box)
  2. Competitive alternatives map (direct/indirect/status quo/workarounds/analog)
  3. Competitor landscape table (top 5–10) with evidence links + confidence
  4. Customer decision criteria + comparison matrix (customer POV)
  5. Differentiation & positioning hypotheses (why win, why lose, proof points)
  6. Win themes + loss risks (objections, landmines, traps)
  7. Battlecards (3–5 priority competitors)
  8. Monitoring plan (signals, cadence, owners, update triggers)
  9. Risks / Open questions / Next steps (always included)

Templates: references/TEMPLATES.md
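The nine deliverables above can be sketched as a single Markdown skeleton. Section names here are illustrative; the canonical versions live in references/TEMPLATES.md:

```markdown
# Competitive Analysis Pack: <Product>

## 1. Context snapshot
Decision: ... | ICP: ... | Use case: ... | Constraints: ... | Time box: ...

## 2. Competitive alternatives map
## 3. Competitor landscape (top 5-10, with evidence links + confidence)
## 4. Decision criteria & comparison matrix (customer POV)
## 5. Differentiation & positioning hypotheses
## 6. Win themes & loss risks
## 7. Battlecards (priority competitors)
## 8. Monitoring plan (signals, cadence, owners, triggers)
## 9. Risks / Open questions / Next steps
```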

Workflow (8 steps)

1) Intake + decision framing

  • Inputs: User context; references/INTAKE.md.
  • Actions: Confirm the decision, ICP, use case, geography, and time box. Define what “good” looks like (who will use this and for what).
  • Outputs: Context snapshot.
  • Checks: A stakeholder can answer: “What decision will this analysis change?”

2) Map competitive alternatives (not just logos)

  • Inputs: Use case + customer job.
  • Actions: List what customers do instead: status quo, internal build, manual workaround, analog tools, agencies/outsourcing, and direct/indirect competitors. Identify the “true competitor” for the deal.
  • Outputs: Competitive alternatives map + short notes per alternative.
  • Checks: At least 1–2 non-obvious alternatives appear (workarounds / analog / non-consumption).
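For a hypothetical team-scheduling product, the alternatives map from this step might look like the following (every entry is illustrative, not from the source skill):

```markdown
## Competitive alternatives map
- Status quo: shared spreadsheet + email threads (free, familiar, error-prone)
- Internal build: scripts maintained by one engineer (customized, fragile)
- Analog / non-consumption: whiteboard in the office (zero setup, invisible to remote staff)
- Indirect: general project-management tools stretched to cover scheduling
- Direct: purpose-built scheduling SaaS vendors
```

Note that the two "non-obvious" entries the check asks for (analog, internal build) are what usually reveal the true competitor in a deal.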

3) Select the focus set + collect evidence (time-boxed)

  • Inputs: Alternatives map; available sources.
  • Actions: Pick 5–10 focus alternatives (by frequency/impact). Gather publicly available facts (positioning, features, pricing, distribution, target ICP) and internal learnings (win/loss, sales notes). Track confidence and unknowns.
  • Outputs: Evidence log + initial landscape table.
  • Checks: Each competitor row has at least 2 evidence points (link/quote/data) or is explicitly labeled “low confidence”.

4) Build the comparison from the customer’s perspective

  • Inputs: Focus set + evidence.
  • Actions: Define 6–10 customer decision criteria (JTBD outcomes, constraints, trust, time-to-value, switching cost, price, ecosystem fit). Compare alternatives on criteria and surface “why they win”.
  • Outputs: Decision criteria list + comparison matrix.
  • Checks: Criteria are framed as customer outcomes/risks (not internal feature checklists).
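A minimal comparison matrix for this step might look like the table below; the criteria and ratings are hypothetical and shown purely for shape:

```markdown
| Decision criterion (customer POV)    | Us     | Competitor X | Status quo |
|--------------------------------------|--------|--------------|------------|
| Time-to-value (first useful result)  | Hours  | Days         | Immediate  |
| Switching cost from current tool     | Medium | High         | None       |
| Total price at 50 seats              | $$     | $$$          | Free       |
| Trust / compliance risk              | Medium | Low          | High       |
```

Rows are outcomes and risks as the customer weighs them, not an internal feature checklist.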

5) Derive differentiation + positioning hypotheses

  • Inputs: Matrix + wins/losses.
  • Actions: Write 2–3 positioning hypotheses: (a) who we’re for, (b) the value we deliver, (c) why we’re different vs the true alternative, (d) proof points, (e) tradeoffs/non-goals.
  • Outputs: Differentiation & positioning section.
  • Checks: Each hypothesis names the competitive alternative it’s positioning against.

6) Translate into win themes + battlecards

  • Inputs: Positioning hypotheses + competitor notes.
  • Actions: Create 3–5 win themes and 3–5 loss risks. Produce battlecards for priority competitors (how to win, landmines, objection handling, traps to avoid).
  • Outputs: Win/loss section + battlecards.
  • Checks: Battlecards contain do/don’t talk tracks and are usable in a live sales call.
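A battlecard skeleton consistent with this step could look like the sketch below; this is one possible layout, and the canonical template is in references/TEMPLATES.md:

```markdown
# Battlecard: <Competitor X>

**When they show up:** deal stages / segments where X typically appears
**How to win:** 2-3 win themes, each with a proof point
**Landmines:** questions that expose X's weaknesses (only verifiable ones)
**Objection handling:** "X is cheaper" -> talk track grounded in total cost
**Traps to avoid:** claims we cannot prove; feature wars we lose
**Do / Don't:** one-line talk tracks usable on a live call
```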

7) Recommend actions (product, messaging, GTM)

  • Inputs: Findings.
  • Actions: Propose 5–10 actions: product bets, messaging changes, pricing/packaging, distribution, partnerships, and “stop doing” items. Tie each action to a win theme or loss risk.
  • Outputs: Recommendations list with rationale and owners (if known).
  • Checks: Each recommendation is specific enough to execute next week/month.

8) Monitoring + quality gate + finalize

  • Inputs: Draft pack.
  • Actions: Define monitoring signals, cadence, and update triggers. Run references/CHECKLISTS.md and score with references/RUBRIC.md. Add Risks/Open questions/Next steps.
  • Outputs: Final Competitive Analysis Pack.
  • Checks: Pack is shareable as-is; assumptions and confidence levels are explicit.
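The monitoring plan from step 8 can be captured as a simple table; the signals, owners, and cadences below are examples, not prescriptions:

```markdown
| Signal                    | Source                        | Cadence   | Owner     | Update trigger                     |
|---------------------------|-------------------------------|-----------|-----------|------------------------------------|
| Pricing/packaging changes | Competitor pricing pages      | Monthly   | PMM       | Any change: refresh battlecard     |
| Win/loss mentions         | CRM loss reasons, call notes  | Bi-weekly | Sales ops | 3+ mentions: review win themes     |
| Product launches          | Changelogs, release notes     | Monthly   | PM        | Major launch: re-score matrix      |
```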

Quality gate (required)

Before delivering, run references/CHECKLISTS.md and score the pack with references/RUBRIC.md. Do not ship without explicit assumptions, confidence labels, and the Risks / Open questions / Next steps section.

Examples

Example 1 (B2B SaaS): “We keep losing deals to Competitor X. Build a competitive alternatives map and a battlecard for X.”
Expected: alternatives map (incl. status quo), decision criteria, X battlecard, win themes/loss risks, and a monitoring plan.

Example 2 (Consumer subscription): “We’re repositioning for a new segment. Analyze alternatives and propose 2 positioning hypotheses.”
Expected: comparison matrix by customer criteria and two clear positioning options with proof points and tradeoffs.

Boundary example: “List every competitor in our industry worldwide.”
Response: narrow the scope (ICP, geography, category) and propose a focused set plus a monitoring plan; otherwise the output becomes a low-signal directory of logos.

Source

https://github.com/liqiongyu/lenny_skills_plus/blob/main/skills/competitive-analysis/SKILL.md

Overview

Generates a Competitive Analysis Pack that includes a competitive alternatives map, a grounded competitor landscape, differentiation and positioning hypotheses, battlecards, and a monitoring plan. This helps teams research competition, align on customer decision criteria, and turn insights into actionable artifacts for sales, product, and strategy.

How This Skill Works

The process starts with an intake step to lock in the decision, ICP, and use case, then maps competitive alternatives (status quo, workarounds, analogs, direct/indirect competitors). It collects evidence, narrows to a focused set of competitors, formulates differentiation hypotheses and win themes, creates battlecards, and ends with a monitoring plan for ongoing tracking.

When to Use It

  • When mapping the competitive landscape for a new product
  • When investigating why deals are lost to a specific competitor
  • When evaluating the real alternatives customers would choose if your product did not exist
  • When developing differentiation and positioning messaging for sales
  • When creating battlecards and a monitoring plan for ongoing sales enablement

Quick Start

  1. Step 1: Define decision, ICP, use case, geography, and time box
  2. Step 2: Map competitive alternatives including status quo, workarounds, analogs, and direct/indirect competitors
  3. Step 3: Compile the Competitive Analysis Pack with landscape, differentiation hypotheses, battlecards, and monitoring plan

Best Practices

  • Start with a clear decision, ICP, and core use case
  • Map more than logos: include status quo, workarounds, analogs, and indirect competitors
  • Base conclusions on verifiable evidence (pricing pages, transcripts, quotes, reviews)
  • Focus on 5–10 core alternatives; avoid overloading with too many
  • Align artifacts to the intended use (positioning, win themes, battlecards, monitoring cadence)

Example Use Cases

  • SaaS CRM: competitive landscape vs Salesforce/HubSpot with pricing and messaging gaps
  • Project management: differentiation from Asana, Jira, and Trello focusing on automation
  • Cybersecurity: comparing incumbents and non-consumption alternatives to highlight gaps
  • Analytics/BI: mapping against Tableau and Power BI with decision criteria
  • Marketing automation: battlecards for enterprise deals against Marketo and HubSpot
