docs-seeker

npx machina-cli add skill Vibe-Builders/claude-prime/docs-seeker --openclaw
Files (1)
SKILL.md
4.8 KB

Think harder.

Role

You are a documentation hunter. Fetch the most relevant, up-to-date docs into context using the fastest available source.

Source Priority Chain

Try sources in this order — stop at the first that works:

  Priority  Source           Speed    When to use
  1         Direct llms.txt  Fastest  Library has known official llms.txt
  2         Context7         Fast     Any library with a GitHub repo
  3         GitMCP           Fast     Any GitHub repo (URL swap)
  4         WebSearch        Slower   Last resort fallback

URL Patterns

Direct llms.txt:

{official-site}/llms.txt
{official-site}/llms-full.txt

Context7 (GitHub repos):

https://context7.com/{org}/{repo}/llms.txt
https://context7.com/{org}/{repo}/llms.txt?topic={keyword}

Context7 (websites):

https://context7.com/websites/{normalized-path}/llms.txt

GitMCP (any GitHub repo):

Replace github.com → gitmcp.io in any repo URL
https://gitmcp.io/{org}/{repo}
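
The URL patterns above can be sketched as a small helper. This is an illustrative sketch only (the function name and signature are hypothetical, not part of the skill): it emits Context7 URLs, topic-scoped first when a topic is given, then the GitMCP variant built by swapping github.com for gitmcp.io.

```python
def candidate_urls(org: str, repo: str, topic: str = "") -> list[str]:
    """Build candidate doc URLs for a GitHub-hosted library, in priority order."""
    urls = [f"https://context7.com/{org}/{repo}/llms.txt"]
    if topic:
        # A topic-scoped Context7 URL comes first when a feature is targeted
        urls.insert(0, f"https://context7.com/{org}/{repo}/llms.txt?topic={topic}")
    # GitMCP: swap github.com -> gitmcp.io in the plain repo URL
    urls.append(f"https://github.com/{org}/{repo}".replace("github.com", "gitmcp.io"))
    return urls
```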

Known Direct llms.txt Sites

  Library       URL
  Astro         https://docs.astro.build/llms.txt
  Drizzle       https://orm.drizzle.team/llms.txt
  Hono          https://hono.dev/llms.txt
  LangChain     https://python.langchain.com/llms.txt
  Next.js       https://nextjs.org/llms.txt
  Remix         https://remix.run/llms.txt
  Stripe        https://docs.stripe.com/llms.txt
  Supabase      https://supabase.com/llms.txt
  SvelteKit     https://svelte.dev/llms.txt
  Tailwind CSS  https://tailwindcss.com/llms.txt
  Vercel        https://vercel.com/llms.txt

When a site isn't listed, try {official-docs-url}/llms.txt before falling back — many sites support it.

Known Repository Mappings

  Query term          Context7 path
  next.js / nextjs    vercel/next.js
  astro               withastro/astro
  remix               remix-run/remix
  shadcn / shadcn/ui  shadcn-ui/ui
  better-auth         better-auth/better-auth
  drizzle             drizzle-team/drizzle-orm
  hono                honojs/hono
  tanstack query      TanStack/query
  tanstack router     TanStack/router
  zustand             pmndrs/zustand
  zod                 colinhacks/zod
  trpc                trpc/trpc
  prisma              prisma/prisma
  playwright          microsoft/playwright
  langchain           langchain-ai/langchain
  fastapi             fastapi/fastapi

Topic Keyword Extraction

When the query targets a specific feature, extract a topic keyword:

  • Lowercase the keyword
  • Use the root word: "date picker" → date, "caching strategies" → caching
  • Drop generic suffixes: "OAuth setup" → oauth
  • Max 20 characters

Examples:

"shadcn date picker"       → topic=date,   path=shadcn-ui/ui
"Next.js middleware"        → topic=middleware, path=vercel/next.js
"Better Auth OAuth"         → topic=oauth,  path=better-auth/better-auth
"Stripe webhooks"           → topic=webhook, path=stripe (direct site)

Process

  1. Identify the library/framework from the query
  2. Try direct llms.txt if the site is in the known list (or guess {docs-url}/llms.txt)
  3. Try Context7 with topic if the query targets a specific feature
  4. Try Context7 general if no topic or topic URL 404s
  5. Try GitMCP if the GitHub repo is known
  6. WebSearch "{library} llms.txt" as last resort
  7. Read docs with WebFetch — deploy parallel subagents for large sets

Reading Strategy

  URLs returned  Strategy
  1-3 URLs       Read directly with WebFetch
  4-7 URLs       Launch 3 parallel subagents
  8+ URLs        Launch 5-7 parallel subagents

When distributing to agents, categorize URLs:

  • Critical: Getting started, core API, main concepts
  • Important: Guides, configuration, common patterns
  • Supplementary: Advanced topics, edge cases, migration
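
The sizing table can be expressed as a small planner. This is a sketch under assumed thresholds (it picks 1, 3, or 5 subagents from the URL count and deals URLs round-robin so each subagent gets a similar share); the function name is hypothetical.

```python
def plan_subagents(urls: list[str]) -> list[list[str]]:
    """Split doc URLs into per-subagent batches per the sizing table above."""
    if len(urls) <= 3:
        n = 1          # read directly, no fan-out needed
    elif len(urls) <= 7:
        n = 3          # small fan-out
    else:
        n = 5          # large doc set
    # Round-robin so each subagent gets a similar share of URLs
    batches = [urls[i::n] for i in range(n)]
    return [b for b in batches if b]
```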

Fallback Chain

Topic URL (404?) → General URL (404?) → Direct site llms.txt (404?) → GitMCP (404?) → WebSearch
  • On 404: move to next source immediately
  • On timeout: move to next source immediately
  • On empty response: move to next source immediately
  • Never retry a failed source
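
The chain and its no-retry rule can be sketched as a single loop. Assumptions: `sources` is an ordered list of `(name, fetch)` pairs where each `fetch` is a zero-argument callable (hypothetical interface) that returns doc text, returns an empty value, or raises on 404/timeout. Each source is tried exactly once.

```python
def fetch_docs(sources):
    """Walk the fallback chain: try each source once, return the first success."""
    for name, fetch in sources:
        try:
            text = fetch()
        except Exception:
            continue  # 404, timeout, or error: move on immediately, never retry
        if text:      # empty response also falls through to the next source
            return name, text
    return None, None  # all sources failed: report it, don't fabricate docs
```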

Edge Cases

  • Version-specific docs: Search "{library} v{version} llms.txt" or try /{version}/llms.txt
  • Multi-language docs: Try llms-{lang}.txt (e.g., llms-es.txt), fall back to English
  • Framework + plugins: Focus on core first, ask user which plugins matter

Constraints

  • Use WebFetch to read URLs — not MCP servers
  • Topic detection is your job — apply judgment, no regex needed
  • Prefer llms.txt over llms-full.txt unless the user wants comprehensive docs
  • Always report which source succeeded and how many docs were fetched
  • If all sources fail, say so clearly — don't fabricate documentation

Source

git clone https://github.com/Vibe-Builders/claude-prime.git
# SKILL.md lives at .claude/skills/docs-seeker/SKILL.md

Overview

docs-seeker fetches current docs for any library, framework, or tool and injects them into the AI's context. It uses a prioritized source chain (Direct llms.txt, Context7, GitMCP, then WebSearch) to quickly locate relevant, feature-specific references.

How This Skill Works

The skill identifies the library/framework from the query, then tries sources in priority order: direct llms.txt if known, then Context7 (with a topic keyword if a specific feature is targeted), then GitMCP, and finally WebSearch as a last resort. It reads the docs it finds with WebFetch, deploying parallel subagents for large result sets and classifying URLs by importance.

When to Use It

  • Looking up API/docs for a known library or framework (e.g., Astro, LangChain)
  • Locating feature-specific references using topic keywords (e.g., a particular API or component)
  • Discovering documentation sources for any library, framework, or tool
  • Working with a GitHub repo that has a Context7 mapping
  • No direct source is available, so a WebSearch for "{library} llms.txt" is needed to locate docs

Quick Start

  1. Step 1: Identify the library/framework from the user query
  2. Step 2: Run the Source Priority Chain to locate available llms.txt or docs
  3. Step 3: Read and merge the docs into AI context with parallel fetching if needed

Best Practices

  • Identify the library/framework from the query early to pick the right doc source
  • Prefer direct llms.txt sources for speed when available
  • Leverage Context7 for GitHub repos to access topic-targeted docs
  • Extract topic keywords for targeted feature docs to improve relevance
  • Read from multiple URLs in parallel and categorize into Critical/Important/Supplementary

Example Use Cases

  • Astro docs: fetch from https://docs.astro.build/llms.txt
  • Next.js: use the Context7 path vercel/next.js or the direct https://nextjs.org/llms.txt
  • LangChain: fetch from https://python.langchain.com/llms.txt
  • Stripe: fetch from https://docs.stripe.com/llms.txt
  • Fallback: WebSearch llms.txt for a library without a known source
