HK-101 Living RAG
@Metatronsdoob369
npx machina-cli add skill @Metatronsdoob369/hk101-living-rag --openclaw
SKILL.md
claw-rag
Simple RAG over local text/markdown.
Inputs
- query (string): question to answer.
- docsPath (string, optional): folder of docs (default ./docs relative to CWD).
- k (number, optional): number of top matches (default 3).
Output
- answer: synthesized answer from matches.
- matches: [{path, score, snippet}...]
Requires: OPENAI_API_KEY in env.
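The input/output contract above can be illustrated with a small sketch. The example values and the `is_valid_output` helper below are hypothetical, not part of the skill's API; only the field names come from the spec.

```python
# Hypothetical illustration of the documented output shape; the answer,
# path, score, and snippet values here are made up.
example_output = {
    "answer": "Refunds are issued within 14 days of purchase.",
    "matches": [
        {"path": "docs/policy.md", "score": 0.91,
         "snippet": "Refunds are issued within 14 days..."},
    ],
}

def is_valid_output(out: dict) -> bool:
    """Check that an output dict matches the documented shape."""
    if not isinstance(out.get("answer"), str):
        return False
    return all(
        isinstance(m.get("path"), str)
        and isinstance(m.get("score"), (int, float))
        and isinstance(m.get("snippet"), str)
        for m in out.get("matches", [])
    )
```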
Overview
HK-101 Living RAG answers questions by retrieving and synthesizing information from local text or Markdown files using a retrieval-augmented generation (RAG) approach. It searches a local docsPath (default ./docs relative to CWD) and uses the top-k matches (default 3), enabling privacy-preserving Q&A over your own docs.
How This Skill Works
The tool searches the local docsPath for the query, retrieves the top-k matches, and feeds the snippets to an LLM to synthesize a concise answer. The output includes an answer and a list of matches with path, score, and snippet. It requires OPENAI_API_KEY to be set in the environment.
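The retrieval step can be sketched as follows. This is a minimal illustration assuming a naive keyword-overlap scorer; the skill's actual ranking method is not documented here, and the function name is illustrative.

```python
# Sketch of "search docsPath, return top-k matches" with a naive
# keyword-overlap score. Real implementations typically use embeddings.
import pathlib

def top_k_matches(query: str, docs_path: str = "./docs", k: int = 3):
    terms = set(query.lower().split())
    matches = []
    for path in pathlib.Path(docs_path).rglob("*"):
        if path.suffix not in {".md", ".txt"} or not path.is_file():
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        words = text.lower().split()
        if not words or not terms:
            continue
        # Score: fraction of query terms that appear in the document.
        score = sum(1 for t in terms if t in words) / len(terms)
        if score > 0:
            matches.append({"path": str(path), "score": score,
                            "snippet": text[:200]})
    # Keep only the k best-scoring documents.
    return sorted(matches, key=lambda m: m["score"], reverse=True)[:k]
```

The returned snippets would then be passed to the LLM as context for answer synthesis.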
When to Use It
- You have internal manuals or knowledge bases stored as Markdown or text in a local folder.
- You need to answer questions while keeping data on-premises to protect privacy.
- You want the LLM to be limited to information from your own docs (no external web data).
- You require quick Q&A for product or policy documentation without building a separate index.
- You need to provide synthesized answers with source snippets for audits or compliance.
Quick Start
- Step 1: Place your Markdown or text docs under ./docs (or your preferred docsPath).
- Step 2: Set your inputs: query, optional docsPath, and k (default 3); ensure OPENAI_API_KEY is in the environment.
- Step 3: Run the query to receive a synthesized answer and a list of matching snippets with their sources.
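Before running, the setup in Steps 1 and 2 can be checked with a small preflight script. This is a sketch; `preflight` and its messages are illustrative names, not part of the skill's API.

```python
# Preflight check for the Quick Start inputs: the docs folder must exist,
# k must be a positive count, and OPENAI_API_KEY must be set.
import os
import pathlib

def preflight(docs_path: str = "./docs", k: int = 3) -> list[str]:
    """Return a list of problems; an empty list means ready to run."""
    problems = []
    if not pathlib.Path(docs_path).is_dir():
        problems.append(f"docsPath not found: {docs_path}")
    if k < 1:
        problems.append("k must be at least 1")
    if not os.environ.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set")
    return problems
```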
Best Practices
- Organize docs in ./docs with clear structure and consistent headings to improve matching.
- Start with a small k (e.g., 3) and adjust up or down based on result quality and latency.
- Keep the docs focused on high-value questions to reduce noise in matches.
- Ensure OPENAI_API_KEY is securely set in the environment and monitor usage/costs.
- Validate snippets in matches to verify answer accuracy before user-facing deployment.
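The snippet-validation practice above can be automated with a simple grounding check: confirm each returned snippet actually occurs in the file it cites. The match field names follow the documented output shape; the helper name is illustrative.

```python
# Verify that every match's snippet appears verbatim in its source file,
# guarding against hallucinated or stale snippets before deployment.
import pathlib

def snippets_are_grounded(matches: list[dict]) -> bool:
    """True if every snippet appears verbatim in its cited file."""
    for m in matches:
        text = pathlib.Path(m["path"]).read_text(encoding="utf-8",
                                                 errors="ignore")
        if m["snippet"] not in text:
            return False
    return True
```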
Example Use Cases
- Answering internal product manual questions from the local docs folder.
- Retrieving API usage details and examples from codebase markdown files.
- Summarizing quarterly reports stored as local documentation for quick briefs.
- Generating customer-support FAQs from a knowledge base of Markdown files.
- Onboarding new employees with policies and procedures drawn from internal docs.