ComfyUI
npx machina-cli add skill G1Joshi/Agent-Skills/comfyui --openclaw
ComfyUI is a node-based GUI for Stable Diffusion that gives you fine-grained control over every stage of the generation pipeline. The 2025 UI overhaul makes it considerably more accessible.
When to Use
- Complex Workflows: Hires.fix, Inpainting, ControlNet, and Upscaling in one pipeline.
- Video: AnimateDiff and video-to-video workflows.
- Optimization: Runs on lower VRAM than WebUI (A1111).
Core Concepts
Nodes & Noodles
Connect "Load Checkpoint" -> "CLIP Text Encode" -> "KSampler" -> "VAE Decode" -> "Save Image".
Workflow Format
Workflows are saved as PNG metadata. Drag a PNG into ComfyUI to load its workflow.
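ComfyUI embeds the workflow graph as JSON in the PNG's text metadata (under a "workflow" tEXt keyword in current builds; verify against your install). A minimal stdlib-only sketch of reading it back out, assuming that keyword:

```python
import json
import struct
import zlib

def read_png_text_chunks(data: bytes) -> dict:
    """Parse tEXt chunks from raw PNG bytes into a {keyword: text} dict."""
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    chunks, pos = {}, 8
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        body = data[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            # tEXt body layout: keyword, NUL separator, latin-1 text
            keyword, _, text = body.partition(b"\x00")
            chunks[keyword.decode("latin-1")] = text.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
        if ctype == b"IEND":
            break
    return chunks

def load_workflow(png_path: str) -> dict:
    """Return the workflow graph ComfyUI embedded in a saved PNG."""
    with open(png_path, "rb") as f:
        chunks = read_png_text_chunks(f.read())
    return json.loads(chunks["workflow"])
```

Dragging the PNG into the ComfyUI canvas does the same extraction for you; this is only useful for scripting or auditing saved workflows.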
Custom Nodes
Community plugins (thousands available via ComfyUI Manager).
Best Practices (2025)
Do:
- Use ComfyUI Manager: Essential for installing missing custom nodes.
- Use XY Plot: Test parameter combinations (e.g. CFG vs. Steps) in a grid.
- Save Workflows: Organize your JSON/PNG workflows.
Don't:
- Don't fear the spaghetti: it looks complex, but the logic is linear.
References
Source
git clone https://github.com/G1Joshi/Agent-Skills
(skill file: skills/ai-ml/comfyui/SKILL.md)
Overview
ComfyUI is a node-based GUI for Stable Diffusion that gives you granular control over every stage of image generation. It supports complex pipelines including hires.fix, inpainting, ControlNet, and upscaling, as well as video workflows, and the 2025 UI overhaul makes it easier to use.
How This Skill Works
Build pipelines by wiring nodes such as Load Checkpoint, CLIP Text Encode, KSampler, VAE Decode, and Save Image. Workflows are saved as PNG metadata and can be loaded by dragging a PNG into ComfyUI. The ecosystem also supports thousands of community plugins via the ComfyUI Manager.
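The same wiring can be expressed in ComfyUI's API-format JSON: each key is a node id, each value names a node class and its inputs, and an input like `["4", 0]` means "output 0 of node 4". A sketch of the five-node pipeline above, using ComfyUI's built-in node class names (the checkpoint filename and prompt text are placeholders):

```python
# API-format graph for: Load Checkpoint -> CLIP Text Encode -> KSampler
# -> VAE Decode -> Save Image. Node ids are arbitrary string keys.
workflow = {
    "4": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd_xl_base_1.0.safetensors"}},
    "5": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 1024, "height": 1024, "batch_size": 1}},
    "6": {"class_type": "CLIPTextEncode",  # positive prompt
          "inputs": {"text": "a mountain lake at dawn", "clip": ["4", 1]}},
    "7": {"class_type": "CLIPTextEncode",  # negative prompt
          "inputs": {"text": "blurry, low quality", "clip": ["4", 1]}},
    "3": {"class_type": "KSampler",
          "inputs": {"model": ["4", 0], "positive": ["6", 0],
                     "negative": ["7", 0], "latent_image": ["5", 0],
                     "seed": 42, "steps": 20, "cfg": 7.0,
                     "sampler_name": "euler", "scheduler": "normal",
                     "denoise": 1.0}},
    "8": {"class_type": "VAEDecode",
          "inputs": {"samples": ["3", 0], "vae": ["4", 2]}},
    "9": {"class_type": "SaveImage",
          "inputs": {"images": ["8", 0], "filename_prefix": "ComfyUI"}},
}
```

You can export this exact format from the UI via "Save (API Format)"; exact input names can vary between ComfyUI versions, so treat the graph above as a shape reference rather than a drop-in file.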
When to Use It
- Create complex pipelines that combine hires.fix, inpainting, ControlNet, and upscaling in a single run.
- Build video workflows with AnimateDiff and video-to-video for animated results.
- Run on GPUs with limited memory, since ComfyUI needs less VRAM than WebUI (A1111).
- Experiment with parameters by using XY Plot to sweep CFG and Steps in a grid.
- Save and reuse pipelines by embedding workflows in PNG metadata and loading them by dragging PNGs.
Quick Start
- Step 1: Install ComfyUI and the ComfyUI Manager to access custom nodes.
- Step 2: Build a simple pipeline (Load Checkpoint -> CLIP Text Encode -> KSampler -> VAE Decode -> Save Image) or drag a PNG to load an existing workflow.
- Step 3: Run, tune parameters, and save the workflow as a PNG to reuse later.
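Runs can also be queued headlessly: a running ComfyUI server exposes a /prompt HTTP endpoint (default port 8188) that accepts an API-format workflow. A minimal stdlib sketch, assuming a local server and a workflow dict exported via "Save (API Format)":

```python
import json
import urllib.request

def build_payload(workflow: dict) -> bytes:
    """Wrap an API-format workflow graph in the JSON body /prompt expects."""
    return json.dumps({"prompt": workflow}).encode("utf-8")

def queue_prompt(workflow: dict, host: str = "127.0.0.1:8188") -> dict:
    """POST a workflow to a running ComfyUI server; returns the queue response."""
    req = urllib.request.Request(
        f"http://{host}/prompt",
        data=build_payload(workflow),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (requires a running server):
#   with open("workflow_api.json") as f:       # hypothetical exported file
#       print(queue_prompt(json.load(f)))      # response includes a prompt id
```

The response identifies the queued job; fetching finished images goes through separate endpoints (e.g. history/view), which are left out of this sketch.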
Best Practices
- Use ComfyUI Manager to install missing custom nodes.
- Use XY Plot to test different parameters (CFG vs Steps) in a grid.
- Save workflows to organize your JSON/PNG pipelines.
- Don’t fear the spaghetti; the underlying logic is linear once mapped.
- Leverage community plugins to extend capabilities.
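Conceptually, an XY Plot sweep just enumerates the parameter grid and renders one image per cell; the custom node does this inside the graph and tiles the results. A sketch of the enumeration (values are illustrative):

```python
from itertools import product

# One render per (cfg, steps) cell; the XY Plot node then tiles the
# results into a single comparison image.
cfg_values = [4.0, 7.0, 10.0]
step_values = [15, 25, 35]

grid = [{"cfg": cfg, "steps": steps}
        for cfg, steps in product(cfg_values, step_values)]
# 3 x 3 = 9 parameter combinations
```

Keeping the sweep small (3x3 or 4x4) is usually enough to spot the sweet spot before committing to longer runs.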
Example Use Cases
- HiRes image generation by chaining hires.fix, inpainting, and upscaling in one pipeline.
- Video-to-video or AnimateDiff workflows to create animated content.
- ControlNet-guided generation to preserve structure from a sketch or map.
- Run on low-VRAM hardware that struggles with WebUI.
- Drag a saved PNG to quickly load and reuse a complex workflow.