fastai
fastai is a layered API on top of PyTorch. It popularized Transfer Learning and strong training defaults such as the One Cycle Policy.
When to Use
- Learning DL: The best course/library for beginners ("Practical Deep Learning for Coders").
- Quick Baselines: Get state-of-the-art results in 5 lines of code.
Core Concepts
Defaults
fastai provides a learning rate finder and picks sensible defaults for you, such as the AdamW optimizer and data augmentations.
Layered API
You can use the high-level Learner or peel back layers to raw PyTorch.
Best Practices (2025)
Do:
- Watch the Course: Jeremy Howard's course updates annually and is world-class.
- Use nbdev: fastai's literate programming environment is powerful.
Don't:
- Don't get stuck: If you need something very custom, drop down to PyTorch.
References
Source
https://github.com/G1Joshi/Agent-Skills/blob/main/skills/ai-ml/fastai/SKILL.md
Overview
fastai is a layered API on top of PyTorch designed for practical deep learning. It popularizes Transfer Learning and practical defaults such as the One Cycle Policy to help you build effective models quickly.
How This Skill Works
fastai automatically selects sensible defaults (LR finder, AdamW, augmentations) to streamline training. It offers a high-level Learner interface for rapid prototyping and a pathway to lower-level PyTorch for deep customization.
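To make the One Cycle Policy concrete, here is a simplified pure-Python sketch of the learning-rate shape behind fit_one_cycle: a cosine ramp up for the first part of training, then a cosine anneal down. The default values (pct_start=0.25, div=25, div_final=1e5) mirror fastai's, but this is an illustration only; the real implementation also schedules momentum.

```python
import math

def one_cycle_lr(step, total_steps, lr_max, pct_start=0.25, div=25.0, div_final=1e5):
    """Simplified one-cycle schedule: cosine ramp from lr_max/div up to lr_max,
    then cosine anneal down to lr_max/div_final."""
    warm_steps = int(total_steps * pct_start)
    if step < warm_steps:
        # warmup phase: rise from lr_max/div toward lr_max
        p = step / max(1, warm_steps)
        lr_start = lr_max / div
        return lr_start + (lr_max - lr_start) * (1 - math.cos(math.pi * p)) / 2
    # annealing phase: decay from lr_max down to lr_max/div_final
    p = (step - warm_steps) / max(1, total_steps - warm_steps)
    lr_end = lr_max / div_final
    return lr_end + (lr_max - lr_end) * (1 + math.cos(math.pi * p)) / 2

# Learning rate peaks 25% of the way through training, then anneals near zero
schedule = [one_cycle_lr(s, 100, 1e-2) for s in range(101)]
```

The single rise-and-fall cycle lets training use a much higher peak learning rate than a flat schedule would tolerate, which is a large part of why fastai's defaults train quickly.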
When to Use It
- Learning DL: ideal for beginners working through the Practical Deep Learning for Coders course.
- Quick Baselines: achieve state-of-the-art results in just a few lines of code.
- Transfer Learning: easily fine-tune pre-trained models on new datasets.
- Sensible Defaults: rely on the LR finder, AdamW, and built-in augmentations to accelerate experiments.
- Customization: peel back to raw PyTorch or mix in custom blocks when you need deeper control.
Quick Start
- Step 1: Install and import: pip install fastai; from fastai.vision.all import *
- Step 2: Prepare data: dls = ImageDataLoaders.from_folder(path, valid_pct=0.2)
- Step 3: Create a learner and train: learn = vision_learner(dls, resnet34, metrics=accuracy); learn.fit_one_cycle(5) (vision_learner replaces the deprecated cnn_learner in fastai 2.7+)
Best Practices
- Watch the Course: Jeremy Howard's Practical Deep Learning for Coders updates annually.
- Use nbdev: fastai's literate programming environment speeds reproducible experiments.
- Trust defaults for quick results: LR finder, AdamW, and augmentations.
- Start with the high-level Learner; drop to PyTorch for heavy customization.
- Keep your workflow iterative: combine notebooks and experiments with nbdev.
Example Use Cases
- Train a transfer-learned image classifier on a new dataset with minimal code.
- Create a baseline model in five lines of code for a new task.
- Fine-tune a pre-trained model on a different domain using the One Cycle Policy.
- Peek under the hood: switch from Learner to raw PyTorch to customize a training loop.
- Document experiments with nbdev to share reproducible results.
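As a sketch of dropping from Learner to a hand-written loop, here is a minimal raw-PyTorch training loop. The model, toy data, and hyperparameters are illustrative stand-ins for a real DataLoader and network; the point is the structure fastai's Learner otherwise manages for you.

```python
import torch
from torch import nn

torch.manual_seed(0)

# Toy regression data: learn y = 2x + 1 (stand-in for a real DataLoader)
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

model = nn.Linear(1, 1)                                # stand-in for a real network
opt = torch.optim.AdamW(model.parameters(), lr=0.1)    # fastai's default optimizer
loss_fn = nn.MSELoss()

loss_history = []
for epoch in range(100):
    opt.zero_grad()                # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # forward pass
    loss.backward()                # backward pass
    opt.step()                     # update parameters
    loss_history.append(loss.item())
```

Once a loop like this works, pieces of it can be moved back into fastai via callbacks, keeping the parts of Learner (scheduling, metrics, progress) you still want.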