uv-pytest-unit-testing

npx machina-cli add skill gaelic-ghost/python-skills/uv-pytest-unit-testing --openclaw

Uv Pytest Unit Testing
Overview
Use this skill to standardize pytest setup and execution for uv-managed Python repositories, including single-project repos and uv workspaces.
Workflow
- Detect repository mode.
  - Treat repo as workspace when `pyproject.toml` defines `[tool.uv.workspace]`.
  - Treat repo as single project otherwise.
- Bootstrap pytest dependencies and baseline config.
  - Run `scripts/bootstrap_pytest_uv.sh --workspace-root <repo>`.
  - Add `--package <member-name>` for workspace member package setup.
  - Add `--with-cov` when `pytest-cov` should be installed and baseline coverage flags added.
  - Add `--dry-run` to preview all actions without mutating files.
- Run tests with uv.
  - Run `scripts/run_pytest_uv.sh --workspace-root <repo>` for root-project execution.
  - Run `scripts/run_pytest_uv.sh --workspace-root <repo> --package <member-name>` for workspace member execution.
  - Pass through pytest selectors/options after `--`, for example:
    - `scripts/run_pytest_uv.sh --workspace-root <repo> -- --maxfail=1 -q`
    - `scripts/run_pytest_uv.sh --workspace-root <repo> --package api --path tests/unit -- -k auth -m "not slow"`
- Apply balanced quality gates.
  - Require passing test runs before concluding work.
  - Recommend coverage reporting as guidance, not a hard threshold, unless the user explicitly requests an enforced minimum coverage.
- Troubleshoot failures in this order.
  - Confirm command context: root vs. `--package` run target.
  - Confirm test discovery layout: `tests/`, `test_*.py`, `*_test.py`.
  - Confirm marker registration in `tool.pytest.ini_options.markers` when custom markers are used.
  - Confirm import path assumptions (package install mode, working directory, and module names).
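The marker-registration check above corresponds to a `pyproject.toml` fragment like the following (a minimal sketch; the exact baseline that `scripts/bootstrap_pytest_uv.sh` appends may differ, and the `slow` marker is illustrative):

```toml
[tool.pytest.ini_options]
testpaths = ["tests"]
markers = [
    "slow: marks tests as slow (deselect with -m 'not slow')",
]
```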
Test Authoring Guidance
- Keep fast unit tests under `tests/unit` and integration-heavy tests under `tests/integration` when repo size warrants separation.
- Use fixtures for setup reuse, and keep fixture scope minimal (`function` by default).
- Use `@pytest.mark.parametrize` for matrix-style cases instead of hand-written loops.
- Use `monkeypatch` for environment variables and runtime dependency replacement.
- Register custom marks in config to avoid marker warnings.
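The guidance above can be sketched in one small test module (a sketch under stated assumptions: `normalize` is a hypothetical function under test, and the fixture/parameter names are illustrative):

```python
# Sketch: parametrize for matrix cases, a function-scoped fixture,
# and monkeypatch for environment isolation.
import os

import pytest

def normalize(token: str) -> str:
    """Toy function under test (hypothetical)."""
    return token.strip().lower()

@pytest.fixture  # function scope by default, per the guidance above
def api_env(monkeypatch):
    monkeypatch.setenv("API_TOKEN", "test-token")
    return os.environ["API_TOKEN"]

@pytest.mark.parametrize(
    ("raw", "expected"),
    [("  Foo ", "foo"), ("BAR", "bar"), ("baz", "baz")],
)
def test_normalize(raw, expected):
    assert normalize(raw) == expected

def test_env_is_isolated(api_env):
    # monkeypatch undoes the setenv after the test, so other tests are unaffected.
    assert api_env == "test-token"
```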
Automation Suitability
- Codex App automation: High. Strong recurring fit for test health checks, failure triage, and drift detection.
- Codex CLI automation: High. Strong fit for non-interactive test setup and targeted test sweeps.
Codex App Automation Prompt Template
Use $uv-pytest-unit-testing.
Scope boundaries:
- Work only inside <REPO_PATH>.
- Operate only on pytest setup and test execution tasks.
- Do not perform unrelated code refactors.
Task:
1. Detect repository mode from <WORKSPACE_ROOT>/pyproject.toml.
2. If <DRY_RUN_BOOTSTRAP:TRUE|FALSE> is TRUE, run:
`scripts/bootstrap_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <WITH_COV_FLAG> --dry-run`
3. If <DRY_RUN_BOOTSTRAP:TRUE|FALSE> is FALSE, run:
`scripts/bootstrap_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <WITH_COV_FLAG>`
4. Run tests with:
`scripts/run_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <TEST_PATH_FLAG> -- <PYTEST_ARGS>`
5. Keep package-targeted runs explicit when <PACKAGE_NAME_OR_EMPTY> is set.
Output contract:
1. STATUS: PASS or FAIL
2. SETUP: what bootstrap actions ran
3. TEST_RESULTS: concise pass/fail summary
4. FAILURES: grouped likely causes
5. NEXT_STEPS: minimal remediation actions
Codex CLI Automation Prompt Template
codex exec --full-auto --sandbox workspace-write --cd "<REPO_PATH>" "<PROMPT_BODY>"
<PROMPT_BODY> template:
Use $uv-pytest-unit-testing.
Limit scope to pytest setup and execution in <WORKSPACE_ROOT>.
Run bootstrap in dry-run or real mode based on <DRY_RUN_BOOTSTRAP:TRUE|FALSE>.
Run tests with explicit package targeting when <PACKAGE_NAME_OR_EMPTY> is set.
Return STATUS, setup actions, concise test summary, grouped likely causes for failures, and minimal next steps.
Customization Placeholders
- `<REPO_PATH>`
- `<WORKSPACE_ROOT>`
- `<PACKAGE_NAME_OR_EMPTY>`
- `<PACKAGE_FLAG>`
- `<TEST_PATH_OR_EMPTY>`
- `<TEST_PATH_FLAG>`
- `<PYTEST_ARGS>`
- `<WITH_COV_FLAG:--with-cov|EMPTY>`
- `<DRY_RUN_BOOTSTRAP:TRUE|FALSE>`
Interactive Customization Workflow
- Ask whether users want bootstrap mode or run mode.
- Gather workspace root and optional package target.
- For bootstrap mode, gather `with_cov` and `dry_run`.
- For run mode, gather optional test path and optional pytest args.
- Return both:
- A YAML profile for durable reuse.
- The exact command to run.
- Use this precedence order:
  - CLI flags
  - `--config` profile file
  - `.codex/profiles/uv-pytest-unit-testing/customization.yaml`
  - `~/.config/gaelic-ghost/python-skills/uv-pytest-unit-testing/customization.yaml`
  - Script defaults
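The precedence chain above amounts to a first-writer-wins merge, where a lower layer fills only the keys every higher layer left unset. A minimal sketch (the function name and dict shapes are illustrative, not the skill's actual implementation):

```python
# Sketch: merge settings layers in precedence order.
# Earlier arguments win; later ones only fill gaps.
def resolve_settings(cli, config_file, repo_profile, user_profile, defaults):
    result = {}
    for layer in (cli, config_file, repo_profile, user_profile, defaults):
        for key, value in layer.items():
            result.setdefault(key, value)
    return result
```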
- If users want temporary reset behavior:
  - `--bypassing-all-profiles`
  - `--bypassing-repo-profile`
  - `--deleting-repo-profile`
- If users provide no customization or profile files, keep existing script defaults unchanged.
- See `references/interactive-customization.md` for schema and examples.
References
- Use `references/pytest-workflow.md` for pytest conventions, config keys, fixtures, markers, and troubleshooting.
- Use `references/uv-workspace-testing.md` for uv workspace execution patterns (`uv run`, `uv run --package`, package-targeted test runs).
Resources
scripts/
- `scripts/bootstrap_pytest_uv.sh`: Install pytest dev dependencies and append baseline `tool.pytest.ini_options` when missing.
- `scripts/run_pytest_uv.sh`: Run pytest via uv for the root project or a workspace member package, with passthrough args.
references/
- `references/pytest-workflow.md`: Practical pytest setup and usage guidance.
- `references/uv-workspace-testing.md`: uv execution guidance for single-project and workspace repositories.
Source
https://github.com/gaelic-ghost/python-skills/blob/main/uv-pytest-unit-testing/SKILL.md

Overview
This skill standardizes pytest setup and execution for uv-managed Python repositories, including both single-project repos and uv workspaces. It covers configuring pytest in pyproject.toml, bootstrapping pytest dev dependencies, running tests at root or per-package, using layered YAML profiles, and organizing tests with fixtures, markers, and parametrize.
How This Skill Works
The skill detects repo mode by inspecting pyproject.toml for [tool.uv.workspace]. It bootstraps pytest dependencies and a baseline config via scripts/bootstrap_pytest_uv.sh, optionally targeting a workspace member with --package and enabling coverage with --with-cov. Tests are executed through scripts/run_pytest_uv.sh, with pytest arguments forwarded after -- to support selectors and options.
When to Use It
- Set up and update pytest configuration in pyproject.toml for an uv repository.
- Bootstrap pytest dev dependencies and baseline config for a root or workspace.
- Run tests at the repository root or for a specific workspace member using uv run.
- Customize pytest workflow defaults using layered YAML profiles.
- Troubleshoot test discovery issues, import failures, and marker registrations.
Quick Start
- Step 1: Detect repo mode from <REPO_PATH>/pyproject.toml.
- Step 2: Bootstrap pytest dependencies, choosing either a dry run or an actual run:
  - If DRY_RUN_BOOTSTRAP is TRUE: `scripts/bootstrap_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <WITH_COV_FLAG> --dry-run`
  - If DRY_RUN_BOOTSTRAP is FALSE: `scripts/bootstrap_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <WITH_COV_FLAG>`
- Step 3: Run tests: `scripts/run_pytest_uv.sh --workspace-root <WORKSPACE_ROOT> <PACKAGE_FLAG> <TEST_PATH_FLAG> -- <PYTEST_ARGS>`
Best Practices
- Keep fast unit tests under tests/unit to speed feedback.
- Use fixtures for setup reuse with minimal scope.
- Leverage @pytest.mark.parametrize for matrix tests instead of loops.
- Register custom marks in the pytest config to avoid warnings.
- Use monkeypatch to isolate environment variables and dependencies.
Example Use Cases
- Bootstrap pytest for a workspace root with coverage: `scripts/bootstrap_pytest_uv.sh --workspace-root <repo> --with-cov`
- Run all tests from the repo root: `scripts/run_pytest_uv.sh --workspace-root <repo>`
- Run tests for a specific member package: `scripts/run_pytest_uv.sh --workspace-root <repo> --package api`
- Pass through pytest selectors, e.g., `-- -k auth -m 'not slow'`
- Troubleshoot by verifying test layout and marker registration in `tool.pytest.ini_options.markers`.