
nonlinear-solvers

npx machina-cli add skill HeshamFS/materials-simulation-skills/nonlinear-solvers --openclaw
Files (1)
SKILL.md
7.1 KB

Nonlinear Solvers

Goal

Provide a universal workflow to select a nonlinear solver, configure globalization strategies, and diagnose convergence for root-finding, optimization, and least-squares problems.

Requirements

  • Python 3.8+
  • NumPy (for Jacobian diagnostics)
  • SciPy (optional, for advanced analysis)

Inputs to Gather

| Input | Description | Example |
|---|---|---|
| Problem type | Root-finding, optimization, least-squares | root-finding |
| Problem size | Number of unknowns | n = 10000 |
| Jacobian availability | Analytic, finite-diff, unavailable | analytic |
| Jacobian cost | Cheap or expensive to compute | expensive |
| Constraints | None, bounds, equality, inequality | none |
| Smoothness | Is objective/residual smooth? | yes |
| Residual history | Sequence of residual norms | 1,0.1,0.01,... |

Decision Guidance

Solver Selection Flowchart

Is Jacobian available and cheap?
├── YES → Problem size?
│   ├── Small (n < 1000) → Newton (full)
│   └── Large (n ≥ 1000) → Newton-Krylov
└── NO → Is objective smooth?
    ├── YES → Memory limited?
    │   ├── YES → L-BFGS or Broyden
    │   └── NO → BFGS
    └── NO → Anderson acceleration or Picard
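The flowchart above can be transcribed into a small helper function. This is a hypothetical sketch for illustration; `scripts/solver_selector.py` implements its own logic and output format.

```python
def select_solver(jacobian_cheap: bool, n: int, smooth: bool,
                  memory_limited: bool) -> str:
    """Follow the decision tree above and return a solver family."""
    if jacobian_cheap:
        # Jacobian available and cheap: branch on problem size
        return "newton-full" if n < 1000 else "newton-krylov"
    if smooth:
        # No cheap Jacobian, but smooth objective: quasi-Newton family
        return "l-bfgs-or-broyden" if memory_limited else "bfgs"
    # Non-smooth objective: fixed-point style methods
    return "anderson-or-picard"

print(select_solver(jacobian_cheap=False, n=10_000,
                    smooth=True, memory_limited=True))  # l-bfgs-or-broyden
```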

Quick Reference

| Problem Type | First Choice | Alternative | Globalization |
|---|---|---|---|
| Small root-finding | Newton | Broyden | Line search |
| Large root-finding | Newton-Krylov | Anderson | Trust region |
| Optimization | L-BFGS | BFGS | Wolfe line search |
| Least-squares | Levenberg-Marquardt | Gauss-Newton | Trust region |
| Bound constrained | L-BFGS-B | Trust-region reflective | Projected |
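As a concrete instance of the least-squares row, here is a NumPy-only Levenberg-Marquardt sketch on an illustrative exponential fit. The model, data, and damping schedule are assumptions made for this example, not taken from the skill's scripts.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-3.0 * t)                      # synthetic noiseless data

def residual(p):
    a, k = p
    return a * np.exp(-k * t) - y

def jacobian(p):
    a, k = p
    e = np.exp(-k * t)
    return np.column_stack([e, -a * t * e])     # d r / d(a, k)

p, lam = np.array([1.0, 1.0]), 1e-3
for _ in range(50):
    r, J = residual(p), jacobian(p)
    # LM step: (J^T J + lam I) dp = -J^T r
    dp = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.linalg.norm(residual(p + dp)) < np.linalg.norm(r):
        p, lam = p + dp, lam * 0.5              # accept step, relax damping
    else:
        lam *= 10.0                             # reject step, increase damping
print(p)  # ≈ [2.0, 3.0]
```

The damping parameter `lam` interpolates between Gauss-Newton (small `lam`) and gradient descent (large `lam`), which is why the table pairs LM with trust-region-style globalization.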

Script Outputs (JSON Fields)

| Script | Key Outputs |
|---|---|
| scripts/solver_selector.py | recommended, alternatives, notes |
| scripts/convergence_analyzer.py | converged, convergence_type, estimated_rate, diagnosis |
| scripts/jacobian_diagnostics.py | condition_number, jacobian_quality, rank_deficient |
| scripts/globalization_advisor.py | strategy, line_search_type, trust_region_type, parameters |
| scripts/residual_monitor.py | patterns_detected, alerts, recommendations |
| scripts/step_quality.py | ratio, step_quality, accept_step, trust_radius_action |

Workflow

  1. Characterize problem - Identify type, size, Jacobian availability
  2. Select solver - Run scripts/solver_selector.py
  3. Choose globalization - Run scripts/globalization_advisor.py
  4. Analyze Jacobian - If available, run scripts/jacobian_diagnostics.py
  5. Monitor residuals - During solve, use scripts/residual_monitor.py
  6. Analyze convergence - Run scripts/convergence_analyzer.py
  7. Evaluate steps - For trust region, use scripts/step_quality.py
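Steps 5-6 assume you capture a residual history during the solve. A minimal pattern looks like the loop below; the 2×2 test system is illustrative, not part of the skill.

```python
import numpy as np

# Illustrative system: x0^2 + x1 = 2 and x0 = x1, with root (1, 1)
def F(x):
    return np.array([x[0]**2 + x[1] - 2.0, x[0] - x[1]])

def J(x):
    return np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])

x = np.array([2.0, 0.0])
history = []                         # residual norms, one per iteration
for _ in range(25):
    r = F(x)
    history.append(float(np.linalg.norm(r)))
    if history[-1] < 1e-10:
        break
    x = x - np.linalg.solve(J(x), r)   # full Newton step

# The comma-separated history can be passed to --residuals
print(",".join(f"{h:.3g}" for h in history))
```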

Conversational Workflow Example

User: My Newton solver for a phase-field simulation is converging very slowly. After 50 iterations, the residual only dropped from 1 to 0.1.

Agent workflow:

  1. Analyze convergence:
    python3 scripts/convergence_analyzer.py --residuals 1,0.8,0.6,0.5,0.4,0.3,0.2,0.15,0.12,0.1 --json
    
  2. Check globalization strategy:
    python3 scripts/globalization_advisor.py --problem-type root-finding --jacobian-quality ill-conditioned --previous-failures 0 --json
    
  3. Recommend: Switch to trust region with Levenberg-Marquardt regularization, or use Newton-Krylov with better preconditioning.

Pre-Solve Checklist

  • Confirm problem type (root-finding, optimization, least-squares)
  • Assess Jacobian availability and cost
  • Check initial guess quality
  • Set appropriate tolerances
  • Choose globalization strategy
  • Prepare to monitor convergence

CLI Examples

# Select solver for large unconstrained optimization
python3 scripts/solver_selector.py --size 50000 --smooth --memory-limited --json

# Analyze convergence from residual history
python3 scripts/convergence_analyzer.py --residuals 1,0.1,0.01,0.001,0.0001 --tolerance 1e-6 --json

# Diagnose Jacobian quality
python3 scripts/jacobian_diagnostics.py --matrix jacobian.txt --json

# Get globalization recommendation
python3 scripts/globalization_advisor.py --problem-type optimization --jacobian-quality good --json

# Monitor residual patterns
python3 scripts/residual_monitor.py --residuals 1,0.8,0.9,0.7,0.75,0.6 --target-tolerance 1e-8 --json

# Evaluate step quality for trust region
python3 scripts/step_quality.py --predicted-reduction 0.5 --actual-reduction 0.4 --step-norm 0.8 --gradient-norm 1.0 --trust-radius 1.0 --json

Error Handling

| Error | Cause | Resolution |
|---|---|---|
| problem_size must be positive | Invalid size | Check problem dimension |
| constraint_type must be one of... | Unknown constraint | Use: none, bound, equality, inequality |
| residuals must be non-negative | Invalid residual data | Check residual computation |
| Matrix file not found | Invalid path | Verify Jacobian file exists |

Interpretation Guidance

Convergence Type

| Type | Meaning | Action |
|---|---|---|
| quadratic | Optimal Newton | Continue, near solution |
| superlinear | Quasi-Newton working | Monitor for stagnation |
| linear | Acceptable | May improve with preconditioner |
| sublinear | Too slow | Change method or formulation |
| stagnated | No progress | Check Jacobian, preconditioner |
| diverged | Increasing residual | Add globalization, check Jacobian |
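One common way to distinguish these types is to fit r_{k+1} ≈ C·r_k^p from the residual history and read off the rate p (1 ≈ linear, 2 ≈ quadratic). The sketch below mirrors that idea but is not the code of `convergence_analyzer.py`.

```python
import numpy as np

def estimate_rate(residuals):
    """Slope of log r_{k+1} vs log r_k: ~1 for linear, ~2 for quadratic."""
    r = np.asarray(residuals, dtype=float)
    slope, _ = np.polyfit(np.log(r[:-1]), np.log(r[1:]), 1)
    return slope

quadratic = [1e0, 1e-1, 1e-3, 1e-7]    # residual roughly squares each step
linear = [1.0, 0.5, 0.25, 0.125]       # residual halves each step
print(round(estimate_rate(quadratic), 2))  # 2.0
print(round(estimate_rate(linear), 2))     # 1.0
```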

Jacobian Quality

| Quality | Condition Number | Action |
|---|---|---|
| good | < 10⁶ | Standard Newton works |
| moderately-conditioned | 10⁶ - 10¹⁰ | Consider scaling |
| ill-conditioned | > 10¹⁰ | Use regularization |
| near-singular | — | Reformulate or use LM |
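The thresholds above map directly onto a condition-number check. This NumPy-only sketch is for orientation; `jacobian_diagnostics.py` may compute the condition number differently internally.

```python
import numpy as np

def jacobian_quality(J):
    """Classify a Jacobian using the condition-number thresholds above."""
    kappa = np.linalg.cond(J)
    if not np.isfinite(kappa):
        return "near-singular", kappa
    if kappa < 1e6:
        return "good", kappa
    if kappa < 1e10:
        return "moderately-conditioned", kappa
    return "ill-conditioned", kappa

print(jacobian_quality(np.eye(3))[0])                      # good
print(jacobian_quality(np.diag([1.0, 1e-12, 1.0]))[0])     # ill-conditioned
```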

Step Quality (Trust Region)

| Ratio ρ | Quality | Trust Radius |
|---|---|---|
| ρ < 0 | very_poor | Shrink aggressively |
| ρ < 0.25 | marginal | Shrink |
| 0.25 ≤ ρ < 0.75 | good | Maintain |
| ρ ≥ 0.75 | excellent | Expand if at boundary |
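The table corresponds to a standard trust-region update rule; the shrink/expand factors below are one common policy and may differ from the parameters used by `step_quality.py`.

```python
def update_trust_radius(predicted, actual, step_norm, radius):
    """Return (quality, accept_step, new_radius) per the ratio table above."""
    rho = actual / predicted if predicted != 0 else 0.0
    if rho < 0:
        return "very_poor", False, 0.25 * step_norm   # shrink aggressively
    if rho < 0.25:
        return "marginal", True, 0.25 * radius        # shrink
    if rho < 0.75:
        return "good", True, radius                   # maintain
    # excellent: expand only when the step reached the trust-region boundary
    new_radius = 2.0 * radius if abs(step_norm - radius) < 1e-12 else radius
    return "excellent", True, new_radius

# Matches the CLI example: predicted 0.5, actual 0.4, step 0.8, radius 1.0
print(update_trust_radius(0.5, 0.4, 0.8, 1.0))  # ('excellent', True, 1.0)
```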

Limitations

  • No global convergence guarantee: All methods may fail for pathological problems
  • Jacobian accuracy: Finite-difference Jacobian may be inaccurate near discontinuities
  • Large dense problems: May require specialized solvers not covered here
  • Constrained optimization: Complex constraints need SQP or interior point methods

References

  • references/solver_decision_tree.md - Problem-based solver selection
  • references/method_catalog.md - Method details and parameters
  • references/convergence_diagnostics.md - Diagnosing convergence issues
  • references/globalization_strategies.md - Line search and trust region

Version History

  • v1.0.0 : Initial release with 6 analysis scripts

Source

git clone https://github.com/HeshamFS/materials-simulation-skills
# SKILL.md is at skills/core-numerical/nonlinear-solvers/SKILL.md

Overview

Provide a universal workflow to select and configure nonlinear solvers for root-finding, optimization, and least-squares problems, including globalization strategies and convergence diagnostics. It covers Newton methods, quasi-Newton (BFGS, L-BFGS), Broyden, Anderson acceleration, and Jacobian quality analysis.

How This Skill Works

Assesses problem type, size, Jacobian availability/cost, and smoothness, then uses scripted guidance to choose a solver and globalization strategy. Diagnostics scripts (Jacobian diagnostics, convergence analysis, and residual monitoring) guide refinement and help diagnose convergence issues.

When to Use It

  • Small root-finding with a cheap analytic Jacobian (use Newton full).
  • Large-scale root-finding where Jacobian is available but expensive; Newton-Krylov is preferred.
  • Jacobian unavailable or expensive and the objective is smooth; consider L-BFGS or Broyden.
  • Objective is non-smooth or iterations stall; use Anderson acceleration or Picard methods.
  • Optimization or least-squares problems requiring robust globalization (line search or trust region).
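For the non-smooth case in the last bullets, a Picard (fixed-point) iteration is the simplest starting point; Anderson acceleration extrapolates over past iterates of the same map to speed it up. The contraction g(x) = cos(x) below is illustrative, not from the skill's scripts.

```python
import math

def picard(g, x0, tol=1e-10, max_iter=500):
    """Fixed-point iteration x_{k+1} = g(x_k) until successive change < tol."""
    x = x0
    for k in range(1, max_iter + 1):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k
        x = x_new
    raise RuntimeError("Picard iteration did not converge")

root, iters = picard(math.cos, 1.0)
print(f"{root:.6f} in {iters} iterations")  # root ≈ 0.739085
```

The linear rate here is |sin(root)| ≈ 0.67, illustrating why such iterations land in the "linear/sublinear" rows of the convergence table and benefit from acceleration.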

Quick Start

  1. Step 1: Characterize problem type, size, Jacobian availability/cost, smoothness, and residual history.
  2. Step 2: Run solver_selector.py to pick a solver and alternatives.
  3. Step 3: Run globalization_advisor.py; if Jacobian is available, also run jacobian_diagnostics.py; monitor with convergence_analyzer.py and residual_monitor.py.

Best Practices

  • Characterize problem type, size, Jacobian availability and cost before solver choice.
  • Prefer Newton full for small problems with cheap Jacobian; Newton-Krylov for large problems.
  • When Jacobian is unavailable but the objective is smooth, use L-BFGS or Broyden; otherwise BFGS.
  • For non-smooth objectives, consider Anderson acceleration or Picard iterations.
  • Select globalization strategy (line search vs trust region) based on stability and problem type; use Wolfe or trust-region rules as appropriate.
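For the Jacobian-unavailable-but-smooth case above, Broyden's method maintains a cheap rank-1-updated Jacobian approximation. This sketch seeds it with a one-time finite-difference Jacobian; the 2×2 system is illustrative.

```python
import numpy as np

def fd_jacobian(F, x, h=1e-7):
    """One-time finite-difference Jacobian used to seed Broyden's method."""
    Fx = F(x)
    J = np.empty((len(x), len(x)))
    for j in range(len(x)):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (F(xp) - Fx) / h
    return J

def broyden(F, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    B = fd_jacobian(F, x)                       # initial Jacobian estimate
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)             # quasi-Newton step
        x = x + s
        F_new = F(x)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # Broyden rank-1 secant update
        Fx = F_new
    return x

# Illustrative system: x^2 + y^2 = 4 intersected with x = y
root = broyden(lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] - v[1]]),
               [1.0, 2.0])
print(root)  # ≈ [1.41421356, 1.41421356]
```

Each iteration costs one residual evaluation plus a linear solve, avoiding the n extra evaluations a fresh finite-difference Jacobian would need.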

Example Use Cases

  • Phase-field simulation with slow Newton convergence; diagnose convergence and switch globalization as needed.
  • Large PDE residual minimization using Newton-Krylov with preconditioning to handle scale.
  • Optimization problem solved with L-BFGS for parameter fitting.
  • Least-squares problem approached with Levenberg–Marquardt and a trust-region globalization.
  • Non-smooth objective scenarios where Anderson acceleration or Picard iterations are advantageous.

