Add ASV benchmarks for 6 modules changed in v0.9.5 #1137

@brendancol

Description

Author of Proposal: @brendancol

Reason or Problem

The v0.9.4 to v0.9.5 cycle landed memory guards and lazy-reduction fixes across 9 modules (#1112, #1115, #1117, #1119, #1121, #1123, #1125, #1127, #1129, #1131). Six of those modules have no ASV benchmark, so there's no way to catch performance regressions from these changes:

| Module | PR | ASV benchmark? |
| --- | --- | --- |
| normalize | #1125 | missing |
| diffusion | #1117 | missing |
| erosion | #1121 | missing |
| balanced_allocation | #1115 | missing |
| dasymetric | #1127 | missing |
| reproject | #1131 | missing |

The CI workflow (benchmarks.yml) monitors 5 classes (Slope, Proximity, Zonal, CostDistance, Focal), none of which covers these six modules.

Proposal

Design:
Add ASV benchmark modules for the 6 uncovered modules using the existing Benchmarking base class in benchmarks/benchmarks/common.py. Each module gets a class with time_* methods calling the public API, parameterized over grid sizes and backends (numpy, cupy, dask).
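As a sketch of the pattern (the class layout follows ASV's conventions; `normalize` here is a hypothetical stand-in for the module's public API so the example is self-contained, and the real classes would extend the existing Benchmarking base in benchmarks/benchmarks/common.py):

```python
import numpy as np

def normalize(agg):
    # Stand-in for the real module function (hypothetical), so the
    # sketch runs without xarray-spatial installed. Rescales to [0, 1].
    lo, hi = agg.min(), agg.max()
    return (agg - lo) / (hi - lo)

class Normalize:
    # ASV discovers and times the time_* methods; params/param_names
    # sweep grid sizes and backends as the proposal describes.
    params = ([512, 1024], ["numpy"])
    param_names = ("size", "backend")

    def setup(self, size, backend):
        # Seeded RNG keeps the fixture identical across runs
        rng = np.random.default_rng(0)
        self.agg = rng.random((size, size))

    def time_normalize(self, size, backend):
        normalize(self.agg)
```

The `cupy` and `dask` backends would branch in `setup` to build the corresponding array type, matching how the existing benchmark classes parameterize backends.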

Where the function needs more than a single raster (e.g. dasymetric.disaggregate needs zones + source values, reproject needs a target CRS), the setup method builds the fixtures.
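For the multi-input case, `setup` might build the extra fixtures like this (the zone/value shapes and the `time_disaggregate` body are illustrative assumptions, not the verified `dasymetric.disaggregate` API):

```python
import numpy as np

class Dasymetric:
    # Sketch: setup builds the fixtures that a single-raster benchmark
    # doesn't need — a zone-label raster plus per-zone source values.
    params = ([512, 1024],)
    param_names = ("size",)

    def setup(self, size):
        rng = np.random.default_rng(1)
        # integer zone labels and one source value per zone
        self.zones = rng.integers(0, 10, size=(size, size))
        self.values = rng.random(10)

    def time_disaggregate(self, size):
        # hypothetical call shape; the real benchmark would invoke
        # dasymetric.disaggregate with these fixtures
        _ = self.values[self.zones]
```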

After adding benchmarks, rerun with asv dev to get baselines and check for regressions from the memory guard changes.

Usage:

```shell
cd benchmarks
asv dev -b Normalize
asv dev -b Diffusion
asv continuous origin/master HEAD -b "Normalize|Diffusion|Erosion|BalancedAllocation|Dasymetric|Reproject"
```

Value: Covers the modules that changed the most this release cycle and lets us catch regressions from the memory guard work.

Stakeholders and Impacts

Maintainers reviewing future PRs to these modules get benchmark coverage. No impact on existing benchmarks.

Drawbacks

Adds 6 files to the benchmark suite. They follow an established pattern, so maintenance cost is low.

Alternatives

Could also add these to the CI filter in benchmarks.yml, but that increases CI time. Manual asv dev benchmarks are a reasonable middle ground for now.

Unresolved Questions

Whether to add reproject to the CI filter. It's compute-heavy and may slow CI noticeably.

Additional Notes

Performance sweep state from 2026-03-31 records a memory (OOM) verdict for each of the 44 modules. The newly benchmarked modules include both SAFE (normalize, diffusion) and WILL OOM (erosion, balanced_allocation) verdicts, which is useful for validating the memory guards under benchmark load.
