
Fix deprecated code #259

Open
gampnico wants to merge 8 commits into fmaussion:master from gampnico:fix-deprecated-code

Conversation

@gampnico

@gampnico gampnico commented Mar 13, 2026

WIP, not ready for merging. No need to run the workflow until it's marked ready for review.

  • Removes skip decorators for cartopy tests.

Checks

  • tests added / passed
  • passes git diff upstream/master | flake8 --diff
  • whatsnew entry

@fmaussion
Owner

How did you test for the cartopy bug? Still seems to fail with pytest-mpl. Seems to be a real issue though.

@gampnico
Author

Linking OGGM/oggm#1871

@gampnico
Author

gampnico commented Mar 16, 2026

> How did you test for the cartopy bug? Still seems to fail with pytest-mpl. Seems to be a real issue though.

I'm having a look at the generated images, and it seems like this happens due to different zoom levels/axis ticks/labels, rather than a difference in the results themselves (except for test_extendednorm, which matches the old baseline image, not the newest version you uploaded). How did you generate the new baseline images?

Example of a failed test image for test_colormaps - here the only perceptible differences are the additional tick marks and some slight artifacting in the gradient.

[image: result-failed-diff]

Resulting image:

[image: result]

Baseline:

[image: baseline]

@fmaussion
Owner

The extended norm test does not fail after my fix from yesterday? The two tests that fail on e.g. py314 in your PR (see GitHub workflow) are these:

=================================== FAILURES ===================================
_________________________________ test_cartopy _________________________________
Error: Image files did not match.
  RMS Value: 15.595296833767971
  Expected:  
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy/baseline.png
  Actual:    
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy/result.png
  Difference:
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy/result-failed-diff.png
  Tolerance: 
    10
______________________________ test_cartopy_polar ______________________________
Error: Image files did not match.
  RMS Value: 40.276315905463186
  Expected:  
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy_polar/baseline.png
  Actual:    
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy_polar/result.png
  Difference:
    /tmp/salem-mpl-results/salem.tests.test_graphics.test_cartopy_polar/result-failed-diff.png
  Tolerance: 
    7
=============================== warnings summary ===============================

The failing images are uploaded as part of the GitHub workflow ("Upload pytest-mpl artefacts") and the workflow also shares a link (e.g. https://github.com/fmaussion/salem/actions/runs/23135044796/artifacts/5941503514). The differences are substantial:

Baseline:

[image: baseline]

Result:

[image: result]

One difference is due to a coastline update (which is OK), but the middle-right test is failing because the points are no longer at the center of the grid cells.
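For reference, the "RMS Value" pytest-mpl reports in the log above is the root-mean-square of the per-pixel differences between baseline and result, and a test fails when it exceeds the tolerance (10 and 7 for these two tests). A minimal stdlib-only sketch of that metric (the "images" here are just flat lists of pixel values, not the actual test images):

```python
import math

def rms_diff(baseline, result):
    """Root-mean-square of per-pixel differences, as pytest-mpl reports it."""
    assert len(baseline) == len(result)
    squared = [(b - r) ** 2 for b, r in zip(baseline, result)]
    return math.sqrt(sum(squared) / len(squared))

# Identical images give RMS 0; a uniform offset of 10 gives RMS exactly 10,
# i.e. right at the tolerance used for test_cartopy.
print(rms_diff([0, 0, 0], [0, 0, 0]))              # 0.0
print(rms_diff([100, 150, 200], [110, 160, 210]))  # 10.0
```

This is why small systematic shifts (extra tick marks, moved grid points) can push the RMS past the tolerance even when the plotted data are unchanged.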

@fmaussion
Owner

yeah it's really just the grid points that are the issue, the rest is fine

@gampnico
Author

gampnico commented Mar 17, 2026

Hm, this is interesting. Additional tests, including test_extendednorm, fail locally if all of Salem's optional dependencies are installed purely via pip (rather than the hybrid installation used by the CI YAML files). It seems like mamba's solver is using numpy<2.0, pandas<3.0, and geopandas 0.14.4; a pip-only installation uses the more recent major releases.
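A quick way to see which major versions a given environment actually resolved to is to query the installed distributions with the stdlib `importlib.metadata` (a sketch; the package names match the ones discussed above):

```python
from importlib import metadata

def major_version(package):
    """Return the installed major version of a package, or None if absent."""
    try:
        return int(metadata.version(package).split(".")[0])
    except metadata.PackageNotFoundError:
        return None

# e.g. a mamba solve pinned to numpy<2.0 would report 1 here,
# while a pip-only install of current releases would report 2.
for pkg in ("numpy", "pandas", "geopandas"):
    print(pkg, major_version(pkg))
```

Running this in both the mamba/conda-forge and the pip-only environments would confirm exactly where the solver outputs diverge.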

@fmaussion
Owner

> It seems like mamba's solver is using numpy<2.0, pandas<3.0, and geopandas 0.14.4; a pip-only installation uses the more recent major releases.

yeah this is pretty much a good summary of the mess that it is 🤷 . But are you sure about this? My local mamba conda-forge install has the following versions:

print(show_versions())

OGGM environment:

System info:

python: 3.14.0.final.0
python-bits: 64
OS: Darwin
OS-release: 23.6.0
machine: x86_64
processor: i386

Packages info:

oggm: 1.6.3.dev42+g66a7c4a8a
numpy: 2.4.2
scipy: 1.17.1
pandas: 3.0.1
geopandas: 1.1.2
netCDF4: 1.7.4
matplotlib: 3.10.8
rasterio: 1.5.0
fiona: None
pyproj: 3.7.2
shapely: 2.1.2
xarray: 2026.2.0
dask: 2026.1.2
salem: 0.3.12.dev8+g84d7f7e5c

And: are you saying that the cartopy issues disappear with more recent installs? Will have to test later today.

@gampnico
Author

gampnico commented Mar 17, 2026

It may also be solver limitations around how/when certain dependencies are installed, e.g. running pip install on a single package. I'll take a look, but maybe explicitly setting a minimum version will help.

Installing an env exactly like the CI requirements file leads to the same two test failures as the workflow.

The multiple test failures only occur in cases where I install Salem's optional dependencies using pip, but this affects #260 if we move to a pyproject-based CI.
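If a pyproject-based CI is adopted, the minimum-version idea could be encoded as lower bounds on the optional dependencies. A hypothetical sketch (the floors below are illustrative only; the actual minimums would need to be verified against the failing environments):

```toml
# pyproject.toml -- hypothetical pins, not verified minimums
[project.optional-dependencies]
test = [
    "numpy>=2.0",
    "pandas>=2.0",
    "geopandas>=1.0",
]
```

Lower bounds like these would at least make the pip-only and mamba solves agree on major versions, so the two CI paths test the same thing.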

@gampnico gampnico marked this pull request as ready for review March 19, 2026 12:22