CI: Use pytest '--override-ini' to override 'addopts' setting in the "Benchmarks" workflow #3620

Merged
seisman merged 5 commits into main from ci/benchmarks on Nov 15, 2024

Conversation

seisman (Member) commented Nov 15, 2024

Description of proposed changes

In the benchmarks workflow, we don't care whether the generated images are correct, so we skip the image comparisons. As a result, we see many warnings (first noticed in #2923 but ignored) like:

  pygmt/tests/test_basemap.py::test_basemap
    /home/runner/micromamba/envs/pygmt/lib/python3.12/site-packages/_pytest/python.py:163: PytestReturnNotNoneWarning: Expected None, but pygmt/tests/test_basemap.py::test_basemap returned <pygmt.figure.Figure object at 0x42dd3630>, which will be an error in a future version of pytest.  Did you mean to use `assert` instead of `return`?
      warnings.warn(

I think the warnings mean that the "pytest-mpl" plugin is not activated.
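
For context, PyGMT's image-comparison tests follow the pytest-mpl pattern of returning the figure to be compared. A simplified sketch of the pattern (the test body below is illustrative, not the actual test):

    import pytest
    import pygmt

    @pytest.mark.mpl_image_compare
    def test_basemap():
        fig = pygmt.Figure()
        fig.basemap(region=[0, 10, 0, 10], projection="X10c", frame=True)
        # pytest-mpl consumes the returned figure for image comparison.
        # If the plugin is not active, pytest instead sees a test returning
        # a non-None value and emits PytestReturnNotNoneWarning.
        return fig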

However, with the latest pytest-codspeed v3.0.0, the Benchmarks workflow suddenly broke (first reported in #2933 (comment)). The workflow now fails with errors like:

  =================================== FAILURES ===================================
  _________________________________ test_basemap _________________________________
  Image file not found for comparison test in: 
  	/home/runner/work/pygmt/pygmt/pygmt/tests/baseline
  (This is expected for new tests.)
  Generated Image: 
  	/home/runner/work/pygmt/pygmt/results/pygmt.tests.test_basemap.test_basemap/result.png

These errors mean that the pytest-mpl plugin is now activated.

This PR fixes the issue by using pytest's --override-ini option to override the addopts setting, so that the --mpl option is not added automatically.
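
For reference, a minimal sketch of the resulting pytest invocation (the benchmark flags and test path here are assumptions, not the workflow's literal command; the key part is overriding addopts with an empty value so --mpl is never injected from the project configuration):

    # Hypothetical benchmark invocation: override the ini-file "addopts"
    # value with an empty string so "--mpl" is not appended automatically.
    python -m pytest --override-ini 'addopts=' --codspeed pygmt/tests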

With this PR, the remaining warnings are:

  =============================== warnings summary ===============================
  ../../../micromamba/envs/pygmt/lib/python3.12/site-packages/numpy/_core/getlimits.py:548
    /home/runner/micromamba/envs/pygmt/lib/python3.12/site-packages/numpy/_core/getlimits.py:548: UserWarning: Signature b'\x00\xd0\xcc\xcc\xcc\xcc\xcc\xcc\xfb\xbf\x00\x00\x00\x00\x00\x00' for <class 'numpy.longdouble'> does not match any known type: falling back to type probe function.
    This warnings indicates broken support for the dtype!
      machar = _get_machar(dtype)
  pygmt/tests/test_clib_put_matrix.py::test_put_matrix_grid
    <frozen importlib._bootstrap>:488: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility. Expected 16 from C header, got 96 from PyObject
  pygmt/tests/test_geopandas.py::test_geopandas_info_geodataframe
  pygmt/tests/test_geopandas.py::test_geopandas_info_geodataframe
    /home/runner/micromamba/envs/pygmt/lib/python3.12/site-packages/pyogrio/geopandas.py:662: UserWarning: 'crs' was not provided.  The output dataset will not have projection information defined and may not be usable in other systems.
      write(

seisman added the run/benchmark (Trigger the benchmark workflow in PRs) label on Nov 15, 2024
seisman added this to the 0.14.0 milestone on Nov 15, 2024

codspeed-hq bot commented Nov 15, 2024

CodSpeed Performance Report

Merging #3620 will not alter performance

Comparing ci/benchmarks (e4dcb62) with main (66b22dc)

🎉 Hooray! pytest-codspeed just leveled up to 3.0.0!

A heads-up, this is a breaking change and it might affect your current performance baseline a bit. But here's the exciting part - it's packed with new, cool features and promises improved result stability 🥳!
Curious about what's new? Visit our releases page to delve into all the awesome details about this new version.

Summary

✅ 100 untouched benchmarks

🆕 4 new benchmarks
⁉️ 1 dropped benchmark

⚠️ Please fix the performance issues or acknowledge them on CodSpeed.

Benchmarks breakdown

Benchmark                                                               main      ci/benchmarks   Change
🆕 test_virtualfile_from_vectors_one_string_or_object_column[pyarrow]   N/A       8.4 ms          N/A
⁉️ test_text_multiple_lines_of_text                                     14.1 ms   N/A             N/A
🆕 test_text_multiple_lines_of_text[list]                               N/A       14.1 ms         N/A
🆕 test_text_multiple_lines_of_text[numpy]                              N/A       14.1 ms         N/A
🆕 test_text_multiple_lines_of_text[pyarrow]                            N/A       14.2 ms         N/A

seisman added the maintenance (Boring but important stuff for the core devs), needs review (This PR has higher priority and needs review.), and skip-changelog (Skip adding Pull Request to changelog) labels on Nov 15, 2024
seisman marked this pull request as ready for review on November 15, 2024 07:18
seisman removed the needs review and run/benchmark labels on Nov 15, 2024
seisman merged commit 7b5819c into main on Nov 15, 2024
8 of 10 checks passed
seisman deleted the ci/benchmarks branch on November 15, 2024 08:01