build: switch to uv and ruff #1

Closed
wants to merge 7 commits into from
20 changes: 0 additions & 20 deletions .github/workflows/black_formatting.yml

This file was deleted.

53 changes: 21 additions & 32 deletions .github/workflows/ci.yml
@@ -8,51 +8,40 @@ on:
    types: [opened, synchronize, reopened, edited]
  workflow_dispatch: # Allows you to run this workflow manually from the Actions tab

jobs:
-
-  Code_Quality_Check:
-
+  test:
+    name: py${{ matrix.versions.python-version }} ${{ matrix.versions.resolution }}
    runs-on: ubuntu-latest
    strategy:
      matrix:
-        python-version: ['3.10', 3.11]
+        versions:
+          - python-version: '3.10'
+            resolution: lowest-direct
+          - python-version: '3.11'
+            resolution: highest
+          - python-version: '3.12'
+            resolution: highest
    steps:
-      - name: Checkout Repository
-        uses: actions/checkout@v3
-
-      - name: Setup Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
+      - uses: actions/checkout@v4

-      - name: Setup Poetry
-        uses: abatilo/actions[email protected]
-        with:
-          poetry-version: 1.5.1
+      - name: Setup Python ${{ matrix.versions.python-version }}
+        uses: actions/setup-python@v5
+        with:
+          python-version: ${{ matrix.versions.python-version }}

-      - name: Configure Poetry Settings
-        shell: bash
-        run: python -m poetry config virtualenvs.in-project true
-
-      - name: Verify Poetry Version
-        run: poetry --version
-
-      - name: Install Project Dependencies
-        run: python -m poetry install --with dev
-
-      - name: Lint Codebase with flake8
+      - name: Install dependencies
        run: |
-          python -m poetry run flake8 . --exclude .venv --count --select=E9,F63,F7,F82 --show-source --statistics
-          python -m poetry run flake8 . --exclude .venv --count --exit-zero --max-complexity=10 --max-line-length=79 --statistics
+          pip install uv
+          uv pip install . -r pyproject.toml --system --extra dev --resolution ${{ matrix.versions.resolution }}

-      - name: Execute Tests with pytest and Coverage
+      - name: Execute Tests
        run: |
-          python -m poetry run coverage run -m pytest --doctest-glob="README.md"
-          python -m poetry run coverage report -m
-          python -m poetry run coverage xml
+          coverage run -m pytest -n auto --doctest-glob="README.md"
+          coverage report -m
+          coverage xml

      - name: Upload Coverage Report to Codecov
        uses: codecov/codecov-action@v3
        with:
          files: ./coverage.xml

27 changes: 27 additions & 0 deletions .github/workflows/formatting.yml
@@ -0,0 +1,27 @@
name: Formatting

on:
pull_request:
branches: [main, develop]
types: [opened, synchronize, reopened, edited]
workflow_dispatch: # Allows you to run this workflow manually from the Actions tab


jobs:
ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install ruff
# Update output format to enable automatic inline annotations.
- name: Run Ruff
run: |
ruff check
ruff format --check
6 changes: 3 additions & 3 deletions .github/workflows/pull-request-linting.yml
@@ -5,15 +5,15 @@
# Aim: ensure that the PR title matches the conventional commits spec
# More info: https://github.com/marketplace/actions/semantic-pull-request

-name: "Pull Request Linting"
+name: "PR Linting"

on:
  pull_request_target:
    types: [opened, edited, synchronize]

jobs:
-  PR_Validation:
-    name: Validate Pull Request Title
+  validate:
+    name: validate conventional commit in title
    runs-on: ubuntu-latest
    steps:
      - name: Run Semantic Pull Request Linting
15 changes: 6 additions & 9 deletions .pre-commit-config.yaml
@@ -7,16 +7,13 @@ repos:
      - id: conventional-pre-commit
        stages: [commit-msg]
        args: [] # optional: list of Conventional Commits types to allow
-  # Black Code Formatter
-  - repo: https://github.com/psf/black
-    rev: 23.7.0
+  # Lint and format with ruff
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: v0.3.4
    hooks:
-      - id: black
-        # It is recommended to specify the latest version of Python
-        # supported by your project here, or alternatively use
-        # pre-commit's default_language_version, see
-        # https://pre-commit.com/#top_level-default_language_version
-        language_version: python3.11
+      - id: ruff
+        args: [--fix]
+      - id: ruff-format
  # Strip output from Jupyter Notebooks
  - repo: https://github.com/kynan/nbstripout
    rev: 0.6.1
8 changes: 4 additions & 4 deletions docs/auto_examples/1single/plot_complex_eof.py
@@ -2,14 +2,14 @@
Complex/Hilbert EOF analysis
============================================

We demonstrate how to execute a Complex EOF (or Hilbert EOF) analysis [1]_ [2]_ [3]_.
This method extends traditional EOF analysis into the complex domain, allowing
the EOF components to have real and imaginary parts. This capability can reveal
oscillatory patterns in datasets, which are common in Earth observations.
For example, beyond typical examples like seasonal cycles, you can think of
internal waves in the ocean, or the Quasi-Biennial Oscillation in the atmosphere.

Using monthly sea surface temperature data from 1970 to 2021 as an example, we
highlight the method's key features and address edge effects as a common challenge.

.. [1] Rasmusson, E. M., Arkin, P. A., Chen, W.-Y. & Jalickee, J. B. Biennial variations in surface temperature over the United States as revealed by singular decomposition. Monthly Weather Review 109, 587–598 (1981).
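For readers unfamiliar with the method, the core idea — extend each time series to its analytic signal via a Hilbert transform, then decompose the complex data matrix — can be sketched with plain numpy/scipy on synthetic data. This is an illustrative sketch, not the xeofs API; all variable names and the toy propagating-wave data are invented here:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(42)

# Synthetic propagating wave: 500 time steps, 50 spatial points, plus noise
t = np.linspace(0, 20 * np.pi, 500)
x = np.linspace(0, 2 * np.pi, 50)
data = np.sin(t[:, None] - x[None, :]) + 0.1 * rng.standard_normal((500, 50))

# Analytic signal: adds an imaginary part (the Hilbert transform) to each series
analytic = hilbert(data, axis=0)

# SVD of the centered complex matrix yields complex EOFs; a propagating wave
# that needs two real EOFs collapses into a single complex mode
u, s, vh = np.linalg.svd(analytic - analytic.mean(axis=0), full_matrices=False)
explained = s**2 / np.sum(s**2)

# Spatial amplitude and phase of the leading complex EOF
amplitude = np.abs(vh[0])
phase = np.angle(vh[0])
```

The spatial phase of the leading mode increases roughly linearly here, which is the signature of the propagating signal the example text describes.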
12 changes: 6 additions & 6 deletions docs/auto_examples/1single/plot_eeof.py
@@ -2,12 +2,12 @@
Extended EOF analysis
=====================

This example demonstrates Extended EOF (EEOF) analysis on ``xarray`` tutorial
data. EEOF analysis, also termed Multivariate/Multichannel Singular
Spectrum Analysis, advances traditional EOF analysis to capture propagating
signals or oscillations in multivariate datasets. At its core, this
involves the formulation of a lagged covariance matrix that encapsulates
both spatial and temporal correlations. Subsequently, this matrix is
decomposed to yield its eigenvectors (components) and eigenvalues (explained variance).

Let's begin by setting up the required packages and fetching the data:
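The lagged-covariance construction at the heart of EEOF/MSSA amounts to a time-delay embedding followed by ordinary EOF machinery. A minimal numpy sketch (function name and shapes are illustrative, not the xeofs implementation):

```python
import numpy as np

def delay_embed(data, n_lags):
    """Stack lagged copies of a (time, space) array along the feature axis."""
    n_time, _ = data.shape
    lagged = [data[lag : n_time - n_lags + lag] for lag in range(n_lags + 1)]
    return np.concatenate(lagged, axis=1)

rng = np.random.default_rng(0)
data = rng.standard_normal((100, 5))    # 100 time steps, 5 spatial points

# Embedding with 3 lags gives shape (100 - 3, 5 * (3 + 1)) = (97, 20);
# the covariance of this matrix is the lagged covariance matrix
embedded = delay_embed(data, n_lags=3)

# Ordinary EOF decomposition of the embedded matrix yields the EEOF modes
centered = embedded - embedded.mean(axis=0)
u, s, vh = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
```

Each row of ``vh`` is a space–lag pattern, which is why EEOF modes can represent signals that move through space over time.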
1 change: 0 additions & 1 deletion docs/auto_examples/1single/plot_eof-smode.py
@@ -5,7 +5,6 @@
EOF analysis in S-mode maximises the temporal variance.
"""


# Load packages and data:
import xarray as xr
import matplotlib.pyplot as plt
1 change: 1 addition & 0 deletions docs/auto_examples/1single/plot_eof-tmode.py
@@ -6,6 +6,7 @@

Load packages and data:
"""

import xarray as xr
import matplotlib.pyplot as plt
from matplotlib.gridspec import GridSpec
19 changes: 10 additions & 9 deletions docs/auto_examples/1single/plot_gwpca.py
@@ -3,33 +3,34 @@
===========================
Geographically Weighted Principal Component Analysis (GWPCA) is a spatial analysis method that identifies and visualizes local spatial patterns and relationships in multivariate datasets across various geographic areas. It operates by applying PCA within a moving window over a geographical region, which enables the extraction of local principal components that can differ across locations.

In this demonstration, we'll apply GWPCA to a dataset detailing the chemical compositions of soils from countries around the Baltic Sea [1]_. This example is inspired by a tutorial originally crafted and published by Chris Brunsdon [2]_.
The dataset comprises 10 variables (chemical elements) and spans 768 samples.
Here, each sample refers to a pair of latitude and longitude coordinates, representing specific sampling stations.

.. [1] Reimann, C. et al. Baltic soil survey: total concentrations of major and selected trace elements in arable soils from 10 countries around the Baltic Sea. Science of The Total Environment 257, 155–170 (2000).
.. [2] https://rpubs.com/chrisbrunsdon/99675

.. note:: The dataset we're using is found in the R package
   `mvoutlier <https://cran.r-project.org/web/packages/mvoutlier/mvoutlier.pdf>`_.
   To access it, we'll employ the Python package
   `rpy2 <https://rpy2.github.io/doc/latest/html/index.html>`_ which facilitates
   interaction with R packages from within Python.

.. note:: Presently, there's no support for ``xarray.Dataset`` lacking an explicit feature dimension.
   As a workaround, ``xarray.Dataset.to_array`` can be used to convert the ``Dataset`` to a ``DataArray``.

.. warning:: Bear in mind that GWPCA requires significant computational power.
   The ``xeofs`` implementation is optimized for CPU efficiency and is best suited
   for smaller to medium data sets. For more extensive datasets where parallel processing becomes essential,
   it's advisable to turn to the R package `GWmodel <https://cran.r-project.org/web/packages/GWmodel/GWmodel.pdf>`_.
   This package harnesses CUDA to enable GPU-accelerated GWPCA for optimized performance.

Let's import the necessary packages.
"""

# For the analysis
import numpy as np
import xarray as xr
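The "PCA within a moving window" idea can be sketched as a distance-weighted PCA evaluated at one target location. Everything below — the Gaussian kernel, the bandwidth, the toy station data — is an illustrative choice, not the xeofs implementation:

```python
import numpy as np

def local_pca(coords, values, center, bandwidth):
    """Weighted PCA of `values`, with Gaussian weights by distance from `center`.

    Emphasising samples near the target location is the core of GWPCA;
    sweeping `center` over a grid yields the geographically varying components.
    """
    dist = np.linalg.norm(coords - center, axis=1)
    w = np.exp(-0.5 * (dist / bandwidth) ** 2)
    w /= w.sum()
    mean = w @ values                      # weighted mean
    centered = values - mean
    cov = (centered * w[:, None]).T @ centered  # weighted covariance matrix
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]        # sort modes by explained variance
    return evals[order], evecs[:, order]

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(200, 2))   # hypothetical station locations
values = rng.standard_normal((200, 4))       # 4 hypothetical "chemical elements"
evals, evecs = local_pca(coords, values, center=np.array([5.0, 5.0]), bandwidth=2.0)
```

The bandwidth plays the role of the moving-window size: small values make the local components very sensitive to nearby stations only.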
1 change: 0 additions & 1 deletion docs/auto_examples/1single/plot_mreof.py
@@ -5,7 +5,6 @@
Multivariate EOF analysis with additional Varimax rotation.
"""


# Load packages and data:
import xarray as xr
import matplotlib.pyplot as plt
1 change: 0 additions & 1 deletion docs/auto_examples/1single/plot_multivariate-eof.py
@@ -5,7 +5,6 @@
Multivariate EOF analysis.
"""


# Load packages and data:
import xarray as xr
import matplotlib.pyplot as plt
23 changes: 12 additions & 11 deletions docs/auto_examples/1single/plot_rotated_eof.py
@@ -2,25 +2,26 @@
Rotated EOF analysis
========================

EOF (Empirical Orthogonal Function) analysis is commonly used in climate science, interpreting
the derived eigenvectors (EOFs) as climatic variability patterns. However, due to
the inherent orthogonality constraint in EOF analysis, the interpretation of all
but the first EOF can be problematic. Rotated EOF analysis, using optimization criteria
like Varimax and Promax, offers a solution by relaxing this orthogonality constraint,
thus enabling a more accurate interpretation of variability patterns.

Both Varimax (orthogonal) and Promax (oblique) rotations result in "sparse" solutions,
meaning the EOFs become more interpretable by limiting the number of variables that
contribute to each EOF. This rotation effectively serves as a regularization method
for the EOF solution, with the strength of regularization determined by the power parameter;
the higher the value, the sparser the EOFs.

Promax rotation, with a small regularization value (i.e., power=1), reverts to Varimax
rotation. In this context, we compare the first three modes of EOF analysis: (1)
without regularization, (2) with Varimax rotation, and (3) with Promax rotation.

We'll start by loading the necessary packages and data:
"""

import xarray as xr
import matplotlib.pyplot as plt
import seaborn as sns
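For reference, the orthogonal Varimax rotation itself fits in a short numpy routine. This is a textbook sketch of the classic SVD-based algorithm under invented test data, not the xeofs code:

```python
import numpy as np

def varimax(loadings, max_iter=500, tol=1e-10):
    """Rotate a (features, modes) loading matrix to maximise the Varimax criterion.

    Returns the rotated loadings and the orthogonal rotation matrix.
    """
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD step of the classic iterative Varimax algorithm
        grad = loadings.T @ (rotated**3 - rotated * (rotated**2).sum(axis=0) / p)
        u, s, vt = np.linalg.svd(grad)
        rotation = u @ vt
        if s.sum() - criterion < tol:   # stop once the criterion plateaus
            break
        criterion = s.sum()
    return loadings @ rotation, rotation

rng = np.random.default_rng(2)
loadings = rng.standard_normal((30, 3))
rotated, rotation = varimax(loadings)
```

Because the rotation is orthogonal, the total variance of the loadings is preserved; only its distribution across modes changes, which is what makes the rotated EOFs sparser and easier to interpret.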
1 change: 1 addition & 0 deletions docs/auto_examples/1single/plot_weighted-eof.py
@@ -10,6 +10,7 @@

Load packages and data:
"""

import xarray as xr
import matplotlib.pyplot as plt
import seaborn as sns
Expand Down
8 changes: 4 additions & 4 deletions docs/auto_examples/2multi/plot_cca.py
@@ -2,10 +2,10 @@
Canonical Correlation Analysis
==============================

In this example, we're going to perform a Canonical Correlation Analysis (CCA)
on three datasets using the ERSSTv5 monthly sea surface temperature (SST) data
from 1970 to 2022. We divide this data into three areas: the Indian Ocean,
the Pacific Ocean, and the Atlantic Ocean. Our goal is to perform CCA on these
regions.

First, we'll import the necessary modules.
1 change: 0 additions & 1 deletion docs/auto_examples/2multi/plot_mca.py
@@ -5,7 +5,6 @@
Maximum Covariance Analysis (MCA) between two data sets.
"""


# Load packages and data:
import numpy as np
import xarray as xr
1 change: 0 additions & 1 deletion docs/auto_examples/2multi/plot_rotated_mca.py
@@ -5,7 +5,6 @@
Rotated Maximum Covariance Analysis (MCA) between two data sets.
"""


# Load packages and data:
import numpy as np
import xarray as xr
3 changes: 1 addition & 2 deletions docs/auto_examples/3validation/plot_bootstrap.py
@@ -6,7 +6,6 @@
for both EOFs and PCs.
"""


# Load packages and data:
import numpy as np
import xarray as xr
@@ -49,7 +48,7 @@

is_significant = q025 - q975.shift({"mode": -1}) > 0
n_significant_modes = (
-    is_significant.where(is_significant == True).cumsum(skipna=False).max().fillna(0)
+    is_significant.where(is_significant).cumsum(skipna=False).max().fillna(0)
)
print("{:} modes are significant at alpha=0.05".format(n_significant_modes.values))
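The shift-and-compare significance test above can be mirrored in plain numpy; the quantile values below are made up purely for illustration:

```python
import numpy as np

# Hypothetical 2.5% and 97.5% bootstrap quantiles of explained variance
# for six modes (invented numbers, for illustration only)
q025 = np.array([0.40, 0.20, 0.10, 0.05, 0.03, 0.02])
q975 = np.array([0.45, 0.25, 0.12, 0.06, 0.04, 0.03])

# Mode k is separated from mode k+1 when its lower bound exceeds
# the next mode's upper bound (the shift in the xarray version)
is_significant = q025[:-1] > q975[1:]

# Count the leading run of significant modes, as the masked cumsum does
n_significant = (
    int(np.argmin(is_significant)) if not is_significant.all() else is_significant.size
)
print(f"{n_significant} modes are significant at alpha=0.05")
```

With these numbers the confidence intervals of modes 4 and 5 overlap, so only the first four modes count as significant.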

13 changes: 8 additions & 5 deletions docs/perf/figure_timings.py
@@ -26,11 +26,14 @@ def __init__(self, vmin=None, vmax=None, midpoint=None, clip=False):

def __call__(self, value, clip=None):
result, is_scalar = self.process_value(value)
-        x, y = [np.log(self.vmin), np.log(self.midpoint), np.log(self.vmax)], [
-            0,
-            0.5,
-            1,
-        ]
+        x, y = (
+            [np.log(self.vmin), np.log(self.midpoint), np.log(self.vmax)],
+            [
+                0,
+                0.5,
+                1,
+            ],
+        )
return np.ma.array(np.interp(np.log(value), x, y), mask=result.mask, copy=False)
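The reformatted expression is a piecewise-linear map in log space: ``vmin``, ``midpoint``, and ``vmax`` land on 0, 0.5, and 1 respectively, so the colour-scale midpoint sits at a chosen value rather than the geometric centre. A standalone check of that behaviour (values chosen arbitrarily):

```python
import numpy as np

vmin, midpoint, vmax = 0.1, 1.0, 100.0

# Interpolation nodes: logs of the three anchors map to [0, 0.5, 1]
x = [np.log(vmin), np.log(midpoint), np.log(vmax)]
y = [0.0, 0.5, 1.0]

values = np.array([0.1, 1.0, 10.0, 100.0])
normalized = np.interp(np.log(values), x, y)
# 0.1 -> 0.0, 1.0 -> 0.5, 10.0 -> 0.75, 100.0 -> 1.0
```

Note that 10.0 maps to 0.75 rather than to the log-midpoint of [0.1, 100] — that asymmetry around ``midpoint`` is the whole point of the custom norm.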

