Releases: Techtonique/nnetsauce
v0.26.0
v0.25.3

- `get_best_model` for `Lazy*` classes (see updated docs)
- Bring `LazyMTS` back
- Add Exponential Smoothing, ARIMA, and Theta models to `ClassicalMTS` and `Lazy*MTS`
- Add `RandomForest` and `XGBoost` to `Lazy*Classifier` and `Lazy*Regressor` as baselines
- Add `MedianVotingRegressor`: uses the median of predictions from an ensemble of regressors
- Fix `DeepMTS`: use only `CustomRegressor`s
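The idea behind median voting can be sketched in a few lines of plain scikit-learn and numpy. This is an independent illustration of the concept (element-wise median over an ensemble's predictions), not the package's `MedianVotingRegressor` implementation:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

# Toy data: 150 training points, 50 test points
X, y = make_regression(n_samples=200, n_features=5, random_state=42)
X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

# Fit an ensemble of heterogeneous regressors
estimators = [Ridge(), DecisionTreeRegressor(random_state=42),
              KNeighborsRegressor()]
preds = np.column_stack([est.fit(X_train, y_train).predict(X_test)
                         for est in estimators])  # shape (50, 3)

# Median voting: element-wise median across the ensemble,
# more robust to one badly-off estimator than the mean
median_pred = np.median(preds, axis=1)
print(median_pred.shape)  # (50,)
```

The median is bounded by the ensemble's per-point minimum and maximum, which is what makes it robust to a single outlying base learner.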
v0.24.4

- Update `LazyDeepMTS`: no more `LazyMTS` class; instead, use `LazyDeepMTS` with `n_layers=1`
- Specify the forecasting horizon in `LazyDeepMTS` (see updated docs and `examples/lazy_mts_horizon.py`)
- New class `ClassicalMTS` for classical models in multivariate time series forecasting (for now, VAR and VECM adapted from statsmodels); not available in `LazyDeepMTS` yet
- `partial_fit` for `CustomClassifier` and `CustomRegressor`
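`partial_fit` follows scikit-learn's incremental-learning protocol: the model is updated in place from successive mini-batches instead of being refit from scratch. A minimal sketch of that protocol with scikit-learn's `SGDRegressor` (not the nnetsauce classes themselves):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

# Toy linear data streamed in mini-batches
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=1000)

model = SGDRegressor(random_state=0)
for epoch in range(3):                          # a few passes over the stream
    for start in range(0, 1000, 100):           # 100-sample mini-batches
        batch = slice(start, start + 100)
        model.partial_fit(X[batch], y[batch])   # updates coefficients in place

print(np.round(model.coef_, 1))
```

The key property is that each `partial_fit` call only sees the current batch, so the model can be trained on data that arrives incrementally or does not fit in memory.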
v0.23.1

- Copula simulation in classes `MTS` and `DeepMTS`:
  - Based on copulas of in-sample residuals: `vine-tll` (default), `vine-bb1`, `vine-bb6`, `vine-bb7`, `vine-bb8`, `vine-clayton`, `vine-frank`, `vine-gaussian`, `vine-gumbel`, `vine-indep`, `vine-joe`, `vine-student`
  - Sequential split conformal prediction (`scp`) + vine copula based on calibrated residuals: `scp-vine-tll`, `scp-vine-bb1`, `scp-vine-bb6`, `scp-vine-bb7`, `scp-vine-bb8`, `scp-vine-clayton`, `scp-vine-frank`, `scp-vine-gaussian`, `scp-vine-gumbel`, `scp-vine-indep`, `scp-vine-joe`, `scp-vine-student`
  - Sequential split conformal prediction (`scp2`) + vine copula based on standardized calibrated residuals: `scp2-vine-tll`, `scp2-vine-bb1`, `scp2-vine-bb6`, `scp2-vine-bb7`, `scp2-vine-bb8`, `scp2-vine-clayton`, `scp2-vine-frank`, `scp2-vine-gaussian`, `scp2-vine-gumbel`, `scp2-vine-indep`, `scp2-vine-joe`, `scp2-vine-student`
- `cross_val_score`: time series cross-validation for `MTS` and `DeepMTS`
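Time series cross-validation differs from ordinary k-fold in that training folds must always precede test folds in time. A generic sketch of that idea using plain scikit-learn's expanding-window splitter (this illustrates the principle, not the nnetsauce `cross_val_score` API):

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

# Toy "time-ordered" data
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 4))
y = X.sum(axis=1) + rng.normal(scale=0.1, size=100)

# Expanding-window splits: each test fold comes strictly after its train fold
tscv = TimeSeriesSplit(n_splits=5)
scores = cross_val_score(Ridge(), X, y, cv=tscv, scoring="r2")
print(scores.shape)  # (5,)
```

Unlike shuffled k-fold, no future observation ever leaks into a training fold, which is what makes the scores an honest estimate of out-of-sample forecasting performance.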
v0.22.7

- Implement new types of predictive simulation intervals (parameter `type_pi`) in class `MTS`: independent bootstrap, block bootstrap, and two variants of split conformal prediction:
  - `gaussian`: simple and fast, but assumes stationarity of Gaussian in-sample residuals, and independence in the multivariate case
  - `kde`: based on Kernel Density Estimation of in-sample residuals
  - `bootstrap`: based on independent bootstrap of in-sample residuals
  - `block-bootstrap`: based on basic block bootstrap of in-sample residuals
  - `scp-kde`: split conformal prediction with Kernel Density Estimation of calibrated residuals
  - `scp-bootstrap`: split conformal prediction with independent bootstrap of calibrated residuals
  - `scp-block-bootstrap`: split conformal prediction with basic block bootstrap of calibrated residuals
  - `scp2-kde`: split conformal prediction with Kernel Density Estimation of standardized calibrated residuals
  - `scp2-bootstrap`: split conformal prediction with independent bootstrap of standardized calibrated residuals
  - `scp2-block-bootstrap`: split conformal prediction with basic block bootstrap of standardized calibrated residuals
- Implement the Winkler score in `LazyMTS` and `LazyDeepMTS` for probabilistic forecasts
- Use conformalized `Estimator`s in `MTS` (see `examples/mts_conformal_not_sims.py`)
- Include `block_size` for block bootstrapping methods in `*MTS` classes
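The two building blocks above — calibrated residuals from split conformal prediction, and the Winkler score that evaluates the resulting intervals — can be sketched generically in numpy/scikit-learn. This is a textbook illustration under simplifying assumptions (i.i.d. data, symmetric intervals), not the `MTS` implementation:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Toy data, split into proper training / calibration / test sets
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=1.0, size=300)
X_tr, X_cal, X_te = X[:150], X[150:250], X[250:]
y_tr, y_cal, y_te = y[:150], y[150:250], y[250:]

# Split conformal prediction: quantile of calibrated absolute residuals
model = Ridge().fit(X_tr, y_tr)
abs_resid = np.abs(y_cal - model.predict(X_cal))
alpha = 0.1                                    # 90% nominal coverage
n_cal = len(abs_resid)
q = np.quantile(abs_resid, np.ceil((1 - alpha) * (n_cal + 1)) / n_cal)
pred = model.predict(X_te)
lower, upper = pred - q, pred + q

def winkler_score(y_true, lo, hi, alpha):
    """Mean Winkler (interval) score: width plus a penalty for misses."""
    width = hi - lo
    penalty = (2 / alpha) * (np.maximum(lo - y_true, 0.0)
                             + np.maximum(y_true - hi, 0.0))
    return float(np.mean(width + penalty))

coverage = float(np.mean((y_te >= lower) & (y_te <= upper)))
print(round(coverage, 2), round(winkler_score(y_te, lower, upper, alpha), 2))
```

Lower Winkler scores are better: narrow intervals are rewarded, but each observation falling outside its interval incurs a penalty scaled by `2 / alpha`.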
v0.20.0
v0.18.1

Prediction intervals using Bayesian inference and conformal prediction:

- Bayesian `CustomRegressor` (for scikit-learn's `BayesianRidge`, `ARDRegression`, and `GaussianProcessRegressor`)
- Conformalized `CustomRegressor` (`splitconformal` and `localconformal` for now): see the examples and notebook
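What the Bayesian estimators bring is a per-point predictive standard deviation alongside the point forecast. A minimal sketch with scikit-learn's `BayesianRidge` directly (one of the estimators listed above), using a Gaussian approximation for the interval:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Toy linear data with Gaussian noise
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([1.5, -0.5]) + rng.normal(scale=0.2, size=200)

model = BayesianRidge().fit(X[:150], y[:150])

# return_std=True yields the posterior predictive standard deviation
mean, std = model.predict(X[150:], return_std=True)

# Approximate 95% Gaussian predictive interval
lower, upper = mean - 1.96 * std, mean + 1.96 * std
coverage = float(np.mean((y[150:] >= lower) & (y[150:] <= upper)))
print(round(coverage, 2))
```

Because `BayesianRidge` is well specified for this toy data, the empirical coverage of the 95% intervals lands close to nominal.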
v0.17.2

- Attribute `estimators` (a list of `Estimator`s as strings) for `LazyClassifier`, `LazyRegressor`, `LazyDeepClassifier`, `LazyDeepRegressor`, `LazyMTS`, and `LazyDeepMTS`
- New documentation for the package, using `pdoc` (not `pdoc3`)
- Remove external regressors `xreg` at inference time for `MTS` and `DeepMTS`
- New class `Downloader`: queries the R universe API for datasets (see https://thierrymoudiki.github.io/blog/2023/12/25/python/r/misc/mlsauce/runiverse-api2 for a similar example in `mlsauce`)
- Add custom metric to `Lazy*`
- Rename deep regressors and classifiers to `Deep*` in `Lazy*`
- Add attribute `sort_by` to `Lazy*`: sorts the output data frame by a given metric
- Add attribute `classes_` to classifiers (ensures consistency with sklearn)
- Add preprocessing to all `LazyDeep*`
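The effect of a `sort_by`-style option is simply to order the model-comparison data frame by the chosen metric. A toy pandas sketch (the data frame below is invented, not actual `Lazy*` output):

```python
import pandas as pd

# Invented model-comparison table, as a Lazy*-style benchmark might produce
results = pd.DataFrame(
    {"Model": ["Ridge", "RandomForest", "KNN"],
     "RMSE": [1.2, 0.9, 1.5],
     "R2": [0.80, 0.88, 0.70]}
).set_index("Model")

# Sorting direction depends on the metric: lower RMSE is better,
# higher R2 is better
by_rmse = results.sort_values(by="RMSE")
by_r2 = results.sort_values(by="R2", ascending=False)
print(by_rmse.index[0], by_r2.index[0])  # RandomForest RandomForest
```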
v0.16.5

- Improve consistency with sklearn v1.2 for `OneHotEncoder`
- Add a robust scaler for creating layers and clustering
- Relatively faster scaling in preprocessing
- Regression-based classifiers (see https://www.researchgate.net/publication/377227280_Regression-based_machine_learning_classifiers)
- `DeepMTS` (multivariate time series forecasting with deep quasi-random layers): see https://thierrymoudiki.github.io/blog/2024/01/15/python/quasirandomizednn/forecasting/DeepMTS
- AutoML for `MTS` (multivariate time series forecasting): see https://thierrymoudiki.github.io/blog/2023/10/29/python/quasirandomizednn/MTS-LazyPredict
- AutoML for `DeepMTS` (multivariate time series forecasting): see https://github.com/Techtonique/nnetsauce/blob/master/nnetsauce/demo/thierrymoudiki_20240106_LazyDeepMTS.ipynb
- Subsample continuous and discrete responses
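The general idea behind a regression-based classifier is to one-hot encode the labels, fit a multi-output regressor, and classify by the argmax of the predicted scores. A generic scikit-learn sketch of that idea (not the package's implementation; see the linked paper for the actual method):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One-hot encode the labels: shape (n_samples, n_classes)
Y_onehot = np.eye(3)[y_tr]

# Ridge accepts 2-D targets, so one fit handles all classes at once
reg = Ridge().fit(X_tr, Y_onehot)

# Classify each test row by the column with the largest predicted score
y_pred = reg.predict(X_te).argmax(axis=1)
accuracy = float(np.mean(y_pred == y_te))
print(round(accuracy, 2))
```

This turns any regressor with multi-output support into a classifier, at the cost of optimizing squared error rather than a classification loss.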
v0.16.3

- Add a robust scaler
- Relatively faster scaling in preprocessing
- Regression-based classifiers (see https://www.researchgate.net/publication/377227280_Regression-based_machine_learning_classifiers)
- `DeepMTS` (multivariate time series forecasting with deep quasi-random layers): see https://thierrymoudiki.github.io/blog/2024/01/15/python/quasirandomizednn/forecasting/DeepMTS
- AutoML for `MTS` (multivariate time series forecasting): see https://thierrymoudiki.github.io/blog/2023/10/29/python/quasirandomizednn/MTS-LazyPredict
- AutoML for `DeepMTS` (multivariate time series forecasting): see https://github.com/Techtonique/nnetsauce/blob/master/nnetsauce/demo/thierrymoudiki_20240106_LazyDeepMTS.ipynb
- Subsample continuous and discrete responses