Changes:
- add support for the QR solver, a preview of GLM.jl version 2.
- add support for the `dropcollinear` keyword argument, following GLM.jl version 1.9. Setting it to `true` improves solving problems where the model matrix is not full rank.
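A rough sketch of the new keyword, assuming it is forwarded through `rlm` like the other fitting options (the data, formula and estimator below are made up for illustration):

```julia
using RobustModels, StatsModels

# Toy data with an exactly collinear column (x2 = 2 * x1).
x1 = randn(100)
data = (x1 = x1, x2 = 2 .* x1, y = 1 .+ x1 .+ 0.1 .* randn(100))

# With dropcollinear=true, the rank-deficient model matrix is accepted
# instead of raising an error.
m = rlm(@formula(y ~ x1 + x2), data, MMEstimator{TukeyLoss}();
        initial_scale=:L1, dropcollinear=true)
```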
Dependencies and docs:
- minimal Julia version bumped to 1.6
- use JuliaFormatter.jl
- add typos checks
Tests:
- improve tests output
Breaking changes:
- remove the `TableRegressionModel` wrapper, following GLM.jl [#31]
Other changes:
- Add loss functions (`CatoniNarrowLoss`, `CatoniWideLoss`, `HardThresholdLoss`, `HampelLoss`)
- Add the `wobs` function to use instead of `nobs`; it takes the weights into account. `nobs` returns an `Int`: the number of non-zero weights, or `length(response(m))` without weights (see the sketch after this list).
- Improve parameter changes with `refit!`
- Improve weights (`wts`) usage
- `RidgePred`: correct various functions (`dof`, `stderror`, ...)
- `PredCG`: improve performance
- Add `GLM.DensePredQR`
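A minimal sketch of the `nobs`/`wobs` distinction, assuming `rlm` accepts a vector of observation weights through the `wts` keyword (the data, weights and estimator are made up for illustration):

```julia
using RobustModels, StatsModels, StatsBase

x = randn(50)
data = (x = x, y = 2 .* x .+ randn(50))
wts = rand(50) .+ 0.5    # strictly positive observation weights (illustrative)

m = rlm(@formula(y ~ x), data, MMEstimator{TukeyLoss}();
        initial_scale=:L1, wts=wts)

nobs(m)   # Int: number of non-zero weights (length(response(m)) when unweighted)
wobs(m)   # weighted number of observations, taking `wts` into account
```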
Dependencies and docs:
- Improve loss functions documentation
- Reformat code, create new files (tools.jl, losses.jl, regularizedpred.jl)
- Update dependencies compat versions (StatsBase-v0.34, StatsModels-v0.7)
- Add dependencies (Missings-v1, StatsAPI-v1.3, Tables-v1)
Tests:
- More systematic tests
- Add exact Ridge test
- Add weights test
Bugfixes:
- Fix missing type leading to StackOverflow [#17]
- Fix infinite loop [#33]
- Update dependencies compat versions (Roots)
- Export the `hasintercept` function
- Correct `nulldeviance` and `nullloglikelihood` for models without intercept (JuliaStats/StatsAPI.jl#14)
- Update dependencies compat versions (Tulip)
- Add dependencies compat versions
- Register package
- Minimal compatibility set to Julia 1.3 (because of Tulip.jl >= 0.8)
- Correctly handle multidimensional arrays with univariate robust functions.
- Correct code formatting.
- Drop the heavy `JuMP` dependency and use `Tulip` with the unstable internal API instead.
- Add univariate robust functions: `mean`, `std`, `var`, `sem`, `mean_and_std`, `mean_and_var` and `mean_and_sem`.
- Small bug corrections.
- BREAKING: Implement the loss functions as subtypes of `LossFunction` and the estimators as subtypes of `AbstractEstimator`. The `kind` keyword argument is not used anymore; instead use `rlm(form, data, MMEstimator{TukeyLoss}(); initial_scale=:L1)` (see the sketch after this list).
- Implement robust Ridge regression using the keyword argument `ridgeλ` (and `ridgeG` and `βprior` for a more general penalty).
- Add documentation.
- τ-Estimator
- New estimator function: optimal Yohai-Zamar estimator
- Resampling algorithm to find the global minimum of S- and τ-estimators.
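A minimal sketch of the estimator-based API and of the `ridgeλ` keyword described above; only the call forms are taken from these entries, the data and formula are made up:

```julia
using RobustModels, StatsModels

x = randn(200)
data = (x = x, y = 1 .+ 2 .* x .+ randn(200))

# New API: pass an estimator object instead of the former `kind` keyword.
m = rlm(@formula(y ~ x), data, MMEstimator{TukeyLoss}(); initial_scale=:L1)

# Robust Ridge regression: penalize the coefficients with the `ridgeλ` keyword
# (`ridgeG` and `βprior` allow a more general penalty).
m_ridge = rlm(@formula(y ~ x), data, MMEstimator{TukeyLoss}();
              initial_scale=:L1, ridgeλ=1.0)
```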
First public release:
- M-Estimator
- S-Estimator
- MM-Estimator
- M-Quantile (Expectile, etc.)
- Quantile regression using an interior-point method