I am using RayDP with Spark, and I use this package with Ray Tune for hyperparameter optimization with the LightGBM regressor. Unless I'm missing something, there's no way to use LightGBM's native cross-validation as in Ray's examples; supporting it would be a huge help to model accuracy when training large models.
Alan-Penkar changed the title from "Support for CrossValidation" to "Support for CrossValidation: Enhancement Request" on Feb 6, 2023.
I am not sure whether there's a built-in way of doing that in other distributed LightGBM implementations. I think the cv function calls train underneath; as a workaround, you should be able to replace that with lightgbm_ray's train function.
Thanks for the quick response. I don't actually see lightgbm.train being called in the cv function, but I will keep looking through the nested calls. The bigger question is how to make the _make_n_folds function within lightgbm.cv compatible with the RayDMatrix objects used by lightgbm_ray. I haven't seen any methods on RayDMatrix suggesting that this kind of data manipulation would be straightforward, but if I've missed something I'd certainly appreciate a pointer in the right direction.
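One possible workaround, sketched under the assumption that RayDMatrix itself cannot be sliced: perform the fold split on the raw pandas DataFrame *before* constructing any RayDMatrix, mimicking what lightgbm.cv's internal _make_n_folds does, and then run lightgbm_ray's train once per fold. The helper name `make_n_folds` and the column name `target` below are illustrative, not part of any library API.

```python
import numpy as np
import pandas as pd

def make_n_folds(df: pd.DataFrame, nfold: int, seed: int = 0):
    """Yield (train_df, valid_df) pairs, roughly mimicking the
    random fold split that lightgbm.cv performs internally."""
    rng = np.random.default_rng(seed)
    # Shuffle positions, then cut into nfold roughly equal chunks.
    idx = rng.permutation(len(df))
    folds = np.array_split(idx, nfold)
    for i in range(nfold):
        valid_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(nfold) if j != i])
        yield df.iloc[train_idx], df.iloc[valid_idx]

# Per fold, one would then build the RayDMatrix objects and train,
# e.g. (hypothetical sketch of how this could plug into lightgbm_ray):
#
#   from lightgbm_ray import RayDMatrix, RayParams, train
#   for train_df, valid_df in make_n_folds(df, nfold=5):
#       dtrain = RayDMatrix(train_df, label="target")
#       dvalid = RayDMatrix(valid_df, label="target")
#       train(params, dtrain, evals=[(dvalid, "valid")],
#             ray_params=RayParams(num_actors=2))
```

Collecting the per-fold validation metrics afterwards would give you the same kind of averaged CV score that lightgbm.cv reports, at the cost of holding k copies of the split in the driver before distribution.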