Model performance under Gradient boosting #43
Comments
@sanjay-kv Kindly assign this issue to me; I would go through the boosting techniques for [FEATURE] Add New ML Algorithms #8
Assigned to you, @SwarnenduJUIT.
Hey brother! I am a data science and machine learning enthusiast and have worked on predictive modelling. After seeing this issue, I would like to work on it. If possible, kindly assign it to me.
Hi @sanjay-kv, I would like to work on this issue; please assign it to me.
This issue has been automatically closed because it has been inactive for more than 30 days. If you believe this is still relevant, feel free to reopen it or create a new one. Thank you!
Is your feature request related to a problem? Please describe.
Checking for the best algorithm among the various bagging and boosting techniques, such as gradient boosting, XGBoost, etc.
Describe the solution you'd like
Gradient boosted trees can be more accurate than a random forest because each new tree is trained to correct the errors of the trees before it. The aim is to decrease bias, with each stage weighted according to its contribution to performance.
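A minimal sketch of the idea with scikit-learn's `GradientBoostingClassifier`. The synthetic dataset from `make_classification` is a stand-in, since the project's actual dataset is not shown in this issue:

```python
# Minimal sketch: stage-wise error correction with gradient boosting.
# make_classification is a placeholder for the project's real dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Each new tree is fit to the residual errors of the current ensemble,
# so the model is built stage-wise to reduce bias.
gb = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.1, max_depth=3, random_state=42
)
gb.fit(X_train, y_train)
print("Gradient boosting accuracy:", accuracy_score(y_test, gb.predict(X_test)))
```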
Describe alternatives you've considered
Alternatively, I would go for XGBoost to do the same as gradient boosting.
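An equivalent sketch with XGBoost, assuming the third-party `xgboost` package is installed; the synthetic dataset is again a placeholder:

```python
# Alternative sketch using XGBoost instead of sklearn's implementation.
# Assumes `pip install xgboost`; the dataset is a synthetic placeholder.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

xgb = XGBClassifier(
    n_estimators=200, learning_rate=0.1, max_depth=3,
    eval_metric="logloss", random_state=42
)
xgb.fit(X_train, y_train)
print("XGBoost accuracy:", accuracy_score(y_test, xgb.predict(X_test)))
```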
What problem is this feature trying to solve?
We might check the dataset with boosting techniques to reach a conclusion.
How do we know when the feature is complete?
We simply compare the accuracy of the previous random forest model against the new gradient boosting model.
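One way to make that completion check concrete is a side-by-side cross-validated comparison; this sketch uses synthetic data in place of the project's dataset:

```python
# Sketch of the completion check: compare the random forest baseline
# against gradient boosting on the same cross-validation folds.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

for name, model in [
    ("Random forest", RandomForestClassifier(n_estimators=200, random_state=42)),
    ("Gradient boosting", GradientBoostingClassifier(n_estimators=200, random_state=42)),
]:
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```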