
[Add Hyperparameter Tuning and the Usage of R2 Score as a Metric for Stock Price Prediction Models] <Usage of GridSearchCV/RandomSearchCV to find optimum model hyperparameters> #93

Closed
1 task done
DataWorshipper opened this issue Oct 8, 2024 · 4 comments
Assignees
Labels
enhancement New feature or request gssoc-ext GSSoC'24 Extended Version hacktoberfest Hacktober Collaboration hacktoberfest-accepted Hacktoberfest 2024 level2 25 Points 🥈(GSSoC)

Comments

@DataWorshipper

Is this a unique feature?

  • I have checked "open" AND "closed" issues and this is not a duplicate

Is your feature request related to a problem/unavailable functionality? Please describe.

In the Stock_Price_Prediction project, the .ipynb files where model prediction happens only use base models with default hyperparameters. No one has applied hyperparameter tuning such as GridSearchCV/RandomizedSearchCV, or KerasTuner for the deep learning models. Using tuning should lead to better MSE and MAE scores.

Proposed Solution

Tune hyperparameters using GridSearchCV and Keras Tuner to optimize model performance, and report additional metrics such as the R² score and adjusted R² score for a better understanding of model quality. MSE and MAE are unbounded numbers that are hard to interpret in isolation, whereas R² is easy to read: the closer it is to 1, the better the model fits.
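As a minimal sketch of the metric proposed above (the toy `y_true`/`y_pred` arrays and the feature count `p` are placeholders, not values from the project's notebooks), R² comes straight from scikit-learn and adjusted R² can be derived from it:

```python
# Sketch: computing R^2 and adjusted R^2 for a regression model's predictions.
import numpy as np
from sklearn.metrics import r2_score

# Placeholder targets and predictions; in the project these would come
# from the stock-price test set and the fitted model.
y_true = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_pred = np.array([2.8, 5.1, 7.3, 8.7, 11.2])

r2 = r2_score(y_true, y_pred)

# Adjusted R^2 penalizes extra predictors: n = samples, p = feature count.
n, p = len(y_true), 2  # p = 2 is a hypothetical feature count
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
```

Unlike MSE, both values are directly comparable across models on the same data, and adjusted R² will not rise just because more features were added.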

Screenshots

No response

Do you want to work on this issue?

Yes

If "yes" to above, please explain how you would technically implement this (issue will not be assigned if this is skipped)

from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

ridge = Ridge()
param_grid = {'alpha': [0.001, 0.01, 0.1, 1, 10, 100, 1000]}
ridge_grid_search = GridSearchCV(ridge, param_grid, cv=5, scoring='neg_mean_squared_error')
ridge_grid_search.fit(X_train, y_train)
## This is just an example; the same can be done with Lasso regression. Tuning is especially effective with tree-based algorithms, and instead of only MSE or MAE I think it would be wise to also report the R² score on this problem.
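The tree-based case mentioned in the comment above could be sketched as follows. This assumes a generic regression dataset (`make_regression` is a stand-in for the project's actual stock features) and uses `RandomizedSearchCV` scored by R² rather than negative MSE:

```python
# Sketch: RandomizedSearchCV on a tree-based regressor, scored by R^2.
# make_regression stands in for the project's real stock-price features.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV, train_test_split

X, y = make_regression(n_samples=300, n_features=8, noise=0.1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

param_distributions = {
    'n_estimators': [50, 100, 200],
    'max_depth': [None, 5, 10],
    'min_samples_leaf': [1, 2, 4],
}
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=42),
    param_distributions,
    n_iter=5,       # sample 5 of the 27 combinations instead of all of them
    cv=3,
    scoring='r2',   # optimize R^2 instead of neg_mean_squared_error
    random_state=42,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("held-out R^2:", search.best_estimator_.score(X_test, y_test))
```

`RandomizedSearchCV` keeps the cost bounded via `n_iter`, which matters for tree ensembles where the grid grows quickly; the same `scoring='r2'` argument works with `GridSearchCV` as well.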

@DataWorshipper DataWorshipper added the enhancement New feature or request label Oct 8, 2024
Contributor

github-actions bot commented Oct 8, 2024

Ensure the issue is not similar to, or previously being worked on in, another issue. Thanks for your time.

@DataWorshipper
Author

Dear @rohitinu6,
I want to work on this feature, as I believe it will improve our model performance and lead to better hyperparameters when the models are deployed.

Thanks
Abhiraj Mandal

@DataWorshipper
Author

This issue was raised before as well, but I don't see much work done on it, and I also feel the R² score should be used instead of MSE.

@DataWorshipper DataWorshipper changed the title [Add Hyperparameter Tuning for Stock Price Prediction Models] <Usage of GridSearchCV/RandomSearchCV to find optimum model hyperparameters> [Add Hyperparameter Tuning and the Usage of R2 Score as a Metric for Stock Price Prediction Models] <Usage of GridSearchCV/RandomSearchCV to find optimum model hyperparameters> Oct 8, 2024
@rohitinu6 rohitinu6 added gssoc-ext GSSoC'24 Extended Version hacktoberfest-accepted Hacktoberfest 2024 level2 25 Points 🥈(GSSoC) hacktoberfest Hacktober Collaboration labels Oct 9, 2024
Contributor

✅ This issue has been successfully closed. Thank you for your contribution and helping us improve the project! If you have any more ideas or run into other issues, feel free to open a new one. Happy coding! 🚀

Projects
None yet
Development

No branches or pull requests

3 participants