Classification using logistic regression, implemented as a neural network model. The project also compares model performance under different regularization techniques.
Updated Jan 20, 2022 · Jupyter Notebook
Machine Learning and Data Mining Projects (2022-2023)
This work attempts to generalize a stock-forecasting neural network using Bayesian regularization, so that predictions can be made without an overfitted model in today's highly volatile market.
Functional Group Bridge for Simultaneous Regression and Support Estimation (https://arxiv.org/abs/2006.10163)
Regularized maximum likelihood estimation for discrete choice models on the LPMC dataset
Implementation of a multilayer perceptron.
Constructing a linear regression model to help predict the happiness score across 155 countries.
Code for the paper "Module-based regularization improves Gaussian graphical models when observing noisy data"
Analyzing car accident fatalities to pave the way for preventative measures and safer transportation using Statistical and Machine Learning algorithms
Digital Image Reconstruction
In linear regression, regularization is a process of making the model more regular, i.e. simpler, by shrinking the model coefficients toward zero (or exactly to zero), ultimately to address overfitting.
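A minimal NumPy sketch of this shrinkage effect, using ridge (L2) regularization; the toy data and penalty strength here are illustrative, not from any of the listed projects:

```python
import numpy as np

# Toy data with two nearly collinear features -- the classic setting where
# ordinary least squares produces large, unstable coefficients.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 1))
X = np.hstack([x, x + 0.01 * rng.normal(size=(50, 1))])
y = 3.0 * x[:, 0] + 0.1 * rng.normal(size=50)

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate: w = (X'X + lam*I)^-1 X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_ols = ridge_fit(X, y, lam=0.0)    # no shrinkage: plain least squares
w_ridge = ridge_fit(X, y, lam=1.0)  # coefficients pulled toward zero

print("OLS coefficient norm:  ", np.linalg.norm(w_ols))
print("Ridge coefficient norm:", np.linalg.norm(w_ridge))
```

For any positive penalty, the ridge solution has a strictly smaller coefficient norm than the unregularized one, which is exactly the "shrinking toward zero" described above.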
A school bootcamp for hands-on learning of machine learning.
This project focuses on developing and training supervised learning models for prediction and classification tasks, covering linear and logistic regression (using NumPy & scikit-learn), neural networks (with TensorFlow) for binary and multi-class classification, and decision trees along with ensemble methods such as random forests and boosted trees.
Conducted advanced research on unsupervised image segmentation using Lipschitz regularity constraints
Activities include Python basics, linear and logistic regression, cross-validation, tree-based methods, SVMs, deep learning, survival analysis, unsupervised learning, and multiple testing.
Repository with some implementations of algorithms used in numerical analysis, from the solution of determined and overdetermined systems to regularization and nonlinear least-squares problems.
This repository contains a Python implementation of linear regression, logistic regression, and ridge regression algorithms. These algorithms are commonly used in machine learning and statistical modeling for various tasks such as predicting numerical values, classifying data into categories, and handling multicollinearity in regression models.
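As a rough illustration of how an L2 penalty enters a logistic-regression implementation of the kind described above, here is a NumPy-only gradient-descent sketch; the data, learning rate, and penalty strength are made up for the example and are not taken from the repository:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logreg_fit(X, y, lam=0.1, lr=0.1, n_iter=2000):
    """L2-regularized logistic regression via batch gradient descent."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        # Gradient of the mean log-loss plus the L2 penalty term lam*w,
        # which continually shrinks the weights toward zero.
        grad = X.T @ (p - y) / n + lam * w
        w -= lr * grad
    return w

# Toy linearly separable data: class 1 when the feature sum is positive.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X.sum(axis=1) > 0).astype(float)

w = logreg_fit(X, y)
acc = np.mean((sigmoid(X @ w) > 0.5) == y)
print("training accuracy:", acc)
```

The same loop with `lam = 0` recovers unregularized logistic regression, so the penalty strength is the single knob that trades training fit against coefficient size.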
Using deep learning to predict whether students can correctly answer diagnostic questions
The objective is to build and tune various classification models and find the best one for identifying failures, so that a generator can be repaired before it breaks and the overall maintenance cost of the generators can be brought down.
Ridge Regression Work