This bootcamp is designed to teach the basics of machine learning. It was created by 42-AI, an association from 42 Paris.
ML Bootcamp repository (subjects): https://github.com/42-AI/bootcamp_machine-learning.
42-AI repository and presentation: https://github.com/42-AI.
This bootcamp is a hands-on companion to the "Supervised Machine Learning: Regression and Classification" MOOC from Stanford University. Link to the course: https://www.coursera.org/learn/machine-learning. Each module (ml_0X) takes one day and introduces new notions from the ML world.
- Module Objective: The goal of this module is to discover the concept of linear regression. In this module you will study the key concepts of linear algebra needed to perform a univariate linear regression.
- Notions: Sum, mean, variance, standard deviation, vector and matrix operations. Hypothesis, model, regression, cost function.
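As a taste of the statistics covered in this first module, here is a minimal sketch in plain Python (the bootcamp itself has you reimplement these with NumPy; function names here are illustrative, not the bootcamp's exercise signatures):

```python
import math

def mean(x):
    # Arithmetic mean: sum of the values divided by their count.
    return sum(x) / len(x)

def variance(x):
    # Population variance: mean of the squared deviations from the mean.
    m = mean(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def std(x):
    # Standard deviation: square root of the variance.
    return math.sqrt(variance(x))

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(mean(data))      # 5.0
print(variance(data))  # 4.0
print(std(data))       # 2.0
```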
- Module Objective: The goal of this module is to get started with the basics of linear regression. You will study what, in the field of machine learning, we call a hypothesis, a cost function, gradient descent, and some notions of feature scaling.
- Notions: Gradient descent, linear regression, normalization.
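The core idea of this module, gradient descent on a univariate hypothesis h(x) = θ₀ + θ₁x with a mean-squared-error cost, can be sketched as follows (a plain-Python illustration; the learning rate and iteration count are arbitrary choices, not values prescribed by the bootcamp):

```python
def fit_univariate(x, y, alpha=0.01, n_iter=10000):
    # Gradient descent on h(x) = theta0 + theta1 * x with an MSE cost.
    theta0, theta1 = 0.0, 0.0
    m = len(x)
    for _ in range(n_iter):
        # Prediction errors h(x_i) - y_i for every training example.
        errors = [theta0 + theta1 * xi - yi for xi, yi in zip(x, y)]
        # Partial derivatives of the cost with respect to each parameter.
        grad0 = sum(errors) / m
        grad1 = sum(e * xi for e, xi in zip(errors, x)) / m
        # Step both parameters against the gradient.
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Points on the line y = 2x + 1; descent should recover those parameters.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 3.0, 5.0, 7.0, 9.0]
t0, t1 = fit_univariate(x, y)
print(round(t0, 2), round(t1, 2))  # approximately 1.0 and 2.0
```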
- Module Objective: The goal of this module is to build on what you did in the previous modules. You will extend linear regression to handle more than one feature, then see how to build polynomial models and how to detect overfitting.
- Notions: Multivariate linear hypothesis, multivariate linear gradient descent, polynomial models. Training and test sets, overfitting.
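Two of this module's building blocks, polynomial feature expansion and splitting data into training and test sets, might look like this (a simplified plain-Python sketch; the bootcamp's versions work on NumPy arrays and shuffle before splitting):

```python
def add_polynomial_features(x, power):
    # Expand each value x_i into [x_i, x_i^2, ..., x_i^power].
    return [[xi ** p for p in range(1, power + 1)] for xi in x]

def train_test_split(x, y, ratio=0.8):
    # Keep the first `ratio` share for training, the rest for testing.
    # (Shuffling before the split is omitted here for determinism.)
    cut = int(len(x) * ratio)
    return x[:cut], x[cut:], y[:cut], y[cut:]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
print(add_polynomial_features(x, 3)[1])  # [2.0, 4.0, 8.0]
x_train, x_test, y_train, y_test = train_test_split(x, [1, 2, 3, 4, 5])
print(len(x_train), len(x_test))  # 4 1
```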
- Module Objective: Discover your first classification algorithm: logistic regression. You will learn its loss function, gradient descent and some metrics to evaluate its performance.
- Notions: Logistic hypothesis, logistic gradient descent, logistic regression, multiclass classification. Accuracy, precision, recall, F1-score, confusion matrix.
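The logistic hypothesis and the classification metrics from this module can be sketched in plain Python as follows (an illustrative implementation, not the bootcamp's exercise interface):

```python
import math

def sigmoid(z):
    # Logistic function: maps any real number into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def precision_recall_f1(y_true, y_pred):
    # Metrics for binary classification, built from the confusion-matrix
    # counts: true positives, false positives, false negatives.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)  # of the predicted positives, how many are right
    recall = tp / (tp + fn)     # of the actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(sigmoid(0.0))  # 0.5
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0]
print(precision_recall_f1(y_true, y_pred))
```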
- Module Objective: Today you will fight overfitting! You will discover the concept of regularization and how to implement it in the algorithms you have seen so far.
- Notions: Regularization, overfitting. Regularized loss function, regularized gradient descent. Regularized linear regression. Regularized logistic regression.
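The regularized loss from this module adds an L2 penalty on the parameters to the usual MSE term; a minimal plain-Python sketch (following the common convention, also used in the bootcamp, of not penalizing the bias term θ₀):

```python
def regularized_mse_loss(y, y_hat, theta, lambda_):
    # J(theta) = (1 / 2m) * [sum((y_hat - y)^2) + lambda * sum(theta_j^2)]
    # where the sum over theta_j skips theta[0] (the bias is not penalized).
    m = len(y)
    mse_term = sum((yh - yi) ** 2 for yh, yi in zip(y_hat, y)) / (2 * m)
    l2_term = lambda_ * sum(t ** 2 for t in theta[1:]) / (2 * m)
    return mse_term + l2_term

# With lambda_ = 0 this reduces to the plain MSE loss from earlier modules.
print(regularized_mse_loss([1.0, 2.0], [1.0, 3.0], [0.5, 2.0], 0.0))  # 0.25
print(regularized_mse_loss([1.0, 2.0], [1.0, 3.0], [0.5, 2.0], 1.0))  # 1.25
```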
Link to my notes from these modules, courses and more: https://www.notion.so/Machine-Learning-294de562d0e94029b14716b920c87d26.