- a particular suggestion for NB: initially, a hyperparameter optimizer can be used to indicate the best distribution parameters. Following this, a number of models can be built by varying the following parameters: distribution type, smoothing type for the kernel density estimate, smoothing width for the kernel density estimate, and the priors used for the PDF estimates (see the sketch after this list);
- a particular suggestion for LR: exploring Ridge and Lasso regularisation methods and comparing fitting versus not fitting an intercept (bias); optionally, the learning rate could also be varied (see the sketch after this list)
- Compared image classification performance between a Support Vector Machine (SVM) and a Multi-Layer Perceptron (MLP), along with a brief comparison with a Convolutional Neural Network (CNN); a minimal version of this comparison is sketched below
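
A minimal sketch of how the NB variations above might be organised, assuming a scikit-learn-style setup. The `KDEClassifier` class, the parameter grid, and the choice of a single multivariate KDE per class (rather than one KDE per feature, as a strictly naive model would use) are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.neighbors import KernelDensity
from sklearn.model_selection import GridSearchCV


class KDEClassifier(BaseEstimator, ClassifierMixin):
    """Generative Bayesian classifier with kernel-density likelihoods.

    A strictly 'naive' variant would fit one KDE per feature and sum the
    per-feature log densities; here one multivariate KDE per class is used
    to keep the sketch short.
    """

    def __init__(self, kernel="gaussian", bandwidth=1.0, priors=None):
        self.kernel = kernel        # smoothing type for the KDE
        self.bandwidth = bandwidth  # smoothing width for the KDE
        self.priors = priors        # class priors used for the PDF estimates

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_ = np.unique(y)
        # one class-conditional density estimate per class
        self.models_ = [
            KernelDensity(kernel=self.kernel, bandwidth=self.bandwidth).fit(X[y == c])
            for c in self.classes_
        ]
        if self.priors is None:
            self.log_priors_ = np.log([np.mean(y == c) for c in self.classes_])
        else:
            self.log_priors_ = np.log(self.priors)
        return self

    def predict(self, X):
        # log P(x | c) + log P(c) for each class; pick the maximum a posteriori class
        log_likelihood = np.array([m.score_samples(X) for m in self.models_]).T
        return self.classes_[np.argmax(log_likelihood + self.log_priors_, axis=1)]


# Illustrative grid over smoothing type and smoothing width
param_grid = {
    "kernel": ["gaussian", "tophat", "epanechnikov"],
    "bandwidth": np.logspace(-1, 1, 10),
}
search = GridSearchCV(KDEClassifier(), param_grid, cv=5)
# search.fit(X_train, y_train); search.best_params_ indicates the preferred settings
```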
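
For the LR suggestion, one way to express the grid is via scikit-learn's `SGDClassifier`, treating LR as logistic regression; the estimator choice, parameter values, and variable names below are assumptions rather than the project's code. `SGDClassifier` exposes the penalty, intercept, and learning-rate knobs in a single estimator, and a linear-regression variant could use `SGDRegressor` with the same grid.

```python
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    "penalty": ["l2", "l1"],             # Ridge vs Lasso regularisation
    "alpha": [1e-4, 1e-3, 1e-2],         # regularisation strength
    "fit_intercept": [True, False],      # fitting vs not fitting a bias term
    "eta0": [1e-3, 1e-2, 1e-1],          # initial learning rate (optional knob)
    "learning_rate": ["constant", "optimal"],  # eta0 is ignored when "optimal"
}

search = GridSearchCV(
    SGDClassifier(loss="log_loss",       # "log" in older scikit-learn versions
                  max_iter=2000,
                  random_state=0),
    param_grid,
    cv=5,
)
# search.fit(X_train, y_train); search.best_params_ then gives the preferred setting
```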
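
A minimal sketch of the SVM vs. MLP comparison, using scikit-learn's digits dataset as a stand-in for the actual image data; the dataset, preprocessing, and hyperparameter choices are assumptions, and the CNN baseline is only noted in a comment since it would require a deep-learning framework.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # 8x8 greyscale digits, already flattened
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale")),
    "MLP": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(128,), max_iter=500, random_state=0),
    ),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name} test accuracy: {model.score(X_test, y_test):.3f}")

# A CNN baseline would typically be built with a deep-learning framework
# (e.g. Keras or PyTorch) and fed the un-flattened 2-D images.
```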