This tutorial covers the basics of Deep Learning with Convolutional Neural Nets. The tutorial is broken into three notebooks. The topics covered in each notebook are:
Intro.ipynb:
- Linear Regression as a single-layer, single-neuron model to motivate the introduction of Neural Networks as Universal Approximators, modeled as collections of neurons connected in an acyclic graph (a single-neuron sketch follows this list)
- Convolutions and examples of simple image filters to motivate the construction of Convolutional Neural Networks (a convolution sketch also follows this list)
- Loss/Error functions, Gradient Descent, Backpropagation, etc.
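To preview those first ideas, here is a minimal NumPy sketch (my own illustration, not code from the notebook) of a single-neuron linear model fitted by gradient descent; the synthetic data, learning rate, and step count are all illustrative choices:

```python
import numpy as np

# Synthetic data: y = 3x + 2 plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + 0.1 * rng.normal(size=100)

# A single "neuron": prediction = w * x + b.
w, b = 0.0, 0.0
lr = 0.1  # learning rate

for step in range(200):
    y_pred = w * x + b
    err = y_pred - y
    loss = np.mean(err ** 2)          # mean squared error
    grad_w = 2.0 * np.mean(err * x)   # dLoss/dw
    grad_b = 2.0 * np.mean(err)       # dLoss/db
    w -= lr * grad_w                  # gradient descent update
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, loss={loss:.4f}")  # w and b approach 3 and 2
```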
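And a similar sketch of the convolution idea: a small filter slid over an image to pick out structure. The Sobel-style kernel and the use of scipy.signal.convolve2d here are illustrative assumptions, not necessarily what the notebook uses:

```python
import numpy as np
from scipy.signal import convolve2d

# A toy 8x8 "image": dark left half, bright right half.
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# Sobel-style kernel that responds to vertical edges.
kernel = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)

# Slide the kernel over the image; the response has large magnitude
# at the edge (columns 3-4), plus a zero-padding artifact at the border.
edges = convolve2d(image, kernel, mode="same")
print(edges[4])
```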
Mnist.ipynb:
- Visualizing Data
- Constructing simple Convolutional Neural Networks (a Keras sketch covering construction and training follows this list)
- Training and Inference
- Visualizing/Interpreting trained Neural Nets
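To preview what that looks like in practice, here is a minimal sketch of a small CNN trained on MNIST, assuming TensorFlow/Keras is available; the layer sizes and single epoch are illustrative, not the notebook's exact configuration:

```python
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add a channel axis: (N, 28, 28, 1)
x_test = x_test[..., None] / 255.0

# A small CNN: convolution -> pooling -> flatten -> dense classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training, then inference on held-out test images.
model.fit(x_train, y_train, epochs=1, validation_split=0.1)
test_loss, test_acc = model.evaluate(x_test, y_test)
print(f"test accuracy: {test_acc:.3f}")
```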
CIFAR-10.ipynb:
- Data Generators (a sketch combining generators and augmentation follows this list)
- Overfitting
- Data Augmentation
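As a preview, here is a minimal sketch of a data generator with augmentation on CIFAR-10, assuming TensorFlow/Keras and its classic ImageDataGenerator API; the specific augmentation parameters are illustrative choices, not the notebook's:

```python
import tensorflow as tf

# Load CIFAR-10 and scale pixel values to [0, 1].
(x_train, y_train), _ = tf.keras.datasets.cifar10.load_data()
x_train = x_train / 255.0

# A generator that yields randomly augmented batches on the fly,
# instead of holding a fixed, fully-preprocessed dataset in memory.
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    rotation_range=15,       # random rotations up to 15 degrees
    width_shift_range=0.1,   # random horizontal shifts
    height_shift_range=0.1,  # random vertical shifts
    horizontal_flip=True,    # random left-right flips
)

batches = datagen.flow(x_train, y_train, batch_size=32)
images, labels = next(batches)
print(images.shape, labels.shape)  # (32, 32, 32, 3) (32, 1)
```

Because the model rarely sees the exact same image twice, on-the-fly augmentation is one common way to reduce overfitting.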
Start by cloning the git repository that contains this folder. If you are using ALCF, see the instructions from our previous tutorial.
From a terminal, run the following command:
git clone https://github.com/argonne-lcf/ai-science-training-series.git
You can run the notebooks of this session on ALCF's JupyterHub.
Change the notebook's kernel to conda/2021-09-22 (you may need to change the kernel each time you open a notebook for the first time):
- select Kernel in the menu bar
- select Change kernel...
- select conda/2021-09-22 from the drop-down menu
The code examples presented here are mostly taken verbatim from, or inspired by, the following sources. I made this curation to give you quick exposure to very basic but essential ideas and practices in deep learning so you can get started fairly quickly, but I recommend going to some or all of the actual sources for an in-depth survey: