RNN (SimpleRNN, LSTM, GRU) TensorFlow 2.0 & Keras Notebooks (Workshop materials)
Some parts are freely available on our Aparat channel, or you can purchase the full package of 32 videos (in Persian) from class.vision.
02_1_simple-RNN-diffrent-sequence-length.ipynb
02_2_simple-RNN-diffrent-sequence-length.ipynb
- When do we use return_sequences=True?
- Stacked RNN (Deep RNN)
- Using an LSTM layer
03_1_Cryptocurrency-predicting.ipynb
03_2_Cryptocurrency-predicting.ipynb
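A minimal sketch of the stacking idea covered above (not the notebook's actual code): any recurrent layer that feeds another recurrent layer must emit its full output sequence via return_sequences=True; only the last one may collapse to its final state.

```python
import numpy as np
from tensorflow.keras import layers, models

# Stacked (deep) RNN: the intermediate layer returns the whole sequence
# so the next recurrent layer has one input per timestep.
model = models.Sequential([
    layers.Input(shape=(10, 8)),                  # 10 timesteps, 8 features
    layers.SimpleRNN(16, return_sequences=True),  # output: (batch, 10, 16)
    layers.LSTM(16),                              # last RNN layer: (batch, 16)
    layers.Dense(1),                              # e.g. a regression head
])

x = np.random.rand(4, 10, 8).astype("float32")
print(model(x).shape)  # (4, 1)
```

Dropping return_sequences=True on the SimpleRNN would make the LSTM receive a 2-D tensor and fail at build time.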
- What is the TimeDistributed layer in Keras?
- Introduction to video classification
- CNN + LSTM
- How to use a pre-trained CNN as a feature extractor for an RNN
- Using a GRU layer
05-1-video-action-recognition-train-extract-features-with-cnn
05-2_video-action-recognition-train-rnn.ipynb
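The CNN + RNN pattern above can be sketched as follows. This is an offline stand-in, not the notebook's pipeline: a tiny Conv2D stack plays the role of the pre-trained CNN (the notebook extracts features with a real pre-trained network), and TimeDistributed applies it to every frame before a GRU models the temporal dimension.

```python
import numpy as np
from tensorflow.keras import layers, models

# Per-frame feature extractor; a pre-trained CNN would normally go here.
frame_features = models.Sequential([
    layers.Conv2D(8, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),         # one feature vector per frame
])

model = models.Sequential([
    layers.Input(shape=(5, 32, 32, 3)),      # 5 frames of 32x32 RGB
    layers.TimeDistributed(frame_features),  # CNN applied to each frame
    layers.GRU(16),                          # temporal model over the frames
    layers.Dense(3, activation="softmax"),   # e.g. 3 action classes
])

clip = np.random.rand(2, 5, 32, 32, 3).astype("float32")
print(model(clip).shape)  # (2, 3)
```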
- Using GloVe
- Cosine Similarity
- Analogy
06_analogy-using-embeddings.ipynb
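Cosine similarity and the classic analogy test can be sketched with toy vectors (the 3-d embeddings below are made up; the notebook loads real GloVe vectors):

```python
import numpy as np

# Made-up toy embeddings standing in for GloVe vectors.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.9, 0.0]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "apple": np.array([0.1, 0.2, 0.1]),
}

def cosine(a, b):
    # Cosine similarity: dot product of the normalized vectors.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Analogy "king - man + woman ~= ?" answered by the nearest cosine neighbour,
# excluding the three query words themselves.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, emb[w]))
print(best)  # queen
```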
- What is a Bag of Embeddings?
- Using the Embedding layer in Keras
- Initializing the Embedding layer with pre-trained embeddings
- Using RNNs for NLP tasks
07_text-classification-Emojify.ipynb
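Seeding an Embedding layer with pre-trained vectors can be sketched like this. The 5x4 matrix below is random; in the notebook each row would hold the GloVe vector of one vocabulary word (row 0 typically reserved for padding):

```python
import numpy as np
from tensorflow.keras import layers, models

vocab_size, embed_dim = 5, 4
pretrained = np.random.rand(vocab_size, embed_dim).astype("float32")  # stand-in for GloVe

embedding = layers.Embedding(vocab_size, embed_dim, trainable=False)  # frozen
model = models.Sequential([
    layers.Input(shape=(6,), dtype="int32"),  # sentences padded to 6 token ids
    embedding,
    layers.LSTM(8),
    layers.Dense(2, activation="softmax"),    # e.g. binary text classification
])
embedding.set_weights([pretrained])           # copy in the pre-trained matrix

tokens = np.array([[1, 2, 3, 4, 0, 0]])
print(model(tokens).shape)  # (1, 2)
```

trainable=False keeps the pre-trained vectors fixed; leaving it True fine-tunes them during training.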
- What is a TF Dataset (tf.data)?
- Stateful vs. stateless RNNs
- When do we need batch_input_shape?
08_shahnameh-text-generation-language-model.ipynb
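A stateful RNN can be sketched as follows. With stateful=True, Keras carries the hidden state across successive calls instead of resetting it per batch, so the batch size must be fixed up front (older code passes batch_input_shape=(2, 4, 3) directly to the first layer; a fixed-size Input plays the same role):

```python
import numpy as np
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(4, 3), batch_size=2),  # batch size must be fixed
    layers.LSTM(8, stateful=True),             # state survives between calls
    layers.Dense(1),
])

chunk = np.random.rand(2, 4, 3).astype("float32")
out1 = model(chunk)              # state after this call is kept...
out2 = model(chunk)              # ...so the same input gives a different output
model.layers[0].reset_states()   # clear state between independent sequences
```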
- Using RepeatVector to connect the encoder to the decoder
- Using the encoder hidden state as an input to the decoder
09_add-numbers-with-seq2seq.ipynb
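The simplest seq2seq wiring described above can be sketched like this (a generic example, not the notebook's exact model): the encoder compresses the input sequence into one vector, RepeatVector copies that vector once per output step, and a decoder RNN unrolls it back into a sequence.

```python
import numpy as np
from tensorflow.keras import layers, models

n_in, n_out, n_features = 7, 4, 12  # e.g. one-hot digits for "add numbers"

model = models.Sequential([
    layers.Input(shape=(n_in, n_features)),
    layers.LSTM(32),                          # encoder -> single context vector
    layers.RepeatVector(n_out),               # repeat it once per output step
    layers.LSTM(32, return_sequences=True),   # decoder unrolls the sequence
    layers.TimeDistributed(layers.Dense(n_features, activation="softmax")),
])

x = np.random.rand(2, n_in, n_features).astype("float32")
print(model(x).shape)  # (2, 4, 12)
```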
10_Neural-machine-translation-with-attention-for-date-convert.ipynb
- Teacher forcing
- Loss with a mask for zero padding
- Using Model-Subclassing
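The masked loss idea can be sketched as follows (assumed helper name, not the notebook's exact code): timesteps whose target id is 0, i.e. padding, are zeroed out so they contribute nothing to the loss or the gradients.

```python
import tensorflow as tf

def masked_loss(y_true, y_pred):
    # Per-token cross-entropy, one value per (batch, time) position.
    scce = tf.keras.losses.SparseCategoricalCrossentropy(
        from_logits=True, reduction="none")
    loss = scce(y_true, y_pred)
    # 1.0 for real tokens, 0.0 for zero padding.
    mask = tf.cast(tf.not_equal(y_true, 0), loss.dtype)
    # Average over real tokens only.
    return tf.reduce_sum(loss * mask) / tf.reduce_sum(mask)

y_true = tf.constant([[2, 1, 0, 0]])   # two real tokens, two padded steps
y_pred = tf.random.uniform((1, 4, 5))  # logits over a 5-symbol vocabulary
print(float(masked_loss(y_true, y_pred)))
```

Changing the predictions at the padded positions leaves this loss unchanged, which is exactly what the mask is for.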