Sequence Models: repository for all projects and programming assignments of Course 5 of the Deep Learning Specialization offered on Coursera and taught by Andrew Ng, covering topics such as Recurrent Neural Networks (RNN), Gated Recurrent Units (GRU), Long Short-Term Memory (LSTM), Natural Language Processing, Word Representations and Embeddings, and Attention Models.
I loved implementing cool applications including Character-Level Language Modeling, Text and Music Generation, Sentiment Classification, Debiasing Word Embeddings, Speech Recognition, and Trigger Word Detection. I had a wonderful time using the Google Cloud Platform (GCP) and the deep learning frameworks Keras and TensorFlow.
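The vanilla RNN cell behind assignments like character-level language modeling can be sketched in a few lines of NumPy. This is a minimal illustration with made-up sizes and parameter names (`Wax`, `Waa`, `Wya` loosely follow the course's notation), not the course's starter code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny illustrative sizes: 27 characters (a-z plus newline), 16 hidden units.
vocab_size, hidden_size = 27, 16

# Small random weights; names loosely follow the course's convention.
Wax = rng.standard_normal((hidden_size, vocab_size)) * 0.01   # input -> hidden
Waa = rng.standard_normal((hidden_size, hidden_size)) * 0.01  # hidden -> hidden
Wya = rng.standard_normal((vocab_size, hidden_size)) * 0.01   # hidden -> output
ba = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

def rnn_step(x, a_prev):
    """One forward step of a vanilla RNN cell."""
    a = np.tanh(Waa @ a_prev + Wax @ x + ba)             # new hidden state
    z = Wya @ a + by
    y = np.exp(z - z.max()) / np.exp(z - z.max()).sum()  # softmax over characters
    return a, y

# One-hot input for character index 3, zero initial hidden state.
x = np.zeros((vocab_size, 1)); x[3] = 1.0
a, y = rnn_step(x, np.zeros((hidden_size, 1)))
print(a.shape, y.shape)
```

Sampling a character from `y` at each step, and feeding it back in as the next input, is the core loop of the character-level generation assignments.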
Keywords: RNN, GRU, LSTM, NLP, Sequence Models

Papers
- 2014 GRU On the Properties of Neural Machine Translation: Encoder-Decoder Approaches - Cho, Merrienboer, Bahdanau, Bengio
- 2014 GRU Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling - Chung, Gulcehre, Cho, Bengio
- 1997 LSTM Long Short-Term Memory - Sepp Hochreiter, Jürgen Schmidhuber
- 2008 t-SNE Visualizing Data using t-SNE - Laurens van der Maaten, Geoffrey Hinton
- 2013 Linguistic Regularities in Continuous Space Word Representations - Mikolov, Yih, Zweig
- 2003 A Neural Probabilistic Language Model - Bengio, Ducharme, Vincent, Jauvin
- 2013 Word2Vec CBOW Skip-gram Efficient Estimation of Word Representations in Vector Space - Mikolov, Chen, Corrado, Dean
- 2013 Word2Vec Negative Sampling Distributed Representations of Words and Phrases and their Compositionality - Mikolov, Sutskever, Chen, Corrado, Dean
- 2014 GloVe - Global Vectors for Word Representation - Pennington, Socher, Manning
- 2016 Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings - Bolukbasi, Chang, Zou, Saligrama, Kalai
- 2014 Sequence to Sequence Learning with Neural Networks - Sutskever, Vinyals, Le
- 2014 Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation - Cho, Merrienboer, Gulcehre, Bahdanau, Bougares, Schwenk, Bengio
- 2015 Deep Captioning with Multimodal Recurrent Neural Networks (m-RNN) - Mao, Xu, Yang, Wang, Huang, Yuille
- 2014 Show and Tell - A Neural Image Caption Generator - Vinyals, Toshev, Bengio, Erhan
- 2015 Deep Visual-Semantic Alignments for Generating Image Descriptions - Karpathy, Li
- 2002 BLEU (bilingual evaluation understudy): a Method for Automatic Evaluation of Machine Translation - Papineni, Roukos, Ward, Zhu
- 2015 Attention Model Neural Machine Translation by Jointly Learning to Align and Translate - Bahdanau, Cho, Bengio
- 2006 CTC Connectionist Temporal Classification - Labelling Unsegmented Sequence Data with Recurrent Neural Networks - Graves, Fernandez, Gomez, Schmidhuber
- 2014 DeepFace - Closing the Gap to Human-Level Performance in Face Verification - Taigman, Yang, Ranzato, Wolf
- 2015 Show, Attend and Tell - Neural Image Caption Generation with Visual Attention - Xu, Ba, Kiros, Cho, Courville, Salakhutdinov, Zemel, Bengio
Recurrent Neural Networks and Long Short-Term Memory Networks Resources
- The Unreasonable Effectiveness of Recurrent Neural Networks
  http://karpathy.github.io/2015/05/21/rnn-effectiveness/
- Understanding LSTM Networks
  https://colah.github.io/posts/2015-08-Understanding-LSTMs/
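As a companion to the posts above, one forward step of the LSTM cell they describe can be sketched in NumPy. This is a toy sketch with hypothetical sizes; the parameter names `Wf`, `Wi`, `Wc`, `Wo` are this sketch's own, loosely following the course's notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, a_prev, c_prev, p):
    """One LSTM step with forget (f), update (i), and output (o) gates."""
    concat = np.vstack([a_prev, x])             # stacked [a_prev; x]
    f = sigmoid(p["Wf"] @ concat + p["bf"])     # forget gate
    i = sigmoid(p["Wi"] @ concat + p["bi"])     # update (input) gate
    cct = np.tanh(p["Wc"] @ concat + p["bc"])   # candidate cell state
    c = f * c_prev + i * cct                    # new cell state
    o = sigmoid(p["Wo"] @ concat + p["bo"])     # output gate
    a = o * np.tanh(c)                          # new hidden state
    return a, c

# Toy sizes: 8 hidden units, 4 input features; small random weights.
rng = np.random.default_rng(1)
n_a, n_x = 8, 4
p = {k: rng.standard_normal((n_a, n_a + n_x)) * 0.1 for k in ("Wf", "Wi", "Wc", "Wo")}
p.update({b: np.zeros((n_a, 1)) for b in ("bf", "bi", "bc", "bo")})
a, c = lstm_step(rng.standard_normal((n_x, 1)), np.zeros((n_a, 1)), np.zeros((n_a, 1)), p)
print(a.shape, c.shape)
```

The additive cell-state update `c = f * c_prev + i * cct` is what lets gradients flow over long sequences, which is the central point of the Hochreiter and Schmidhuber paper listed above.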
Word2Vec Tutorials and Resources
- Word2Vec Tutorials
  http://mccormickml.com/tutorials/
- Word2Vec Tutorial - The Skip-Gram Model
  http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
- Word2Vec Tutorial Part 2 - Negative Sampling
  http://mccormickml.com/2017/01/11/word2vec-tutorial-part-2-negative-sampling/
- Applying word2vec to Recommenders and Advertising
  http://mccormickml.com/2018/06/15/applying-word2vec-to-recommenders-and-advertising/
- Word2Vec Resources
  http://mccormickml.com/2016/04/27/word2vec-resources/
- Google Code Word2Vec
  https://code.google.com/archive/p/word2vec/
- TensorFlow Word2Vec
  https://www.tensorflow.org/tutorials/representation/word2vec
- On word embeddings
  http://ruder.io/word-embeddings-1/index.html
- Guide to Word2Vec and Neural Word Embeddings
  https://skymind.ai/wiki/word2vec
- Book Recommendation System
  https://github.com/WillKoehrsen/wikipedia-data-science/blob/master/notebooks/Book%20Recommendation%20System.ipynb
- Bag of Words Meets Bags of Popcorn
  https://www.kaggle.com/c/word2vec-nlp-tutorial/
- Stanford Deep Learning Tutorial
  http://ufldl.stanford.edu/tutorial/
- Stanford Lecture 2 | Word Vector Representations: word2vec
  https://www.youtube.com/watch?v=ERibwqs9p38
- Deep Learning | Udacity
  https://www.youtube.com/playlist?list=PLAwxTw4SYaPn_OWPFT9ulXLuQrImzHfOV
- Deep Learning, NLP, and Representations
  http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/
- GTC 2015 Keynote with Jeff Dean, Google
  https://www.ustream.tv/recorded/60071572
- GTC 2015 Keynote with Dr. Andrew Ng, Baidu
  https://www.ustream.tv/recorded/60113824
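The skip-gram-with-negative-sampling update that these tutorials walk through can be sketched in NumPy. This is a toy illustration with hypothetical sizes and names, not the word2vec implementation; collisions between sampled negatives and the true pair are ignored for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, dim = 100, 8
W_in = rng.standard_normal((vocab, dim)) * 0.01   # center-word vectors
W_out = rng.standard_normal((vocab, dim)) * 0.01  # context-word vectors

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgns_step(center, context, neg_ids, lr=0.05):
    """One skip-gram step: raise the true pair's score, lower the negatives'."""
    v = W_in[center].copy()
    ids = np.concatenate(([context], neg_ids))
    labels = np.zeros(len(ids)); labels[0] = 1.0        # 1 = true context word
    grad = (sigmoid(W_out[ids] @ v) - labels)[:, None]  # logistic-loss gradient
    W_in[center] = v - lr * (grad * W_out[ids]).sum(axis=0)
    W_out[ids] -= lr * grad * v

s0 = float(sigmoid(W_out[7] @ W_in[5]))  # true-pair score before training
for _ in range(300):                     # repeat one (center=5, context=7) pair
    sgns_step(5, 7, rng.integers(0, vocab, size=5))
s1 = float(sigmoid(W_out[7] @ W_in[5]))  # true-pair score after training
print(round(s0, 3), round(s1, 3))
```

Each step is a handful of logistic regressions (one positive, k negative) rather than a full softmax over the vocabulary, which is why negative sampling makes word2vec cheap to train; on this toy setup the true pair's score should rise well above its initial value.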
Links
- Course: https://www.coursera.org/learn/nlp-sequence-models
- Specialization: https://www.coursera.org/specializations/deep-learning
- deeplearning.ai: https://www.deeplearning.ai
- Course certificate: https://www.coursera.org/account/accomplishments/verify/PA3E5G7YQXNM
- Specialization certificate: https://www.coursera.org/account/accomplishments/specialization/TYHX7MGWHFGT
- https://www.youtube.com/watch?v=ggQ1y1UHOvc