Hey hey! As you know, Course 5 is just out. As always, I checked out the class to give you a quick impression of what it's about. So far, this new 3-week class looks very exciting. Here are my takeaways. Remember, I haven't started the class yet, but this should still give you a good sense of its scope.
* Course 5 is mostly focused on sequence models. That includes the more mysterious models such as the RNN, GRU, and LSTM. You will go through standard ideas such as vanishing gradients, which were actually first identified in RNNs, and then move on to the GRU and LSTM (I've put a tiny GRU sketch after this list if you want a preview).
* The order of coverage is nice. Teaching the GRU first and the LSTM second doesn't follow the historical order (Hochreiter & Schmidhuber introduced the LSTM back in 1997; Cho proposed the GRU in 2014), but it makes more sense pedagogically. I don't want to spoil it, but this is also how Socher approaches the subject in cs224n. That makes me believe this course is likely to teach you RNNs well.
* Week 1 is all about RNNs; Weeks 2 and 3 cover word vectors and end-to-end structures. Is one week enough for each topic? Not at all. But Andrew seems to give the essentials of each: word2vec/GloVe for word vectors, and the standard encoder-decoder structure for the end-to-end scheme (see the sketch after this list). Most examples are based on machine translation, which I think is appropriate. Other applications such as image captioning or speech recognition are possible, but they usually involve details that are tough to cover in a first class.
* Will this class be everything you need for NLP? Very unlikely. You will still need to take cs224n, or even the Oxford class, to get good. But just as Course 4 is a good intro to computer vision, this will be a good intro to NLP and, more generally, to any topic that requires sequence modeling, such as speech recognition, stock analysis, or DNA sequence analysis.
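To make the GRU point a bit more concrete, here is a minimal NumPy sketch of a single GRU step. This is my own illustration (the weight names and shapes are made up), not code from the course notebooks, but it shows the key trick: when the update gate is close to 0, the previous hidden state is copied almost unchanged, which is what helps fight vanishing gradients.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU step. Hypothetical shapes: x_t is (n_x,), h_prev is (n_h,).

    params holds matrices Wz, Wr, Wh of shape (n_h, n_h + n_x)
    and biases bz, br, bh of shape (n_h,).
    """
    concat = np.concatenate([h_prev, x_t])
    z = sigmoid(params["Wz"] @ concat + params["bz"])   # update gate
    r = sigmoid(params["Wr"] @ concat + params["br"])   # reset gate
    # Candidate state, computed from the (reset-scaled) previous state and the input
    h_tilde = np.tanh(params["Wh"] @ np.concatenate([r * h_prev, x_t]) + params["bh"])
    # When z is near 0, h_prev passes through almost unchanged,
    # which is the shortcut that keeps gradients alive over long sequences
    h_t = (1.0 - z) * h_prev + z * h_tilde
    return h_t
```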
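And since I mentioned the standard encoder-decoder structure, here is one common way to wire it up, using Keras purely as an example. This is a generic teacher-forcing seq2seq skeleton under my own assumed vocabulary and hidden sizes, not the course's assignment code.

```python
from tensorflow.keras.layers import Input, LSTM, Dense, Embedding
from tensorflow.keras.models import Model

# Hypothetical sizes, just for illustration
src_vocab, tgt_vocab, n_h = 8000, 8000, 256

# Encoder: read the source sequence and keep only the final states
enc_in = Input(shape=(None,))
enc_emb = Embedding(src_vocab, n_h)(enc_in)
_, state_h, state_c = LSTM(n_h, return_state=True)(enc_emb)

# Decoder: generate the target sequence, initialized with the encoder states
dec_in = Input(shape=(None,))
dec_emb = Embedding(tgt_vocab, n_h)(dec_in)
dec_out, _, _ = LSTM(n_h, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
probs = Dense(tgt_vocab, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```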
The course link can be found at https://www.coursera.org/learn/nlp-sequence-models
Hope this “Quick Impression” helps you!