Congratulations on finishing the lectures in our course!

NLP is a huge and rapidly evolving area, so to keep an up-to-date understanding of its advances one should always keep track of what is going on. In this reading material we provide some links that give a nice overview of NLP trends as of the end of 2017.

First, it is always a good idea to check out highlights from the main conferences. The trends of ACL 2017 are nicely summarized in part 1 and part 2, and some highlights from EMNLP 2017 are available here. Second, it is worth following some blogs, e.g. Sebastian Ruder has nice posts about DL in NLP, optimization trends, word embeddings, and many others.

One still-active topic is Thought Vectors and how one can interpret directions in the hidden space; for example, you might be interested in this post. However, it is becoming clearer that compressing all of the input into a single vector is often not enough, and one can do nice things with attention and linguistic information. You can find some more tips about attention here.
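To make the attention idea a bit more concrete, here is a minimal NumPy sketch of (scaled) dot-product attention over a toy set of encoder states. The function and variable names are ours, chosen purely for illustration; this is not code from the course or from the linked posts.

```python
import numpy as np

def scaled_dot_product_attention(query, keys, values):
    """Attend over `values` using the similarity between `query` and `keys`.

    query:  (d,)    decoder state for the current step
    keys:   (T, d)  encoder states, one per source position
    values: (T, d)  what we actually average (often the same as keys)
    """
    d = query.shape[-1]
    # Similarity score between the query and every source position.
    scores = keys @ query / np.sqrt(d)          # shape (T,)
    # Softmax turns the scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # The context vector is a weighted average of the values, so the decoder
    # is no longer limited to a single fixed-size summary of the input.
    return weights @ values, weights

# Tiny usage example with random "encoder states" (hypothetical data).
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 8))   # 5 source positions, dimension 8
decoder_state = rng.normal(size=(8,))
context, attn = scaled_dot_product_attention(decoder_state,
                                              encoder_states, encoder_states)
print(attn.round(3), context.shape)
```

The key point is that the context vector is recomputed at every decoding step, so different parts of the input can dominate at different times instead of everything being squeezed into one vector.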

Finally, this is another nice overview of 2017 trends in NLP research; the advances in unsupervised machine translation seem especially exciting!

To conclude, we would like to thank you for taking our course and wish you the best of luck in your future NLP projects!