Deep learning now provides state-of-the-art techniques for many NLP problems. In this course, students gain a thorough introduction to neural networks, RNNs, and Transformers. Course requirements include completing quizzes, practical assignments, and a final project.
NLP#1: Word vector representations
Welcome to the Deep Learning in Natural Language Processing course – and to the magnificent world of artificial intelligence!
Mikhail Burtsev gives a talk about the problems of transformer-based neural network architectures (first of all, BERT and its variants) in relation to the task of language modeling, and offers research directions for overcoming these problems.