TEQnation 2021 – Sander Kerkdijk – Transforming your perception on text generation
Recurrent Neural Network (RNN) based sequence-to-sequence models have gained a lot of traction since they were introduced in 2014. Much of the data in today's world comes in the form of sequences: a sequence of numbers, text, video frames, or audio. The most influential recent addition to these models is the attention mechanism. Attention marks the beginning of a new era: after Computer Vision, it is now the era of Natural Language Processing. In this lightning talk we walk you through the principles of the attention model for human-like text generation.
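As a rough illustration of the attention idea referenced above, the sketch below shows scaled dot-product attention in NumPy: each query position computes similarity scores against all keys, turns them into weights with a softmax, and takes a weighted sum of the values. The function name, shapes, and toy data are illustrative assumptions, not material from the talk.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(queries, keys, values):
    # queries: (n_q, d), keys: (n_k, d), values: (n_k, d_v)
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])  # similarity of each query to each key
    weights = softmax(scores, axis=-1)                    # attention weights, each row sums to 1
    return weights @ values, weights                      # weighted sum of values per query

# Toy example (made-up data): 2 decoder positions attending over 4 encoder states.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (2, 8) (2, 4)
```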