no code implementations • RANLP 2019 • Mansour Saffar Mehrjardi, Amine Trabelsi, Osmar R. Zaiane
Self-attentional models are a new paradigm for sequence modelling tasks; unlike common sequence-modelling approaches such as recurrence-based and convolution-based learning, their architecture is based solely on the attention mechanism.
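As an illustrative sketch (not the paper's implementation), the core building block of such models is scaled dot-product self-attention, where each position in the sequence attends to every other position. A single-head NumPy version with randomly initialized projection matrices:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X (seq_len x d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # pairwise attention logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over positions
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))
out, w = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per input position
```

Unlike a recurrent layer, no step depends on the previous one, so all positions are processed in parallel; the attention weights form a full position-to-position matrix.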