1 code implementation • NAACL (ACL) 2022 • Seung Byum Seo, Hyoungwook Nam, Payam Delgosha
While there have been advances in Natural Language Processing (NLP), their success has mainly been gained by applying self-attention mechanisms to single or multiple modalities.
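For context on the mechanism this abstract refers to, below is a minimal NumPy sketch of standard scaled dot-product self-attention. It is illustrative only and not code from the paper; the projection matrices and dimensions are hypothetical.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Standard scaled dot-product self-attention (illustrative sketch).

    x: (seq_len, d_model) token embeddings
    wq, wk, wv: (d_model, d_k) projection matrices (hypothetical names)
    """
    q, k, v = x @ wq, x @ wk, x @ wv           # project tokens to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])    # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key axis
    return weights @ v                         # each output is a weighted mix of values

# Toy usage: 4 tokens, model width 8
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
wq, wk, wv = (rng.standard_normal((8, 8)) for _ in range(3))
print(self_attention(x, wq, wk, wv).shape)  # (4, 8)
```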
no code implementations • 18 Feb 2023 • Hyoungwook Nam, Seung Byum Seo
We propose a novel perspective on the attention mechanism by reinventing it as a memory architecture for neural networks, namely Neural Attention Memory (NAM).
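To make the attention-as-memory idea concrete, here is a minimal sketch under the assumption of a linear-attention-style key-value memory: writes accumulate outer products of keys and values, and reads retrieve values by dot-product match against the stored keys. This is a hypothetical illustration of the general concept, not the NAM operations defined in the paper.

```python
import numpy as np

class KeyValueMemory:
    """Attention-as-memory sketch (hypothetical; not the paper's NAM code).

    The memory is a d_k x d_v matrix M. Writing adds the outer product of a
    key and a value; reading multiplies a query into M, so a query matching
    a stored key retrieves (a mixture of) the associated values.
    """

    def __init__(self, d_k, d_v):
        self.M = np.zeros((d_k, d_v))

    def write(self, key, value):
        # Accumulate an association: M <- M + k v^T
        self.M += np.outer(key, value)

    def read(self, query):
        # Retrieve: q^T M = sum_i (q . k_i) v_i, a dot-product-weighted sum
        return query @ self.M

# Toy usage: store two associations with orthogonal unit keys
mem = KeyValueMemory(d_k=4, d_v=3)
mem.write(np.array([1., 0., 0., 0.]), np.array([1., 2., 3.]))
mem.write(np.array([0., 1., 0., 0.]), np.array([4., 5., 6.]))
print(mem.read(np.array([1., 0., 0., 0.])))  # [1. 2. 3.]: recovers the first value
```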
1 code implementation • 18 Jun 2020 • Hyoungwook Nam, Seung Byum Seo, Vikram Sharma Mailthody, Noor Michael, Lan Li
The model inductively generalizes on a variety of algorithmic tasks where state-of-the-art Transformer models fail.