Search Results for author: Nayoung Lee

Found 2 papers, 2 papers with code

Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks

2 code implementations • 6 Feb 2024 • Jongho Park, Jaeseung Park, Zheyang Xiong, Nayoung Lee, Jaewoong Cho, Samet Oymak, Kangwook Lee, Dimitris Papailiopoulos

State-space models (SSMs), such as Mamba (Gu & Dao, 2023), have been proposed as alternatives to Transformer networks in language modeling, by incorporating gating, convolutions, and input-dependent token selection to mitigate the quadratic cost of multi-head attention.
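
The abstract's key architectural point, input-dependent token selection with a linear-time recurrent scan instead of quadratic attention, can be made concrete with a small numerical sketch. The toy below is an assumption-laden illustration (scalar inputs, a diagonal state matrix, no gating or convolution branch), not the Mamba implementation; all names and shapes are illustrative.

    import numpy as np

    def selective_ssm(x, A, W_B, W_C, w_dt):
        # x: (T,) scalar input sequence.  A: (N,) negative diagonal decay rates.
        # W_B, W_C: (N,) projections; w_dt: scalar controlling the step size.
        h = np.zeros_like(A)
        y = np.empty_like(x)
        for t, xt in enumerate(x):
            dt = np.log1p(np.exp(w_dt * xt))        # softplus keeps the step size positive
            B_t, C_t = W_B * xt, W_C * xt           # input-dependent ("selective") B and C
            h = np.exp(dt * A) * h + dt * B_t * xt  # zero-order-hold state update
            y[t] = float(h @ C_t)                   # readout; one pass over T tokens, O(T) total
        return y

    y = selective_ssm(np.random.randn(32), A=-np.linspace(0.5, 2.0, 8),
                      W_B=np.ones(8), W_C=np.ones(8) / 8, w_dt=1.0)

Because B_t and C_t depend on the current token, the state can selectively absorb or ignore inputs, which is the mechanism the abstract contrasts with multi-head attention's all-pairs cost.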

In-Context Learning • Language Modelling • +1

Teaching Arithmetic to Small Transformers

1 code implementation • 7 Jul 2023 • Nayoung Lee, Kartik Sreenivasan, Jason D. Lee, Kangwook Lee, Dimitris Papailiopoulos

Even in the complete absence of pretraining, this approach significantly and simultaneously improves accuracy, sample complexity, and convergence speed.
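
The excerpt refers to "this approach" without naming it; the paper studies data-formatting choices for training small transformers on arithmetic from scratch, one of which is writing the answer in reversed digit order so each output digit depends only on digits already seen plus a carry. Below is a minimal, hypothetical sketch of generating such samples; the exact string format is an assumption, not the paper's prompt format.

    import random

    def reversed_addition_sample(n_digits=3):
        # One training string for n-digit addition with the answer emitted
        # least-significant-digit first.
        a = random.randrange(10 ** n_digits)
        b = random.randrange(10 ** n_digits)
        return f"{a}+{b}={str(a + b)[::-1]}"   # e.g. "123+45=861", since 168 reversed is 861

    # dataset = [reversed_addition_sample() for _ in range(10_000)]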

Low-Rank Matrix Completion
