31 Mar 2024 • Nathan Cornille, Marie-Francine Moens, Florian Mai
By training to predict the next token in an unlabeled corpus, large language models learn to perform many tasks without any labeled data.
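The next-token-prediction objective described above can be sketched in miniature. The example below is a hedged illustration, not the paper's method: it fits a toy bigram model on a tiny hypothetical "unlabeled corpus" and computes the average negative log-likelihood of each next token, the same self-supervised loss that large language models minimize at scale.

```python
import math
from collections import Counter, defaultdict

# Hypothetical unlabeled corpus (illustration only, not from the paper).
corpus = "the cat sat on the mat the cat ate".split()

# "Train" by counting bigrams: estimate p(next | prev) from raw text alone.
bigram = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram[prev][nxt] += 1

def next_token_prob(prev: str, nxt: str) -> float:
    counts = bigram[prev]
    total = sum(counts.values())
    return counts[nxt] / total if total else 0.0

# Self-supervised objective: mean negative log-likelihood of each next token.
# No labels are used; the targets are just the corpus shifted by one position.
nll = -sum(
    math.log(next_token_prob(p, n))
    for p, n in zip(corpus, corpus[1:])
) / (len(corpus) - 1)
print(round(nll, 4))
```

A real language model replaces the bigram counts with a neural network, but the loss being minimized is the same quantity.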
Language Modelling · Self-Supervised Learning