no code implementations • EMNLP 2020 • Husam Quteineh, Spyridon Samothrakis, Richard Sutcliffe
In this paper we propose a novel data augmentation approach in which guided outputs of a language generation model, e.g. GPT-2, once labeled, can improve the performance of text classifiers through an active learning process.
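The core active-learning step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the generator, classifier, and entropy-based uncertainty criterion are stand-ins, and the candidate texts and probability table are toy placeholders for GPT-2 outputs and classifier predictions.

```python
import math

def entropy(probs):
    """Shannon entropy of a predicted class distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_labeling(candidates, predict_proba, k=2):
    """Active-learning step: rank generated candidates by classifier
    uncertainty (entropy) and return the top-k for labeling."""
    scored = [(entropy(predict_proba(text)), text) for text in candidates]
    scored.sort(reverse=True)
    return [text for _, text in scored[:k]]

# Toy stand-ins for generator outputs and a classifier's probabilities.
candidates = ["great product", "terrible", "it arrived", "mixed feelings"]
fake_proba = {
    "great product": [0.95, 0.05],
    "terrible": [0.90, 0.10],
    "it arrived": [0.50, 0.50],
    "mixed feelings": [0.55, 0.45],
}
picked = select_for_labeling(candidates, lambda t: fake_proba[t], k=2)
print(picked)  # the two most uncertain candidates
```

Once the selected candidates are labeled, they are added to the training set and the classifier is retrained, closing the loop.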
no code implementations • COLING 2022 • Husam Quteineh, Spyridon Samothrakis, Richard Sutcliffe
To overcome this challenge, we present a novel approach in which knowledge is distilled from a teacher model to a student model through the generation of synthetic data.
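A minimal sketch of this distillation pipeline, under simplifying assumptions: the teacher is a fixed sigmoid stand-in rather than a trained model, the synthetic data are random scalars rather than generated text, and the student is a one-parameter logistic model fit by gradient descent on the teacher's soft labels.

```python
import math
import random

def teacher(x):
    """Stand-in teacher: soft probability of class 1 (slope 3 sigmoid)."""
    return 1 / (1 + math.exp(-3 * x))

# Step 1: generate synthetic inputs (here: random scalars).
random.seed(0)
synthetic = [random.uniform(-2, 2) for _ in range(200)]

# Step 2: the teacher labels the synthetic data with soft targets.
data = [(x, teacher(x)) for x in synthetic]

# Step 3: distill into a one-parameter student by gradient descent on
# the cross-entropy between teacher and student distributions
# (gradient of -(t*log p + (1-t)*log(1-p)) w.r.t. w is (p - t) * x).
w, lr = 0.0, 2.0
for _ in range(300):
    grad = sum((1 / (1 + math.exp(-w * x)) - t) * x for x, t in data)
    w -= lr * grad / len(data)

print(round(w, 2))  # the student approaches the teacher's slope of 3
```

The key property is that the student never sees real labeled data: the teacher's soft labels on synthetic inputs carry the knowledge being transferred.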
no code implementations • 8 May 2024 • Tommaso Pasini, Alejo López-Ávila, Husam Quteineh, Gerasimos Lampouras, Jinhua Du, Yubing Wang, Ze Li, Yusen Sun
We propose a novel fine-tuning approach that prepends the rhyming word to the start of each lyric. This allows the critical rhyming decision to be made before the model commits to the content of the lyric (as in reverse language modeling), while remaining compatible with the word order of regular PLMs, since the lyric itself is still generated left-to-right.
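The data-formatting idea can be sketched as a simple preprocessing step. This is an illustrative reconstruction, not the paper's code: the `<rhyme>` separator token and the assumption that the line's final word is the rhyming word are both hypothetical choices made here for the sketch.

```python
def prepend_rhyme(lyric, sep="<rhyme>"):
    """Format one lyric line for rhyme-first training: copy its final
    (assumed rhyming) word to the front, so a left-to-right model
    decides the rhyme before generating the rest of the line."""
    words = lyric.split()
    return f"{words[-1]} {sep} {lyric}"

line = "shadows dance across the floor"
print(prepend_rhyme(line))
# → "floor <rhyme> shadows dance across the floor"
```

At generation time, the model first emits (or is prompted with) the rhyming word, then produces the lyric normally in left-to-right order, so no reversal of the pretrained word order is needed.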