1 code implementation • EACL (Hackashop) 2021 • Blaž Škrlj, Shane Sheehan, Nika Eržen, Marko Robnik-Šikonja, Saturnino Luz, Senja Pollak
Large pretrained language models using the transformer neural network architecture are becoming a dominant methodology for many natural language processing tasks, such as question answering, text classification, word sense disambiguation, text completion and machine translation.
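As a minimal illustration of the self-attention operation at the heart of the transformer architecture mentioned above, here is a pure-Python sketch of scaled dot-product attention. The function names and toy inputs are illustrative only and do not come from the listed papers.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention, the core operation of the
    transformer architecture. Q, K, V are lists of vectors
    (each vector a list of floats)."""
    d = len(K[0])  # key dimensionality, used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: two tokens with 2-dimensional representations.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(attention(Q, K, V))
```

Each output row is a convex combination of the value vectors, with weights determined by query-key similarity; stacking this operation with learned projections yields the multi-head attention layers these papers visualize and analyze.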
no code implementations • 17 Oct 2021 • Blaž Škrlj, Marko Jukič, Nika Eržen, Senja Pollak, Nada Lavrač
The COVID-19 pandemic triggered a wave of novel scientific literature that is impossible to manually inspect and study in a reasonable time frame.
1 code implementation • 12 May 2020 • Blaž Škrlj, Nika Eržen, Shane Sheehan, Saturnino Luz, Marko Robnik-Šikonja, Senja Pollak
Neural language models are becoming the prevailing methodology for the tasks of query answering, text classification, disambiguation, completion and translation.