1 code implementation • 12 Dec 2023 • Lifeng Han, Serge Gladkoff, Gleb Erofeev, Irina Sorokina, Betty Galiano, Goran Nenadic
Furthermore, to address the language resource imbalance issue, we also carry out experiments using a transfer learning methodology based on massive multilingual pre-trained language models (MMPLMs).
no code implementations • 31 Jul 2023 • Serge Gladkoff, Gleb Erofeev, Irina Sorokina, Lifeng Han, Goran Nenadic
Translation Quality Evaluation (TQE) is an essential step of the modern translation production process.
no code implementations • 12 Oct 2022 • Lifeng Han, Gleb Erofeev, Irina Sorokina, Serge Gladkoff, Goran Nenadic
To the best of our knowledge, this is the first work to successfully apply MMPLMs to clinical-domain transfer-learning NMT for languages entirely unseen during pre-training.
no code implementations • 15 Sep 2022 • Lifeng Han, Gleb Erofeev, Irina Sorokina, Serge Gladkoff, Goran Nenadic
Pre-trained language models (PLMs) often take advantage of monolingual and multilingual datasets that are freely available online to acquire general or mixed-domain knowledge before being deployed to specific tasks.
1 code implementation • WMT (EMNLP) 2021 • Lifeng Han, Irina Sorokina, Gleb Erofeev, Serge Gladkoff
Then we present customised hLEPOR (cushLEPOR), which uses the Optuna hyper-parameter optimisation framework to fine-tune hLEPOR's weighting parameters so that its scores agree more closely with those of pre-trained language models (using LaBSE) on the exact MT language pairs to which cushLEPOR is deployed.
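The tuning idea described above can be sketched in miniature: search for a weighting parameter that maximises the correlation between a weighted metric's segment scores and a set of target scores (standing in for LaBSE-based scores). This is a hypothetical illustration using a plain random search in place of Optuna's samplers; all component and target values are toy data, not from the cushLEPOR paper.

```python
import random

# Toy per-segment scores for two metric components (stand-ins for hLEPOR's
# internal factors) and target scores we want the tuned metric to agree with
# (stand-ins for LaBSE-based scores). Purely illustrative data.
component_a = [0.8, 0.5, 0.9, 0.3, 0.7]
component_b = [0.6, 0.7, 0.4, 0.5, 0.9]
target = [0.75, 0.55, 0.70, 0.40, 0.80]

def correlation(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def metric_scores(w):
    """Weighted combination of the two components, analogous to a
    tunable weighting parameter in a composite MT metric."""
    return [w * a + (1 - w) * b for a, b in zip(component_a, component_b)]

# Random search over the weight (Optuna would do this more cleverly,
# e.g. with TPE sampling); fixed seed for reproducibility.
random.seed(0)
best_w, best_corr = 0.0, float("-inf")
for _ in range(200):
    w = random.random()
    c = correlation(metric_scores(w), target)
    if c > best_corr:
        best_w, best_corr = w, c

print(f"best weight={best_w:.3f}, correlation={best_corr:.3f}")
```

In the actual cushLEPOR setting, the objective would compare metric scores against LaBSE-derived quality scores over a development set for a specific language pair, and Optuna's `suggest_float` calls would propose each weighting parameter per trial.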