no code implementations • 19 May 2023 • Yiduo Guo, Yaobo Liang, Dongyan Zhao, Bing Liu, Nan Duan
Existing research has shown that a multilingual pre-trained language model fine-tuned on one (source) language also performs well on the same downstream tasks in non-source (target) languages, even though no fine-tuning is done in those languages.
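A minimal sketch of this zero-shot cross-lingual transfer setup, assuming the HuggingFace `transformers` and `datasets` libraries; the model (`xlm-roberta-base`), the task (XNLI), and the language pair (English to Spanish) are illustrative assumptions, not necessarily the paper's exact experimental configuration:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Multilingual pre-trained LM (illustrative choice, not the paper's).
model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

def tokenize(batch):
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=128, padding="max_length")

# Fine-tune on the source language (English) only.
train_en = load_dataset("xnli", "en", split="train[:2000]").map(tokenize, batched=True)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=train_en,
)
trainer.train()

# Evaluate on a non-source language (Spanish) with no Spanish fine-tuning:
# the transfer described above is what makes this zero-shot evaluation work.
test_es = load_dataset("xnli", "es", split="test").map(tokenize, batched=True)
print(trainer.evaluate(eval_dataset=test_es))
```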