no code implementations • EACL (AdaptNLP) 2021 • Jezabel Garcia, Federica Freddi, Jamie McGowan, Tim Nieradzik, Feng-Ting Liao, Ye Tian, Da-Shan Shiu, Alberto Bernacchia
In meta-learning, the knowledge learned from previous tasks is transferred to new ones, but this transfer only works if tasks are related.
Cross-Lingual Natural Language Inference · Cross-Lingual Transfer · +1
no code implementations • 5 Mar 2024 • Chan-Jan Hsu, Chang-Le Liu, Feng-Ting Liao, Po-chun Hsu, Yi-Chang Chen, Da-Shan Shiu
Breeze-7B is an open-source language model based on Mistral-7B, designed to address the need for improved language comprehension and chatbot-oriented capabilities in Traditional Chinese.
1 code implementation • 15 Sep 2023 • Chan-Jan Hsu, Chang-Le Liu, Feng-Ting Liao, Po-chun Hsu, Yi-Chang Chen, Da-Shan Shiu
In an effort to advance the evaluation of language models in Traditional Chinese and stimulate further research in this field, we have open-sourced our benchmark and opened the model for trial.
1 code implementation • 18 Jul 2023 • Feng-Ting Liao, Yung-Chieh Chan, Yi-Chang Chen, Chan-Jan Hsu, Da-Shan Shiu
In this work, we propose a method to create domain-sensitive speech recognition models that exploit textual domain information by conditioning their generation on a given text prompt.
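The paper conditions the recognizer's generation on a text prompt inside the model itself. As a rough, hypothetical illustration of the general idea of injecting textual domain information into recognition, the toy sketch below biases candidate-token scores toward vocabulary found in a domain prompt (the `bias_scores` helper is invented for illustration and is not the authors' method):

```python
# Toy illustration only: boost the scores of recognizer outputs whose
# tokens appear in a domain text prompt. The actual paper conditions
# the model's generation directly; this merely shows the intuition of
# using textual domain information to steer recognition.

def bias_scores(scores, vocab, prompt, boost=2.0):
    """Add `boost` to the score of any token present in the prompt."""
    prompt_tokens = set(prompt.lower().split())
    return [
        s + boost if tok in prompt_tokens else s
        for s, tok in zip(scores, vocab)
    ]

vocab = ["cat", "cad", "dog"]
scores = [1.0, 1.2, 0.5]          # raw scores from a recognizer
biased = bias_scores(scores, vocab, "veterinary notes about a cat")
best = vocab[max(range(len(vocab)), key=lambda i: biased[i])]
```

Without the prompt, the acoustically similar "cad" would win; with the domain prompt mentioning "cat", the biased scores prefer it instead.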
no code implementations • 8 Mar 2021 • Jezabel R. Garcia, Federica Freddi, Feng-Ting Liao, Jamie McGowan, Tim Nieradzik, Da-Shan Shiu, Ye Tian, Alberto Bernacchia
We show that TreeMAML improves on state-of-the-art results for cross-lingual Natural Language Inference.
Cross-Lingual Natural Language Inference · Cross-Lingual Transfer · +2
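TreeMAML's core idea is that related tasks (e.g. languages from the same family) share meta-gradient information along a tree of task clusters. The sketch below illustrates only that aggregation step on scalar gradients, averaging first within clusters and then across them; tree construction and the MAML inner-loop adaptation are omitted, and the cluster names are invented for illustration:

```python
# Toy sketch of hierarchical gradient aggregation, the intuition
# behind TreeMAML: average per-task meta-gradients within each task
# cluster, then average the cluster means for the meta-update.

def mean(xs):
    return sum(xs) / len(xs)

def tree_aggregate(task_grads_by_cluster):
    """Two-level aggregation: within clusters, then across clusters."""
    cluster_means = [mean(g) for g in task_grads_by_cluster.values()]
    return mean(cluster_means)

# Hypothetical clusters of related tasks (e.g. language families),
# each with per-task meta-gradients on a single scalar parameter.
grads = {"romance": [1.0, 1.2], "germanic": [0.2, 0.4, 0.6]}
update = tree_aggregate(grads)
```

Averaging within clusters first keeps a large, homogeneous cluster from drowning out a small one, which is the sense in which the transfer respects task relatedness.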