Search Results for author: Min-Han Shih

Found 2 papers, 0 papers with code

Systematic Analysis for Pretrained Language Model Priming for Parameter-Efficient Fine-tuning

no code implementations • 2 Dec 2022 • Shih-Cheng Huang, Shih-Heng Wang, Min-Han Shih, Saurav Sahay, Hung-Yi Lee

To tackle these issues, we propose a general parameter-efficient (PE) priming framework to enhance and explore the few-shot adaptation and generalization ability of PE methods.

Domain Generalization · Language Modelling
