Findings of the Association for Computational Linguistics 2020 • Wen Tai, H. T. Kung, Xin Dong, Marcus Comiter, Chang-Fu Kuo
We introduce exBERT, a training method that extends a BERT model pre-trained on a general domain into a new pre-trained model for a specific domain by adding an extension vocabulary, under constrained training resources (i.e., limited computation and data).
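To make the additive-vocabulary idea concrete, here is a minimal illustrative sketch (not the authors' implementation; all names are hypothetical): domain-specific tokens receive fresh IDs appended after the frozen general-domain vocabulary, so the original pre-trained embeddings stay untouched and only the rows for new tokens would need training.

```python
# Illustrative sketch of an additive vocabulary (hypothetical names,
# not from the exBERT codebase). New domain terms get IDs appended
# after the original vocabulary; lookups fall back to the base vocab.

class AdditiveVocab:
    def __init__(self, base_vocab, unk_token="[UNK]"):
        self.base = dict(base_vocab)   # frozen general-domain vocabulary
        self.ext = {}                  # new domain-specific tokens
        self.unk = self.base[unk_token]

    def add_tokens(self, tokens):
        """Register unseen domain tokens with fresh, appended IDs."""
        for tok in tokens:
            if tok not in self.base and tok not in self.ext:
                self.ext[tok] = len(self.base) + len(self.ext)

    def token_to_id(self, tok):
        # Base vocab first, then extension, then the unknown token.
        return self.base.get(tok, self.ext.get(tok, self.unk))

    def __len__(self):
        return len(self.base) + len(self.ext)


base = {"[UNK]": 0, "the": 1, "patient": 2}
vocab = AdditiveVocab(base)
vocab.add_tokens(["nephropathy", "patient"])  # "patient" already known
print(vocab.token_to_id("nephropathy"))       # new ID appended: 3
print(len(vocab))                             # 4
```

In this toy version, only the extension table grows; an analogous split between a frozen base embedding matrix and a small trainable extension matrix is what keeps the training cost of such domain adaptation low.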