Search Results for author: Xinduo Liu

Found 1 paper, 0 papers with code

Data-Free Distillation of Language Model by Text-to-Text Transfer

No code implementations · 3 Nov 2023 · Zheyuan Bai, Xinduo Liu, Hailin Hu, Tianyu Guo, Qinghua Zhang, Yunhe Wang

Data-Free Knowledge Distillation (DFKD) plays a vital role in compressing a model when the original training data is unavailable.
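For context, DFKD extends standard knowledge distillation, in which a student model is trained to match a teacher's temperature-softened output distribution, to the setting where no original training data is available. A minimal NumPy sketch of the classic distillation loss (function names and temperature value are illustrative, not taken from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) between softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)
```

When the student's logits match the teacher's exactly, the loss is zero; the data-free setting replaces the real inputs fed to both models with synthesized ones.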

Tasks: Data-free Knowledge Distillation · Language Modelling · +4
