no code implementations • 28 Nov 2022 • Quan Feng, Jiayu Yao, Zhison Pan, Guojun Zhou
Therefore, a more realistic strategy is semi-supervised learning (SSL), which leverages a small amount of labeled data together with a large amount of unlabeled data.
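One common way to realize this strategy is self-training (pseudo-labeling): fit a model on the small labeled set, assign labels to unlabeled points the model is confident about, and refit on the enlarged set. The sketch below illustrates this on 1-D toy data with a nearest-centroid classifier and a distance-margin confidence rule; both choices are illustrative assumptions, not the paper's method.

```python
# Minimal self-training sketch: fit on labeled data, pseudo-label
# confident unlabeled points, refit. Nearest-centroid classifier and
# the confidence margin are illustrative assumptions.

def centroids(points):
    """Mean of each class's 1-D points: {label: [x, ...]} -> {label: mean}."""
    return {c: sum(xs) / len(xs) for c, xs in points.items()}

def predict(cents, x):
    """Assign x to the class with the nearest centroid."""
    return min(cents, key=lambda c: abs(cents[c] - x))

def self_train(labeled, unlabeled, margin=1.0):
    """One round of self-training on 1-D data.

    labeled:   {class_label: [x, ...]}
    unlabeled: [x, ...]
    margin:    min gap between the two nearest centroids to accept a pseudo-label
    """
    cents = centroids(labeled)
    augmented = {c: list(xs) for c, xs in labeled.items()}
    for x in unlabeled:
        dists = sorted(abs(m - x) for m in cents.values())
        if len(dists) > 1 and dists[1] - dists[0] >= margin:  # confident only
            augmented[predict(cents, x)].append(x)
    return centroids(augmented)  # refit with pseudo-labels included

labeled = {0: [0.0, 1.0], 1: [9.0, 10.0]}
unlabeled = [0.5, 1.5, 8.5, 9.5, 5.0]  # 5.0 is ambiguous and gets skipped
cents = self_train(labeled, unlabeled)
print(predict(cents, 2.0), predict(cents, 8.0))  # -> 0 1
```

The margin rule is the crucial piece: accepting low-confidence pseudo-labels (like the ambiguous point at 5.0) would let early mistakes reinforce themselves in later rounds.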
no code implementations • 11 Apr 2022 • Jiayu Yao, Qingyuan Wu, Quan Feng, Songcan Chen
Self-supervised learning (SSL), a newly emerging unsupervised representation learning paradigm, generally follows a two-stage learning pipeline: 1) learning invariant and discriminative representations via automatically annotated pretext task(s), then 2) transferring these representations to assist downstream task(s).
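The two-stage pipeline can be rendered as a toy sketch: stage 1 fits representation parameters on unlabeled data (here just standardization statistics, standing in for a real pretext task), and stage 2 transfers the frozen encoder to a downstream 1-NN classifier. All function names and data are illustrative assumptions, not the paper's algorithm.

```python
# Toy two-stage SSL pipeline: "pretrain" an encoder on unlabeled data,
# then reuse it for a downstream task on a small labeled set.

def pretrain(unlabeled):
    """Stage 1: fit representation parameters on unlabeled data only.

    Standardization stands in for a real pretext task here.
    """
    mean = sum(unlabeled) / len(unlabeled)
    var = sum((x - mean) ** 2 for x in unlabeled) / len(unlabeled)
    std = var ** 0.5 or 1.0
    return lambda x: (x - mean) / std  # encoder: input -> representation

def downstream_1nn(encode, labeled, x):
    """Stage 2: classify x by its nearest labeled neighbor in representation space."""
    z = encode(x)
    return min(labeled, key=lambda pair: abs(encode(pair[0]) - z))[1]

unlabeled = [1.0, 2.0, 3.0, 101.0, 102.0, 103.0]  # plentiful, no labels
encoder = pretrain(unlabeled)                      # stage 1: pretext fit
labeled = [(2.0, "low"), (102.0, "high")]          # scarce labeled set
print(downstream_1nn(encoder, labeled, 4.0))       # -> low
```

The point of the structure is the decoupling: `pretrain` never sees a label, and the downstream step never re-estimates the representation, mirroring the pretext-then-transfer split described above.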
no code implementations • 7 Jan 2022 • Quan Feng, Songcan Chen
Multi-task learning (MTL) aims to improve model performance by transferring and exploiting common knowledge among tasks.
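A standard way to share knowledge among tasks is hard parameter sharing: some parameters are common to all tasks while each task keeps its own head. The sketch below fits two toy regression tasks that share one weight `w` (the "common knowledge") while each keeps its own bias; the data and learning rate are made up for illustration and this is not the paper's method.

```python
# Hard-parameter-sharing sketch: two regression tasks jointly train a
# shared weight w, each with a task-specific bias. Pure-Python gradient
# descent on squared error; all data is synthetic.

# Both tasks follow y = 2*x + b_t with different biases.
task1 = [(x, 2.0 * x + 1.0) for x in range(5)]   # true b1 = 1
task2 = [(x, 2.0 * x - 3.0) for x in range(5)]   # true b2 = -3

w, b1, b2 = 0.0, 0.0, 0.0
lr = 0.02
for _ in range(5000):
    gw = gb1 = gb2 = 0.0
    for x, y in task1:                 # task 1 gradients of squared error
        err = w * x + b1 - y
        gw += err * x
        gb1 += err
    for x, y in task2:                 # task 2 also updates the shared w
        err = w * x + b2 - y
        gw += err * x
        gb2 += err
    w -= lr * gw / 10                  # shared weight: averaged over all 10 examples
    b1 -= lr * gb1 / 5                 # task-specific biases: averaged per task
    b2 -= lr * gb2 / 5

print(round(w, 2), round(b1, 2), round(b2, 2))  # -> 2.0 1.0 -3.0
```

Because both tasks contribute gradients to `w`, each task effectively sees twice the data for the shared part of the model, which is the transfer benefit the sentence above refers to.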
no code implementations • 29 Jan 2021 • Quan Feng, Songcan Chen
However, to the best of our knowledge, there are few studies on the twofold heterogeneous MTL (THMTL) scenario, where the input and the output spaces are both inconsistent or heterogeneous.