no code implementations • 2 May 2024 • Yongxian Wei, Zixuan Hu, Zhenyi Wang, Li Shen, Chun Yuan, DaCheng Tao
Data-Free Meta-Learning (DFML) aims to extract knowledge from a collection of pre-trained models without requiring the original data, presenting practical benefits in contexts constrained by data privacy concerns.
no code implementations • 23 Nov 2023 • Zixuan Hu, Li Shen, Zhenyi Wang, Yongxian Wei, Baoyuan Wu, Chun Yuan, DaCheng Tao
Task-Distribution Shift (TDS) biases the meta-learner because the task distribution is skewed toward newly generated tasks.
1 code implementation • ICCV 2023 • Xiaotong Li, Zixuan Hu, Yixiao Ge, Ying Shan, Ling-Yu Duan
The experimental results on 10 downstream tasks and 12 self-supervised models demonstrate that our approach can seamlessly integrate into existing ranking techniques and enhance their performance, revealing its effectiveness for the model selection task and its potential for understanding the mechanisms of transfer learning.
1 code implementation • 28 May 2023 • Zixuan Hu, Li Shen, Zhenyi Wang, Baoyuan Wu, Chun Yuan, DaCheng Tao
Data-free meta-learning (DFML) aims to enable efficient learning of new tasks by meta-learning from a collection of pre-trained models without access to the training data.
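As a rough sketch of how such methods can operate without data (a generic illustration under assumed names such as `invert_model`, not necessarily this paper's procedure), pseudo inputs can be synthesized by inverting a frozen pre-trained classifier, and the recovered inputs then serve as a synthetic task for meta-training.

```python
# Illustrative model-inversion step for building pseudo tasks without real data.
# All names and hyperparameters are assumptions for the sketch, not the paper's code.
import torch
import torch.nn.functional as F

def invert_model(model, num_classes, num_per_class=5, shape=(3, 32, 32), steps=200, lr=0.1):
    """Optimize random inputs so the frozen pre-trained model labels them confidently."""
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)                     # keep the pre-trained model frozen
    targets = torch.arange(num_classes).repeat_interleave(num_per_class)
    x = torch.randn(len(targets), *shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), targets)   # push each input toward its target class
        loss.backward()
        opt.step()
    return x.detach(), targets                      # pseudo images and labels for one synthetic task
```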
1 code implementation • CVPR 2023 • Zixuan Hu, Li Shen, Zhenyi Wang, Tongliang Liu, Chun Yuan, DaCheng Tao
The goal of data-free meta-learning is to learn useful prior knowledge from a collection of pre-trained models without accessing their training data.
1 code implementation • 16 Jan 2023 • Xiaotong Li, Zixuan Hu, Jun Liu, Yixiao Ge, Yongxing Dai, Ling-Yu Duan
In this paper, we improve the network generalization ability by modeling domain shifts with uncertainty (DSU), i.e., characterizing the feature statistics as uncertain distributions during training.
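A minimal sketch of that idea, assuming a convolutional feature map of shape [B, C, H, W] (the function name and defaults below are illustrative, not the released implementation): per-channel feature statistics are treated as Gaussian variables whose spread is estimated across the batch, and re-sampled statistics re-normalize the features during training.

```python
import torch

def uncertain_stats_perturbation(x, eps=1e-6):
    """Re-normalize features with mean/std drawn from the estimated uncertainty of the statistics."""
    mu = x.mean(dim=[2, 3], keepdim=True)                    # per-sample, per-channel mean
    sig = (x.var(dim=[2, 3], keepdim=True) + eps).sqrt()     # per-sample, per-channel std
    mu_std = (mu.var(dim=0, keepdim=True) + eps).sqrt()      # uncertainty of the means across the batch
    sig_std = (sig.var(dim=0, keepdim=True) + eps).sqrt()    # uncertainty of the stds across the batch
    new_mu = mu + mu_std * torch.randn_like(mu)              # sample shifted statistics
    new_sig = sig + sig_std * torch.randn_like(sig)
    return ((x - mu) / sig) * new_sig + new_mu               # features under a simulated domain shift
```

In a training pipeline such a perturbation would typically be applied only during training and skipped at inference.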
1 code implementation • 29 Mar 2022 • Xiaotong Li, Yixiao Ge, Kun Yi, Zixuan Hu, Ying Shan, Ling-Yu Duan
Image BERT pre-training with masked image modeling (MIM) has become a popular practice for self-supervised representation learning.
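For context, the masking step underlying MIM can be sketched as follows (patch size, mask ratio, and the helper name are illustrative assumptions, not taken from the paper): the image is split into patches and a random subset is hidden so the model must predict their content.

```python
import torch

def random_patch_mask(images, patch_size=16, mask_ratio=0.6):
    """Patchify a batch of images and randomly mark a fraction of patches to be hidden."""
    b, c, h, w = images.shape
    patches = images.unfold(2, patch_size, patch_size).unfold(3, patch_size, patch_size)
    patches = patches.reshape(b, c, -1, patch_size, patch_size).permute(0, 2, 1, 3, 4)
    num_patches = patches.size(1)
    num_masked = int(mask_ratio * num_patches)
    rand_order = torch.rand(b, num_patches).argsort(dim=1)   # random patch order per image
    mask = torch.zeros(b, num_patches, dtype=torch.bool)
    mask.scatter_(1, rand_order[:, :num_masked], True)       # True = patch is hidden from the encoder
    return patches, mask                                     # [B, N, C, P, P] patches and [B, N] mask
```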
1 code implementation • 30 Jul 2020 • Yuchen Wang, Zixuan Hu, Barry C. Sanders, Sabre Kais
A qudit is a multi-level computational unit that serves as an alternative to the conventional two-level qubit.
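As a small numeric illustration (not from the paper), the d = 3 case, a qutrit, is a normalized vector in C^3, and the generalized Pauli-X (shift) gate cycles its basis states |0> -> |1> -> |2> -> |0>:

```python
import numpy as np

d = 3                                                # a qutrit: three-level qudit
state = np.zeros(d, dtype=complex)
state[0] = 1.0                                       # start in |0>

X_d = np.roll(np.eye(d, dtype=complex), 1, axis=0)   # generalized shift (Pauli-X) gate
print(X_d @ state)                                   # [0.+0.j, 1.+0.j, 0.+0.j]  ->  |1>
```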
Quantum Physics