no code implementations • 17 Jan 2024 • Renchunzi Xie, Ambroise Odonnat, Vasilii Feofanov, Ievgen Redko, Jianfeng Zhang, Bo An
Our key idea is that the model should be adjusted with gradients of larger magnitude when it fails to generalize to a distribution-shifted test dataset.
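The snippet above ties gradient magnitude to generalization under shift. As a minimal sketch of that intuition (not the paper's method), the toy below computes the gradient norm of a self-training cross-entropy loss, where a linear model's own argmax predictions serve as pseudo-labels; the model, data, and pseudo-label loss are all illustrative assumptions. A confident model leaves the predicted probabilities close to one-hot, so the gradient norm shrinks; low confidence on shifted data inflates it.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # stabilize exponentials
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def pseudo_label_grad_norm(W, X):
    """Norm of the cross-entropy gradient w.r.t. W, using the model's own
    argmax predictions as pseudo-labels (no ground truth needed).
    The farther p is from one-hot, the larger this norm tends to be."""
    p = softmax(X @ W)                        # (n, k) predicted probabilities
    y = np.eye(W.shape[1])[p.argmax(axis=1)]  # one-hot pseudo-labels
    grad = X.T @ (p - y) / len(X)             # d(mean CE)/dW under pseudo-labels
    return float(np.linalg.norm(grad))

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))       # hypothetical toy linear classifier
X_in = rng.normal(size=(200, 5))  # unlabeled evaluation batch
# A sharper (more confident) model yields a smaller gradient norm:
print(pseudo_label_grad_norm(W, X_in), pseudo_label_grad_norm(10 * W, X_in))
```

Scaling the weights by 10 sharpens every softmax toward one-hot, so the second norm comes out smaller; this is the signal an unsupervised accuracy estimator could threshold.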
no code implementations • 8 Dec 2022 • Hongxin Wei, Huiping Zhuang, Renchunzi Xie, Lei Feng, Gang Niu, Bo An, Yixuan Li
In the presence of noisy labels, designing robust loss functions is critical for securing the generalization performance of deep neural networks.
3 code implementations • 17 Jun 2022 • Hongxin Wei, Lue Tao, Renchunzi Xie, Lei Feng, Bo An
Deep neural networks usually perform poorly when the training dataset suffers from extreme class imbalance.
1 code implementation • 30 May 2022 • Huiping Zhuang, Zhenyu Weng, Hongxin Wei, Renchunzi Xie, Kar-Ann Toh, Zhiping Lin
Class-incremental learning (CIL) learns a classification model with training data of different classes arising progressively.
2 code implementations • 19 May 2022 • Hongxin Wei, Renchunzi Xie, Hao Cheng, Lei Feng, Bo An, Yixuan Li
Our method is motivated by the analysis that the norm of the logits keeps increasing during training, leading to overconfident outputs.
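Since the growing logit norm is what drives the overconfidence described above, one remedy along these lines is to compute cross-entropy on L2-normalized logits. The sketch below is an assumed, simplified form of logit normalization, with an arbitrary illustrative temperature, not necessarily the paper's exact formulation.

```python
import numpy as np

def logit_norm_loss(logits, labels, tau=0.04):
    """Cross-entropy on L2-normalized logits (illustrative sketch).

    Dividing each logit vector by its norm decouples the loss from logit
    magnitude, which otherwise grows during training and inflates confidence.
    tau is a temperature; 0.04 is an arbitrary illustrative value.
    """
    norms = np.linalg.norm(logits, axis=1, keepdims=True) + 1e-7
    z = logits / (norms * tau)                 # unit-norm logits, rescaled
    z = z - z.max(axis=1, keepdims=True)       # stabilize log-sum-exp
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_p[np.arange(len(labels)), labels].mean())

logits = np.array([[2.0, -1.0, 0.5],
                   [0.1,  0.3, -0.2]])
labels = np.array([0, 1])
# Scaling the logits no longer changes the loss, so training cannot
# reduce it just by inflating logit magnitude:
print(logit_norm_loss(logits, labels), logit_norm_loss(5.0 * logits, labels))
```

The key property to check is invariance: multiplying the logits by any positive constant leaves the normalized loss (essentially) unchanged, whereas plain cross-entropy would keep falling as the logits grow.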
3 code implementations • 16 Jan 2022 • Renchunzi Xie, Hongxin Wei, Lei Feng, Bo An
Although there have been a few studies on this problem, most of them only exploit unidirectional relationships from the source domain to the target domain.
no code implementations • 29 Sep 2021 • Hongxin Wei, Lue Tao, Renchunzi Xie, Lei Feng, Bo An
Deep neural networks usually perform poorly when the training dataset suffers from extreme class imbalance.
1 code implementation • 5 Sep 2021 • Renchunzi Xie, Mahardhika Pratama
Knowledge transfer across several streaming processes remains a challenging problem, not only because each stream follows a different distribution but also because data-stream environments change rapidly and never end.
4 code implementations • NeurIPS 2021 • Hongxin Wei, Lue Tao, Renchunzi Xie, Bo An
Learning with noisy labels is a practically challenging problem in weakly supervised learning.
2 code implementations • 8 Oct 2019 • Mahardhika Pratama, Marcus de Carvalho, Renchunzi Xie, Edwin Lughofer, Jie Lu
It automatically evolves its network structure from scratch, with or without access to ground truth, to overcome independent concept drifts in the source and target domains.