1 code implementation • 3 May 2024 • Canhui Tang, Sanping Zhou, Yizhe Li, Yonghao Dong, Le Wang
The success of knowledge distillation mainly relies on maintaining a feature discrepancy between the teacher and student models, under two assumptions: (1) the teacher model can jointly represent the two different distributions of normal and abnormal patterns, while (2) the student model can only reconstruct the normal distribution.
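The idea above can be sketched as a simple anomaly scorer: where the student mimics the teacher well (normal patterns) the feature discrepancy is small, and where it cannot reconstruct (abnormal patterns) the discrepancy is large. This is a minimal illustrative sketch, not the paper's method; the function name and toy features are hypothetical.

```python
import numpy as np

def anomaly_score(teacher_feat, student_feat):
    """Per-location anomaly score as the L2 feature discrepancy
    between teacher and student features (hypothetical sketch)."""
    return np.linalg.norm(teacher_feat - student_feat, axis=-1)

# Toy example: on normal patterns the student mimics the teacher,
# so the score stays low; on abnormal patterns the student fails
# to reconstruct the teacher features, so the score is high.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(4, 4, 8))               # (H, W, C) feature map
student_normal = teacher + 0.01 * rng.normal(size=(4, 4, 8))   # close mimic
student_abnormal = rng.normal(size=(4, 4, 8))                  # poor reconstruction

assert anomaly_score(teacher, student_normal).mean() < \
       anomaly_score(teacher, student_abnormal).mean()
```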
no code implementations • 17 Nov 2023 • Yizhe Li, Sanping Zhou, Zheng Qin, Le Wang, Jinjun Wang, Nanning Zheng
In this paper, we propose a simple yet effective two-stage feature learning paradigm to jointly learn single-shot and multi-shot features for different targets, so as to achieve robust data association in the tracking process.
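One way to picture joint single-shot and multi-shot features for association is to fuse a track's current-frame (single-shot) embedding with its history-averaged (multi-shot) embedding when scoring detections. The sketch below is a hypothetical illustration of that fusion, not the paper's two-stage paradigm; the weighting and greedy matching are assumptions.

```python
import numpy as np

def cosine(a, b):
    """Pairwise cosine similarity between rows of a and rows of b."""
    return (a @ b.T) / (np.linalg.norm(a, axis=1, keepdims=True)
                        * np.linalg.norm(b, axis=1, keepdims=True).T)

def associate(track_single, track_multi, det_feats, w=0.5):
    """Fuse single-shot (current-frame) and multi-shot (history-averaged)
    track features into one affinity, then greedily pick the best
    detection per track (hypothetical sketch)."""
    affinity = (w * cosine(track_single, det_feats)
                + (1 - w) * cosine(track_multi, det_feats))
    return affinity.argmax(axis=1)

# Two tracks with distinct feature directions, two matching detections.
t_single = np.array([[1.0, 0.0], [0.0, 1.0]])
t_multi = np.array([[0.9, 0.1], [0.1, 0.9]])
dets = np.array([[0.8, 0.2], [0.2, 0.8]])
assert list(associate(t_single, t_multi, dets)) == [0, 1]
```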
1 code implementation • ICCV 2023 • Yizhe Li, Yu-Lin Tsai, Xuebin Ren, Chia-Mu Yu, Pin-Yu Chen
Visual Prompting (VP) is an emerging and powerful technique that allows sample-efficient adaptation to downstream tasks by engineering a well-trained frozen source model.
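A common form of visual prompting pads a learnable pixel frame around the input image while the source model stays frozen, so only the prompt parameters are trained. The sketch below illustrates that input-space transformation only; the function name, shapes, and padding width are assumptions, not the paper's exact design.

```python
import numpy as np

def apply_visual_prompt(image, prompt, pad=4):
    """Place the image inside a (hypothetically learnable) pixel-prompt
    border; the frozen source model would consume the padded result,
    and gradients would update only `prompt`."""
    h, w, c = image.shape
    assert prompt.shape == (h + 2 * pad, w + 2 * pad, c)
    out = prompt.copy()
    out[pad:pad + h, pad:pad + w, :] = image  # centre the original pixels
    return out

img = np.zeros((28, 28, 3))          # downstream-task input
prompt = np.ones((36, 36, 3))        # trainable border parameters
prompted = apply_visual_prompt(img, prompt, pad=4)
assert prompted.shape == (36, 36, 3)
```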
1 code implementation • 19 Sep 2021 • Zhenhong Zou, Yizhe Li
To demonstrate the benefits of our method, we conduct various experiments on the SensatUrban dataset, in which our model presents competitive evaluation results (61.17% mIoU and 91.37% Overall Accuracy).
Ranked #2 on 3D Semantic Segmentation on SensatUrban