1 code implementation • 3 May 2024 • Canhui Tang, Sanping Zhou, Yizhe Li, Yonghao Dong, Le Wang
The success of knowledge distillation mainly relies on maintaining a feature discrepancy between the teacher and student models, under two assumptions: (1) the teacher model can jointly represent two different distributions for the normal and abnormal patterns, while (2) the student model can only reconstruct the normal distribution.
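The teacher-student assumption above can be illustrated with a minimal sketch. Everything here is a hypothetical linear stand-in, not the authors' networks: a frozen "teacher" extracts features, a "student" is fitted on normal data only, and the per-sample feature discrepancy serves as the anomaly signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen linear "teacher" feature extractor (hypothetical stand-in
# for a pretrained network).
W_t = rng.normal(size=(16, 32))

def teacher(x):
    return x @ W_t

# Normal samples lie in a 4-D subspace of the 16-D input space.
basis = rng.normal(size=(4, 16))
x_normal = rng.normal(size=(512, 4)) @ basis

# The "student" is fitted to mimic the teacher on normal data only;
# least squares gives the minimum-norm solution on that subspace.
W_s, *_ = np.linalg.lstsq(x_normal, teacher(x_normal), rcond=None)

def student(x):
    return x @ W_s

def anomaly_score(x):
    # Per-sample teacher-student feature discrepancy.
    return np.linalg.norm(teacher(x) - student(x), axis=-1)

# The student matches the teacher on normal data but, having never seen
# abnormal patterns, fails to reproduce the teacher's features off the
# normal subspace -- the discrepancy exposes the anomaly.
score_normal = anomaly_score(rng.normal(size=(64, 4)) @ basis).mean()
score_abnormal = anomaly_score(rng.normal(size=(64, 16))).mean()
```

In this toy setting the discrepancy is essentially zero on held-out normal samples and large on inputs outside the normal subspace, mirroring assumptions (1) and (2).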
1 code implementation • 5 May 2023 • Canhui Tang, Yiheng Li, Shaoyi Du, Guofa Wang, Zhiqiang Tian
Feature descriptors and detectors are the two main components of feature-based point cloud registration.
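The detector/descriptor split can be sketched with a toy pipeline; every component below is a simplistic stand-in, not the paper's method: a detector selects keypoints, a rigid-invariant descriptor characterizes them, matches are found by nearest neighbour, and the rigid transform is recovered in closed form (Kabsch/SVD).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy source cloud and a rigidly transformed copy (known ground truth).
src = rng.uniform(size=(200, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 0.1])
dst = src @ R_true.T + t_true

def detect(pts, m=32):
    # "Detector" stand-in: keep the m points farthest from the centroid
    # (centroid distances are preserved by rigid motion).
    d = np.linalg.norm(pts - pts.mean(0), axis=1)
    return np.argsort(-d)[:m]

def describe(pts, idx, k=8):
    # "Descriptor" stand-in: sorted distances to the k nearest
    # neighbours, which are invariant to rigid motion.
    d = np.linalg.norm(pts[idx][:, None] - pts[None, :], axis=-1)
    return np.sort(d, axis=1)[:, 1:k + 1]

i_src, i_dst = detect(src), detect(dst)
f_src, f_dst = describe(src, i_src), describe(dst, i_dst)

# Match descriptors by nearest neighbour in feature space.
cost = np.linalg.norm(f_src[:, None] - f_dst[None, :], axis=-1)
p = src[i_src]
q = dst[i_dst][cost.argmin(axis=1)]

# Closed-form rigid alignment (Kabsch) from the matched pairs.
pc, qc = p - p.mean(0), q - q.mean(0)
U, _, Vt = np.linalg.svd(pc.T @ qc)
S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
R_est = Vt.T @ S @ U.T
t_est = q.mean(0) - p.mean(0) @ R_est.T
```

With noise-free data and invariant descriptors the recovered `R_est`, `t_est` match the ground truth; real descriptors and detectors earn their keep when noise, partial overlap, and outlier matches enter the picture.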
1 code implementation • 29 Mar 2023 • Yiheng Li, Canhui Tang, Runzhao Yao, Aixue Ye, Feng Wen, Shaoyi Du
First, we propose using salient points with prominent local features as nodes to increase patch repeatability, and introduce uniformly distributed points to cover the rest of the point cloud, thus constituting hybrid points.
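The hybrid-point idea can be sketched as follows. The saliency measure (neighbour-distance variance) and the uniform sampler (farthest point sampling) are illustrative stand-ins, not the paper's actual criteria: top-saliency points supply repeatable nodes, and FPS points fill in uniform coverage.

```python
import numpy as np

rng = np.random.default_rng(1)
points = rng.uniform(size=(1000, 3))  # toy point cloud

def saliency(pts, k=16):
    # Stand-in saliency: variance of distances to the k nearest
    # neighbours; points in geometrically distinctive regions score high.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    knn = np.sort(d, axis=1)[:, 1:k + 1]
    return knn.var(axis=1)

def farthest_point_sampling(pts, m):
    # Greedy FPS: repeatedly pick the point farthest from all picks so
    # far, giving a roughly uniform spatial cover.
    chosen = [0]
    dist = np.linalg.norm(pts - pts[0], axis=1)
    for _ in range(m - 1):
        idx = int(dist.argmax())
        chosen.append(idx)
        dist = np.minimum(dist, np.linalg.norm(pts - pts[idx], axis=1))
    return np.array(chosen)

n_salient, n_uniform = 64, 64
salient_idx = np.argsort(-saliency(points))[:n_salient]
uniform_idx = farthest_point_sampling(points, n_uniform)

# Hybrid points: salient nodes plus uniformly distributed complements.
hybrid_idx = np.unique(np.concatenate([salient_idx, uniform_idx]))
hybrid_points = points[hybrid_idx]
```

The two index sets may overlap, so the hybrid set holds between 64 and 128 points here; the salient subset drives patch repeatability while the FPS subset keeps the whole cloud represented.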