no code implementations • CVPR 2022 • Xinke Li, Henghui Ding, Zekun Tong, Yuwei Wu, Yeow Meng Chee
Further study suggests that our strategy can improve model performance through a pretraining and fine-tuning scheme, especially for small-scale datasets.
1 code implementation • 3 Mar 2022 • Yongxing Dai, Yifan Sun, Jun Liu, Zekun Tong, Yi Yang, Ling-Yu Duan
Instead of directly aligning the source and target domains against each other, we propose to align the source and target domains against their intermediate domains for a smooth knowledge transfer.
1 code implementation • NeurIPS 2021 • Zekun Tong, Yuxuan Liang, Henghui Ding, Yongxing Dai, Xinke Li, Changhu Wang
However, it is still in its infancy, with two concerns: 1) changing the graph structure through data augmentation to generate contrastive views may mislead the message passing scheme, as such changes discard intrinsic graph structural information, especially the directional structure in directed graphs; 2) since GCL usually uses predefined contrastive views with hand-picked parameters, it does not take full advantage of the contrastive information provided by data augmentation, leaving models with incomplete structural information.
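To make the first concern concrete, a common augmentation for generating contrastive views is random edge dropping; a minimal sketch (the function name and setup are illustrative, not the paper's code) shows how, on a directed graph, each dropped edge also erases the one-way relation that message passing would have used:

```python
import numpy as np

def drop_edges(adj, drop_rate, seed=0):
    """Create a contrastive view by randomly dropping directed edges.
    Illustrative sketch: each removed edge (u, v) also removes the
    u -> v directional information the message passing scheme relies on.
    """
    rng = np.random.default_rng(seed)
    view = adj.copy()
    rows, cols = np.nonzero(adj)          # indices of existing edges
    mask = rng.random(len(rows)) < drop_rate
    view[rows[mask], cols[mask]] = 0.0    # zero out the sampled edges
    return view

# A small directed cycle: 0 -> 1 -> 2 -> 0
adj = np.array([[0., 1., 0.],
                [0., 0., 1.],
                [1., 0., 0.]])
view = drop_edges(adj, drop_rate=0.5)
```

Since the view only ever removes edges, it is structurally "poorer" than the original graph, which is exactly the information loss the snippet above warns about.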
3 code implementations • ICCV 2021 • Yongxing Dai, Jun Liu, Yifan Sun, Zekun Tong, Chi Zhang, Ling-Yu Duan
To ensure that these two properties better characterize appropriate intermediate domains, we enforce bridge losses on the intermediate domains' prediction and feature spaces, and a diversity loss on the two domain factors.
Domain Adaptive Person Re-Identification • Person Re-Identification
no code implementations • CVPR 2021 • Yongxing Dai, Xiaotong Li, Jun Liu, Zekun Tong, Ling-Yu Duan
Specifically, we propose a decorrelation loss that makes the source-domain networks (experts) preserve the diversity and discriminability of their individual domains' characteristics.
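One generic way to realize a decorrelation objective between two experts' features (a sketch of the idea only, not the paper's exact formulation) is to penalize the cross-correlation between their feature dimensions:

```python
import numpy as np

def decorrelation_loss(feat_a, feat_b):
    """Penalize cross-correlation between two experts' feature batches
    (shape: batch x dim) so each expert keeps its own domain's
    characteristics. Generic sketch, not the paper's exact loss."""
    # Center and L2-normalize each feature dimension across the batch.
    a = feat_a - feat_a.mean(axis=0)
    b = feat_b - feat_b.mean(axis=0)
    a = a / (np.linalg.norm(a, axis=0, keepdims=True) + 1e-8)
    b = b / (np.linalg.norm(b, axis=0, keepdims=True) + 1e-8)
    cross = a.T @ b  # (dim, dim) cosine similarities between dimensions
    return float(np.mean(cross ** 2))
```

Identical feature batches give a high loss (perfectly correlated dimensions), while independent random features give a loss near zero, so minimizing it pushes the experts apart.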
no code implementations • ICCV 2021 • Xinke Li, Zhirui Chen, Yue Zhao, Zekun Tong, Yabang Zhao, Andrew Lim, Joey Tianyi Zhou
We present backdoor attacks on 3D point clouds with a unified framework that exploits the unique properties of 3D data and networks.
1 code implementation • 26 Dec 2020 • Yongxing Dai, Jun Liu, Yan Bai, Zekun Tong, Ling-Yu Duan
To this end, we propose a novel approach, called Dual-Refinement, that jointly refines pseudo labels in the off-line clustering phase and features in the on-line training phase, alternately boosting label purity and feature discriminability in the target domain for more reliable re-ID.
1 code implementation • NeurIPS 2020 • Zekun Tong, Yuxuan Liang, Changsheng Sun, Xinke Li, David Rosenblum, Andrew Lim
Graph Convolutional Networks (GCNs) have shown promising results in modeling graph-structured data.
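For context, the standard GCN propagation rule that this line of work builds on (Kipf & Welling's H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), shown here as a baseline rather than this paper's specific model) can be sketched as:

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN propagation step with symmetric normalization:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    a_hat = adj + np.eye(adj.shape[0])     # add self-loops
    deg = a_hat.sum(axis=1)                # node degrees of A + I
    d_inv_sqrt = np.diag(deg ** -0.5)
    norm = d_inv_sqrt @ a_hat @ d_inv_sqrt # normalized adjacency
    return np.maximum(norm @ feats @ weight, 0.0)  # ReLU
```

Note that the symmetric normalization implicitly assumes an undirected (symmetric) adjacency matrix, which is part of why directed graphs need special treatment.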
1 code implementation • 11 Aug 2020 • Xinke Li, Chongshou Li, Zekun Tong, Andrew Lim, Junsong Yuan, Yuwei Wu, Jing Tang, Raymond Huang
Building on this, we formulate a hierarchical learning problem for 3D point cloud segmentation and propose a measure that evaluates consistency across hierarchies.
1 code implementation • 29 Apr 2020 • Zekun Tong, Yuxuan Liang, Changsheng Sun, David S. Rosenblum, Andrew Lim
Graph Convolutional Networks (GCNs) have been widely used due to their outstanding performance in processing graph-structured data.
1 code implementation • 5 Feb 2020 • Kun Ouyang, Yuxuan Liang, Ye Liu, Zekun Tong, Sijie Ruan, Yu Zheng, David S. Rosenblum
To tackle these issues, we develop a model named UrbanFM, which consists of two major parts: 1) an inference network that generates fine-grained flow distributions from coarse-grained inputs using a feature extraction module and a novel distributional upsampling module; 2) a general fusion subnet that further boosts performance by considering the influence of different external factors.
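The core idea behind distributional upsampling can be sketched as follows (a simplified illustration, not UrbanFM's implementation): each coarse cell's flow is split over its sub-cells using weights that sum to one, so the fine-grained map stays consistent with the coarse-grained input:

```python
import numpy as np

def distributional_upsample(coarse, logits, scale=2):
    """Distribute each coarse cell's flow over its scale x scale
    sub-cells via a per-block softmax over predicted logits, so the
    sub-cells of every coarse cell sum back to that cell's value.
    Simplified sketch of the idea, not the paper's network."""
    h, w = coarse.shape
    fine = np.zeros((h * scale, w * scale))
    for i in range(h):
        for j in range(w):
            block = logits[i*scale:(i+1)*scale, j*scale:(j+1)*scale]
            weights = np.exp(block) / np.exp(block).sum()  # sums to 1
            fine[i*scale:(i+1)*scale, j*scale:(j+1)*scale] = (
                coarse[i, j] * weights)
    return fine
```

The structural constraint (sub-cell flows summing to the coarse flow) is what distinguishes this from ordinary image super-resolution upsampling.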