1 code implementation • 14 Mar 2024 • Kaichao You, Runsheng Bai, Meng Cao, Jianmin Wang, Ion Stoica, Mingsheng Long
PyTorch 2.x introduces a compiler designed to accelerate deep learning programs.
1 code implementation • 19 May 2023 • Kaichao You, Guo Qin, Anchang Bao, Meng Cao, Ping Huang, Jiulong Shan, Mingsheng Long
Subsequently, we propose a novel Tune mode to bridge the gap between Eval mode and Deploy mode.
no code implementations • 19 Aug 2022 • Song Wu, Kaichao You, Weihua He, Chen Yang, Yang Tian, Yaoyuan Wang, Ziyang Zhang, Jianxing Liao
In this paper, we propose an end-to-end training method A^2OF for video frame interpolation with event-driven Anisotropic Adjustment of Optical Flows.
no code implementations • 19 Apr 2022 • Xiaofei Ge, Kaichao You, Zeren Tan, Hedong Hou, Yang Tian, Pei Sun
We anticipate that our approach provides a general formalism for portraying RNA virus evolution and helps identify potential virus lineages of concern.
no code implementations • CVPR 2022 • Weihua He, Kaichao You, Zhendong Qiao, Xu Jia, Ziyang Zhang, Wenhui Wang, Huchuan Lu, Yaoyuan Wang, Jianxing Liao
Since the event camera is a novel sensor, its potential has not been fully realized due to the lack of processing algorithms.
1 code implementation • 14 Mar 2022 • Zhangjie Cao, Kaichao You, Ziyang Zhang, Jianmin Wang, Mingsheng Long
Still, the common requirement that domains share an identical class space hinders the application of domain adaptation to partial-set scenarios.
1 code implementation • 20 Oct 2021 • Kaichao You, Yong Liu, Ziyang Zhang, Jianmin Wang, Michael I. Jordan, Mingsheng Long
(2) The best-ranked PTM can be fine-tuned and deployed if we have no preference for the model's architecture; otherwise, the target PTM can be tuned by the top-$K$ ranked PTMs via a Bayesian procedure that we propose.
1 code implementation • 29 Jul 2021 • Jiayi Weng, Huayu Chen, Dong Yan, Kaichao You, Alexis Duburcq, Minghao Zhang, Yi Su, Hang Su, Jun Zhu
In this paper, we present Tianshou, a highly modularized Python library for deep reinforcement learning (DRL) that uses PyTorch as its backend.
1 code implementation • 22 Feb 2021 • Kaichao You, Yong Liu, Jianmin Wang, Mingsheng Long
In pursuit of a practical assessment method, we propose to estimate the maximum value of label evidence given features extracted by pre-trained models.
Ranked #3 on Transferability on classification benchmark
2 code implementations • NeurIPS 2020 • Zhi Kou, Kaichao You, Mingsheng Long, Jianmin Wang
During training, two branches are stochastically selected to avoid over-depending on some sample statistics, resulting in a strong regularization effect, which we interpret as "architecture regularization".
2 code implementations • NeurIPS 2020 • Kaichao You, Zhi Kou, Mingsheng Long, Jianmin Wang
Fine-tuning pre-trained deep neural networks (DNNs) to a target dataset, also known as transfer learning, is widely used in computer vision and NLP.
Ranked #1 on Transfer Learning on COCO70
no code implementations • ICLR 2020 • Kaichao You, Mingsheng Long, Jianmin Wang, Michael I. Jordan
Despite the popularity of these common beliefs, experiments suggest that they are insufficient to explain the general effectiveness of lrDecay in training modern neural networks that are deep, wide, and nonconvex.
2 code implementations • International Conference on Machine Learning 2019 • Kaichao You, Ximei Wang, Mingsheng Long, Michael Jordan
Deep unsupervised domain adaptation (Deep UDA) methods successfully leverage rich labeled data in a source domain to boost the performance on related but unlabeled data in a target domain.
no code implementations • CVPR 2019 • Kaichao You, Mingsheng Long, Zhangjie Cao, Jianmin Wang, Michael I. Jordan
This paper introduces Universal Domain Adaptation (UDA), which requires no prior knowledge of the label sets.
Ranked #10 on Universal Domain Adaptation on DomainNet
1 code implementation • CVPR 2019 • Zhangjie Cao, Kaichao You, Mingsheng Long, Jianmin Wang, Qiang Yang
Under the condition that target labels are unknown, the key challenge of PDA is how to transfer relevant examples in the shared classes to promote positive transfer, and ignore irrelevant ones in the specific classes to mitigate negative transfer.
Ranked #4 on Partial Domain Adaptation on ImageNet-Caltech