no code implementations • ICML 2020 • Reza Oftadeh, Jiayi Shen, Zhangyang Wang, Dylan Shell
For this new loss, we characterize the full structure of the loss landscape in the following sense: we establish an analytical expression for the set of all critical points, show that it is a subset of the critical points of the MSE loss, and show that all local minima remain global.
1 code implementation • 9 Apr 2024 • Jiayi Shen, Cheems Wang, Zehao Xiao, Nanne van Noord, Marcel Worring
This paper proposes GO4Align, a multi-task optimization approach that tackles task imbalance by explicitly aligning the optimization across tasks.
no code implementations • 15 Feb 2024 • Zehao Xiao, Jiayi Shen, Mohammad Mahdi Derakhshani, Shengcai Liao, Cees G. M. Snoek
To effectively encode the distribution information and their relationships, we further introduce a transformer inference network with a pseudo-shift training mechanism.
1 code implementation • NeurIPS 2023 • Jiayi Shen, XianTong Zhen, Qi Wang, Marcel Worring
This paper focuses on the data-insufficiency problem in multi-task learning within an episodic training setup.
no code implementations • 22 Sep 2023 • Shuai Wang, Jiayi Shen, Athanasios Efthymiou, Stevan Rudinac, Monika Kackovic, Nachoem Wijnberg, Marcel Worring
The variety and complexity of relations in multimedia data lead to Heterogeneous Information Networks (HINs).
no code implementations • 8 Jul 2023 • Sameer Ambekar, Zehao Xiao, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek
We formulate generalization at test time as a variational inference problem by modeling pseudo labels as distributions, which accounts for the uncertainty during generalization and alleviates the misleading signal of inaccurate pseudo labels.
no code implementations • 8 Jun 2023 • Yingjun Du, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek
By learning to retain and recall the learning process of past training tasks, EMO nudges parameter updates in the right direction, even when the gradients provided by a limited number of examples are uninformative.
no code implementations • CVPR 2023 • Yingjun Du, Jiayi Shen, XianTong Zhen, Cees G. M. Snoek
Modern image classifiers perform well on populated classes, while degrading considerably on tail classes with only a few instances.
1 code implementation • 10 Oct 2022 • Jiayi Shen, Zehao Xiao, XianTong Zhen, Cees G. M. Snoek, Marcel Worring
To generalize to such test data, it is crucial for individual tasks to leverage knowledge from related tasks.
1 code implementation • 5 Jun 2022 • Zhenyu Hu, Zhenyu Wu, Pengcheng Pi, Yunhe Xue, Jiayi Shen, Jianchao Tan, Xiangru Lian, Zhangyang Wang, Ji Liu
Unmanned Aerial Vehicle (UAV)-based video text spotting has been extensively used in civil and military domains.
1 code implementation • CVPR 2022 • Haochen Wang, Jiayi Shen, Yongtuo Liu, Yan Gao, Efstratios Gavves
To tackle this issue, we propose a Neighbor Transformer Network, or NFormer, which explicitly models interactions across all input images, thus suppressing outlier features and leading to more robust representations overall.
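The core idea of attending across an entire set of input images can be sketched as plain self-attention over per-image feature vectors (a hypothetical numpy illustration; NFormer's landmark agent attention and reciprocal neighbor softmax are omitted, and the weight matrices here are random stand-ins):

```python
import numpy as np

def cross_sample_attention(F, d_k=16, seed=0):
    """Refine each image's feature vector by attending over ALL
    images in the input set, so information is shared across
    samples rather than within one image's spatial locations.

    F: (n, d) array, one feature vector per image.
    Returns (refined_features, attention_map)."""
    rng = np.random.default_rng(seed)
    n, d = F.shape
    # Random projection matrices stand in for learned Q/K/V weights.
    Wq, Wk, Wv = (rng.normal(size=(d, d_k)) / np.sqrt(d) for _ in range(3))
    Q, K, V = F @ Wq, F @ Wk, F @ Wv
    S = Q @ K.T / np.sqrt(d_k)                    # (n, n) image-to-image scores
    A = np.exp(S - S.max(axis=1, keepdims=True))  # numerically stable softmax
    A /= A.sum(axis=1, keepdims=True)             # each row sums to 1
    return A @ V, A                               # aggregate over other images

feats = np.random.default_rng(1).normal(size=(8, 32))  # 8 images, 32-dim features
refined, attn = cross_sample_attention(feats)
```

Each refined representation is a convex combination of value vectors from all images in the set; NFormer additionally sparsifies the attention map so that outlier images contribute little.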
1 code implementation • ICLR 2022 • Shixing Yu, Tianlong Chen, Jiayi Shen, Huan Yuan, Jianchao Tan, Sen Yang, Ji Liu, Zhangyang Wang
Vision transformers (ViTs) have gained popularity recently.
no code implementations • 10 Nov 2021 • Jiayi Shen, XianTong Zhen, Marcel Worring, Ling Shao
Our multi-task neural processes methodologically expand the scope of vanilla neural processes and provide a new way of exploring task relatedness in function spaces for multi-task learning.
1 code implementation • NeurIPS 2021 • Jiayi Shen, XianTong Zhen, Marcel Worring, Ling Shao
Multi-task learning aims to explore task relatedness to improve individual tasks, which is of particular significance in the challenging scenario where only limited data is available for each task.
1 code implementation • 9 May 2021 • Zehao Xiao, Jiayi Shen, XianTong Zhen, Ling Shao, Cees G. M. Snoek
Domain generalization is challenging due to the domain shift and the uncertainty caused by the inaccessibility of target domain data.
no code implementations • 1 Jan 2021 • Jiayi Shen, XianTong Zhen, Marcel Worring, Ling Shao
Multi-task learning aims to improve the overall performance of a set of tasks by leveraging their relatedness.
no code implementations • 1 Jan 2021 • Zehao Xiao, Jiayi Shen, XianTong Zhen, Ling Shao, Cees G. M. Snoek
In the probabilistic modeling framework, we introduce a domain-invariant principle to explore invariance across domains in a unified way.
no code implementations • ICLR 2021 • Jiayi Shen, Xiaohan Chen, Howard Heaton, Tianlong Chen, Jialin Liu, Wotao Yin, Zhangyang Wang
We first present Twin L2O, the first dedicated minimax L2O framework consisting of two LSTMs for updating min and max variables, respectively.
no code implementations • ICLR 2021 • Jiayi Shen, Haotao Wang, Shupeng Gui, Jianchao Tan, Zhangyang Wang, Ji Liu
Recommendation systems (RS) play an important role in content recommendation and retrieval scenarios.
no code implementations • 25 Sep 2019 • Reza Oftadeh, Jiayi Shen, Zhangyang Wang, Dylan Shell
In this paper, we propose a new loss function for performing principal component analysis (PCA) using linear autoencoders (LAEs).