no code implementations • 26 Feb 2024 • Peiyan Zhang, Chaozhuo Li, Liying Kang, Feiran Huang, Senzhang Wang, Xing Xie, Sunghun Kim
Moreover, we show that the existing contrastive objective learns the low-frequency component of the augmentation graph, and we propose a high-frequency component (HFC)-aware contrastive learning objective that makes the learned embeddings more distinctive.
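The low- versus high-frequency distinction above comes from graph spectral theory: the normalized Laplacian acts as a high-pass filter, and a Rayleigh-quotient-style trace measures how much high-frequency energy a set of embeddings retains. A minimal NumPy sketch of that measurement (the function names and the scoring form are illustrative, not the paper's actual objective):

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def hfc_score(A, Z):
    # Rayleigh-quotient-style ratio tr(Z^T L Z) / tr(Z^T Z):
    # close to 0 when Z is smooth (low-frequency) over the graph,
    # larger when Z varies sharply across edges (high-frequency).
    L = normalized_laplacian(A)
    return np.trace(Z.T @ L @ Z) / (np.trace(Z.T @ Z) + 1e-12)
```

On a triangle graph, a constant (maximally smooth) embedding scores 0, while the identity embedding retains the full Laplacian energy. An HFC-aware objective would add a term rewarding the high-frequency side of this trade-off.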
no code implementations • 21 Feb 2024 • Yuchen Yan, Peiyan Zhang, Zheng Fang, Qingqing Long
Based on the insight of graph pre-training, we propose to bridge the graph signal gap and the graph structure gap with learnable prompts in the spectral space.
no code implementations • 27 Nov 2023 • Tong Zhang, Haoyang Liu, Peiyan Zhang, Yuxuan Cheng, Haohan Wang
Our method focuses on producing SVGs that are both accurate and simple, aligning with human readability and understanding.
no code implementations • 7 Nov 2023 • Peilin Zhou, Meng Cao, You-Liang Huang, Qichen Ye, Peiyan Zhang, Junling Liu, Yueqi Xie, Yining Hua, Jaeboum Kim
Large Multimodal Models (LMMs) have demonstrated impressive performance across various vision and language tasks, yet their potential applications in recommendation tasks with visual assistance remain unexplored.
no code implementations • 28 Aug 2023 • Peiyan Zhang, Yuchen Yan, Xi Zhang, Chaozhuo Li, Senzhang Wang, Feiran Huang, Sunghun Kim
Graph Neural Networks (GNNs) have emerged as promising solutions for collaborative filtering (CF) through the modeling of user-item interaction graphs.
no code implementations • 21 Aug 2023 • Peiyan Zhang, Haoyang Liu, Chaozhuo Li, Xing Xie, Sunghun Kim, Haohan Wang
Machine learning has demonstrated remarkable performance over finite datasets, yet whether scores on fixed benchmarks can sufficiently indicate a model's performance in the real world remains under discussion.
1 code implementation • 23 May 2023 • Peiyan Zhang, Yuchen Yan, Chaozhuo Li, Senzhang Wang, Xing Xie, Guojie Song, Sunghun Kim
Dynamic graph learning methods commonly suffer from the catastrophic forgetting problem, where knowledge learned for previous graphs is overwritten by updates for new graphs.
no code implementations • 6 Mar 2023 • Peiyan Zhang, Sunghun Kim
In this article, we offer a systematic survey of incremental update for neural recommender systems.
1 code implementation • 26 Jun 2022 • Peiyan Zhang, Jiayan Guo, Chaozhuo Li, Yueqi Xie, Jaeboum Kim, Yan Zhang, Xing Xie, Haohan Wang, Sunghun Kim
Based on this observation, we propose to remove the GNN propagation part and let the readout module take on more responsibility in the model's reasoning process.
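A propagation-free design of this kind leaves the readout as the only place where the session is aggregated; a common choice is attention over the raw item embeddings, with the last clicked item as the query. A minimal NumPy sketch under that assumption (the specific readout here is illustrative, not necessarily the paper's):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_readout(item_embs, session):
    """Score all items for a session without any GNN propagation.

    item_embs: (num_items, d) embedding table
    session:   list of item ids in click order
    """
    H = item_embs[session]   # session item embeddings, no message passing
    q = H[-1]                # last click acts as the attention query
    w = softmax(H @ q)       # attention weights over session items
    s = w @ H                # session representation from readout alone
    return item_embs @ s     # recommendation scores for every item
```

The readout carries the full reasoning burden: all interaction between session items happens through the attention weights rather than through graph convolutions.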
1 code implementation • 26 Jun 2022 • Jiayan Guo, Peiyan Zhang, Chaozhuo Li, Xing Xie, Yan Zhang, Sunghun Kim
Session-based recommendation (SBR) aims to predict the user's next action based on the ongoing session.
1 code implementation • 18 May 2022 • Juyong Jiang, Peiyan Zhang, Yingtao Luo, Chaozhuo Li, Jae Boum Kim, Kai Zhang, Senzhang Wang, Xing Xie, Sunghun Kim
Sequential recommendation (SR) aims to model users' dynamic preferences from a series of interactions.
1 code implementation • 13 Dec 2021 • Juyong Jiang, Peiyan Zhang, Yingtao Luo, Chaozhuo Li, Jaeboum Kim, Kai Zhang, Senzhang Wang, Sunghun Kim
Our approach leverages bidirectional temporal augmentation and knowledge-enhanced fine-tuning to synthesize authentic pseudo-prior items that \emph{retain user preferences and capture deeper item semantic correlations}, thus boosting the model's expressive power.
no code implementations • 20 Oct 2020 • Haohan Wang, Peiyan Zhang, Eric P. Xing
Neural machine translation has achieved remarkable empirical performance over standard benchmark datasets, yet recent evidence suggests that the models can still fail easily when dealing with substandard inputs such as misspelled words. To overcome this issue, we introduce a new encoding heuristic for the input symbols of character-level NLP models: it encodes the shape of each character through images depicting the letters when printed.
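The intuition behind shape-based encoding is that visually similar characters (the ones most often confused in misspellings) should receive similar vectors. A toy sketch with hypothetical 3x3 glyph bitmaps, standing in for real rasterized fonts:

```python
import numpy as np

# Toy 3x3 glyph bitmaps (hypothetical; a real system would
# rasterize each character from an actual printed font).
GLYPHS = {
    "l": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float),
    "I": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float),  # same shape as "l"
    "o": np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], float),
}

def encode(word):
    # Represent a word as the concatenated pixel vectors of its glyphs,
    # so visual similarity translates into vector similarity.
    return np.concatenate([GLYPHS[c].ravel() for c in word])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

With these bitmaps, the visually confusable pair "lo"/"Io" encodes (near-)identically, while "lo"/"ol" does not, which is exactly the robustness property the heuristic targets.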