Search Results for author: Xingtong Yu

Found 11 papers, 6 papers with code

Text-Free Multi-domain Graph Pre-training: Toward Graph Foundation Models

no code implementations · 22 May 2024 · Xingtong Yu, Chang Zhou, Yuan Fang, Xinming Zhang

To address these issues, we propose MDGPT, a text-free Multi-Domain Graph Pre-Training and adaptation framework designed to exploit multi-domain knowledge for graph learning.

Graph Learning

DyGPrompt: Learning Feature and Time Prompts on Dynamic Graphs

no code implementations · 22 May 2024 · Xingtong Yu, Zhenghao Liu, Yuan Fang, Xinming Zhang

For dynamic graph modeling, dynamic graph neural networks (DGNNs) have emerged as a mainstream technique, which are generally pre-trained on the link prediction task, leaving a significant gap from the objectives of downstream tasks such as node classification.

Link Prediction · Node Classification

HGPROMPT: Bridging Homogeneous and Heterogeneous Graphs for Few-shot Prompt Learning

no code implementations · 4 Dec 2023 · Xingtong Yu, Yuan Fang, Zemin Liu, Xinming Zhang

In this paper, we propose HGPROMPT, a novel pre-training and prompting framework to unify not only pre-training and downstream tasks but also homogeneous and heterogeneous graphs via a dual-template design.

Graph Representation Learning

MultiGPrompt for Multi-Task Pre-Training and Prompting on Graphs

1 code implementation · 28 Nov 2023 · Xingtong Yu, Chang Zhou, Yuan Fang, Xinming Zhang

Hence, in this paper, we propose MultiGPrompt, a novel multi-task pre-training and prompting framework to exploit multiple pretext tasks for more comprehensive pre-trained knowledge.

General Knowledge · Graph Representation Learning

Pixel Adapter: A Graph-Based Post-Processing Approach for Scene Text Image Super-Resolution

1 code implementation · 16 Sep 2023 · Wenyu Zhang, Xin Deng, Baojun Jia, Xingtong Yu, Yifan Chen, Jin Ma, Qing Ding, Xinming Zhang

Additionally, we introduce the MLP-based Sequential Residual Block (MSRB) for robust feature extraction from text images, and a Local Contour Awareness loss ($\mathcal{L}_{lca}$) to enhance the model's perception of details.

Graph Attention · Image Super-Resolution

GraphPrompt: Unifying Pre-Training and Downstream Tasks for Graph Neural Networks

2 code implementations · 16 Feb 2023 · Zemin Liu, Xingtong Yu, Yuan Fang, Xinming Zhang

In particular, prompting is a popular alternative to fine-tuning in natural language processing, which is designed to narrow the gap between pre-training and downstream objectives in a task-specific manner.

Graph Representation Learning

Learning to Count Isomorphisms with Graph Neural Networks

1 code implementation · 7 Feb 2023 · Xingtong Yu, Zemin Liu, Yuan Fang, Xinming Zhang

However, typical GNNs employ a node-centric message passing scheme that receives and aggregates messages on nodes, which is inadequate for the complex structure matching required by isomorphism counting.

Navigate
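The node-centric scheme that this paper contrasts itself with can be sketched in a few lines: each node sum-aggregates messages from its in-neighbors, then updates its own embedding. Everything below (function name, sum aggregation, ReLU update) is an illustrative assumption about the standard scheme, not code from the paper.

```python
def node_centric_layer(features, edges):
    """One round of node-centric message passing.

    features: dict mapping node id -> feature vector (list of floats)
    edges:    list of (src, dst) directed edges

    Messages are received and aggregated *on nodes* (sum aggregation),
    followed by a simple ReLU update. Minimal sketch, not the paper's code.
    """
    dim = len(next(iter(features.values())))
    agg = {node: [0.0] * dim for node in features}
    for src, dst in edges:                      # message flows src -> dst
        for i, msg in enumerate(features[src]):
            agg[dst][i] += msg                  # aggregate on the target node
    return {node: [max(0.0, h) for h in vec]    # nonlinear update
            for node, vec in agg.items()}
```

Because all messages collapse onto node embeddings, edge-level structural detail is lost, which is the limitation the edge-centric Count-GNN design targets.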

Count-GNN: Graph Neural Networks for Subgraph Isomorphism Counting

no code implementations29 Sep 2021 Xingtong Yu, Zemin Liu, Yuan Fang, Xinming Zhang

At the graph level, we modulate the graph representation conditioned on the query subgraph, so that the model can be adapted to each unique query for better matching with the input graph.

Navigate
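One common way to realize such query-conditioned modulation is a FiLM-style scale-and-shift, where the query subgraph's embedding produces a per-dimension gain and bias applied to the input graph's representation. The sketch below illustrates that general technique under assumed names and a linear parameterization; it is not the paper's actual formulation.

```python
def film_modulate(graph_repr, query_repr, w_gamma, w_beta):
    """FiLM-style conditioning (illustrative assumption, not the paper's code).

    graph_repr: input graph representation (list of floats)
    query_repr: query subgraph embedding (list of floats)
    w_gamma, w_beta: weight matrices mapping the query embedding to a
                     per-dimension scale (gamma) and shift (beta)
    """
    def linear(w, x):  # plain matrix-vector product
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

    gamma = linear(w_gamma, query_repr)   # query-dependent scale
    beta = linear(w_beta, query_repr)     # query-dependent shift
    return [g * h + b for g, h, b in zip(gamma, graph_repr, beta)]
```

Conditioning the graph representation on the query this way lets a single model adapt its matching behavior to each distinct query subgraph.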
