SIMGA: A Simple and Effective Heterophilous Graph Neural Network with Efficient Global Aggregation

17 May 2023  ·  Haoyu Liu, Ningyi Liao, Siqiang Luo

Graph neural networks (GNNs) have achieved great success in graph learning but suffer performance loss under heterophily, i.e., when neighboring nodes are dissimilar, because their aggregation is local and uniform. Existing attempts to incorporate global aggregation into heterophilous GNNs usually require iteratively maintaining and updating full-graph information, which incurs $\mathcal{O}(n^2)$ computation cost for a graph with $n$ nodes and thus scales poorly to large graphs. In this paper, we propose SIMGA, a GNN structure that integrates SimRank structural similarity as its global aggregation. The design of SIMGA is simple, yet it leads to promising results in both efficiency and effectiveness. Its simplicity makes SIMGA the first heterophilous GNN model whose propagation complexity is near-linear in $n$. We theoretically demonstrate its effectiveness by treating SimRank as a new interpretation of GNN aggregation and prove that the aggregated node representation matrix exhibits the expected grouping effect. SIMGA is evaluated against 11 baselines on 12 benchmark datasets and typically achieves superior accuracy compared with state-of-the-art models. An efficiency study shows that SIMGA is up to 5$\times$ faster than the state-of-the-art method on the largest heterophilous dataset, pokec, which has over 30 million edges.
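To make the global-aggregation idea concrete, below is a minimal sketch (not the paper's SIMGA implementation and not its near-linear approximation): pairwise SimRank scores are computed, row-normalized, and used as weights to mix features from all nodes, so structurally similar but distant nodes can contribute to each other's representation. The function name, the use of NetworkX's `simrank_similarity`, and the mixing coefficient `alpha` in the usage note are illustrative assumptions.

```python
# Sketch of SimRank-weighted global aggregation on a small graph.
# Dense O(n^2) computation for clarity only; SIMGA itself avoids this cost.
import numpy as np
import networkx as nx


def simrank_global_aggregate(G: nx.Graph, X: np.ndarray, c: float = 0.6) -> np.ndarray:
    """Return globally aggregated features of shape (n, d): each node's new
    representation is a SimRank-weighted average of all node features."""
    nodes = list(G.nodes())
    sim = nx.simrank_similarity(G, importance_factor=c)          # dict of dicts
    S = np.array([[sim[u][v] for v in nodes] for u in nodes])    # (n, n) similarity matrix
    S = S / S.sum(axis=1, keepdims=True)                         # row-normalize into weights
    return S @ X                                                 # global aggregation
```

In a full model, such a global term would typically be combined with an ordinary local neighborhood aggregation, e.g. `H = alpha * (A_hat @ X) + (1 - alpha) * simrank_global_aggregate(G, X)`, so the representation captures both nearby and structurally similar distant nodes.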
