Convolutional Networks on Enhanced Message-Passing Graph Improve Semi-Supervised Classification with Few Labels

29 Sep 2021  ·  Yu Song, Shan Lu, Dehong Qiu ·

Efficient message propagation is critical to node classification in sparse graphs with few labels, a problem that has remained largely unaddressed until now. Recently popularized Graph Convolutional Networks (GCNs) cannot propagate messages to distant nodes because of over-smoothing. Moreover, GCNs with numerous parameters suffer from overfitting when labeled nodes are scarce. We attack this problem by building GCNs on an Enhanced Message-Passing Graph (EMPG). The key idea is that node classification can benefit from variants of the original graph that are more efficient for message propagation, under the assumption that each variant is a potential structure in which more nodes are properly labeled. Specifically, we first map nodes into a latent space through a graph embedding that captures structural information. Taking node attributes into account as well, we construct the EMPG by adding connections between nodes that lie close to each other in the latent space. With the help of the added connections, the EMPG allows a node to propagate messages to the right distant nodes, so GCNs on the EMPG need not stack multiple layers and therefore avoid over-smoothing. However, adding connections may saturate message propagation or lead to overfitting. Viewing the EMPG as an accumulation of the potential variants of the original graph, we apply dropout to the EMPG and train GCNs on the resulting dropout graphs. The features learned from the different dropout EMPGs are aggregated to compute the final prediction. Experiments demonstrate a significant improvement in node classification on sparse graphs with few labels.
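The two graph-level operations described above — adding connections between nodes that are close in a latent embedding space, then applying dropout to the augmented graph to obtain multiple training views — can be sketched as follows. This is a minimal illustration with NumPy on adjacency matrices, not the authors' implementation: the embedding is assumed to be precomputed, the neighbor count `k` and dropout rate are hypothetical parameters, and the GCN training step is omitted.

```python
import numpy as np

def build_empg(adj, embeddings, k=2):
    """Construct an Enhanced Message-Passing Graph (sketch): keep the original
    edges and add an edge between each node and its k nearest neighbors in the
    latent embedding space."""
    empg = adj.copy()
    # pairwise Euclidean distances between node embeddings
    dist = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)  # exclude self-loops
    for i in range(adj.shape[0]):
        for j in np.argsort(dist[i])[:k]:
            empg[i, j] = empg[j, i] = 1  # connect nodes close in latent space
    return empg

def dropout_graph(empg, rate, rng):
    """Randomly drop edges of the EMPG (symmetric edge dropout), yielding one
    'dropout graph' on which a GCN would be trained."""
    keep = rng.random(empg.shape) >= rate
    keep = np.triu(keep, 1)
    keep = keep | keep.T  # keep edges symmetrically
    return empg * keep

rng = np.random.default_rng(0)
adj = np.eye(5, k=1, dtype=int) + np.eye(5, k=-1, dtype=int)  # toy 5-node path graph
emb = rng.standard_normal((5, 3))  # stand-in for a learned graph embedding
empg = build_empg(adj, emb, k=2)
# several dropout views of the EMPG; in the paper, a GCN is trained on each
# and the learned features are aggregated for the final prediction
views = [dropout_graph(empg, rate=0.2, rng=rng) for _ in range(4)]
```

Note that dropout here acts on the whole EMPG, so each view is one "potential variant" of the original graph; aggregating predictions over the views plays the ensembling role the abstract describes.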
