Tackling Over-Smoothing in Graph Convolutional Networks With EM-Based Joint Topology Optimization and Node Classification

Over-smoothing has emerged as a severe obstacle to node classification with message-passing-based graph convolutional networks (GCNs). Classification performance deteriorates dramatically for deep GCNs, because message passing over the observed noisy graph topology cannot adequately propagate intra-class information and over-mixes the features of nodes from different communities (classes). Existing topology optimization methods for GCNs cannot sufficiently exploit the underlying ground-truth community structure to distinguish nodes from different communities. In this paper, we propose a novel method, termed EM-GCN, which addresses this issue by employing the Expectation-Maximization (EM) algorithm to simultaneously achieve community-enhanced topology optimization and learn desirable node representations for classification. EM-GCN represents the underlying community structure with a latent adjacency matrix parameterized by an assortative-constrained stochastic block model and, consequently, explicitly strengthens intra-class connections and suppresses inter-class interactions in the observed noisy graph. In the inference procedure (E-step), a graph inference model shared across all node pairs is learned from node embeddings to approximate the posterior distribution of the latent adjacency matrix and optimize the graph topology. In the learning procedure (M-step), node representations are learned with GCNs on the refined graph topology for the downstream classification task. EM-GCN is a general and flexible method that leverages the approximate posterior and arbitrary GCNs to overcome over-smoothing via topology optimization. Experimental results on synthetic and real-world datasets demonstrate that EM-GCN outperforms existing strategies for tackling over-smoothing and optimizing graph topology in node classification.
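To make the E-step/M-step alternation concrete, below is a minimal PyTorch sketch of an EM-style loop between topology inference and GCN training. All names (SimpleGCNLayer, EdgeInference, em_train), the edge-reconstruction objective, and the 50/50 blend of observed and inferred adjacencies are illustrative assumptions, not the authors' implementation; in particular, the paper's E-step approximates the posterior of the latent adjacency under an assortative-constrained stochastic block model rather than fitting the simple reconstruction loss used here.

```python
# Hedged sketch: EM-style alternation between topology inference (E-step) and
# GCN training (M-step). Names and hyperparameters are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCNLayer(nn.Module):
    """One GCN propagation step over a (possibly soft) dense adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Add self-loops and symmetrically normalize the adjacency.
        adj = adj + torch.eye(adj.size(0), device=adj.device)
        d_inv_sqrt = adj.sum(dim=1).clamp(min=1e-6).pow(-0.5)
        adj_norm = d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)
        return self.lin(adj_norm @ x)


class EdgeInference(nn.Module):
    """Pairwise model shared across all node pairs: embeddings -> edge probabilities."""
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, z):
        n = z.size(0)
        pairs = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                           z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        return torch.sigmoid(self.mlp(pairs)).squeeze(-1)  # soft adjacency in [0, 1]


def em_train(x, adj_obs, labels, train_mask,
             hidden=16, n_classes=7, em_rounds=5, e_epochs=50, m_epochs=100):
    gcn1 = SimpleGCNLayer(x.size(1), hidden)
    gcn2 = SimpleGCNLayer(hidden, n_classes)
    infer = EdgeInference(hidden)
    opt_m = torch.optim.Adam(list(gcn1.parameters()) + list(gcn2.parameters()), lr=0.01)
    opt_e = torch.optim.Adam(infer.parameters(), lr=0.01)
    adj = adj_obs.float()
    for _ in range(em_rounds):
        # E-step (stand-in): fix the GCN, fit the shared pairwise model so its soft
        # adjacency explains the observed edges, then refine the working topology.
        with torch.no_grad():
            z = F.relu(gcn1(x, adj))
        for _ in range(e_epochs):
            opt_e.zero_grad()
            loss_e = F.binary_cross_entropy(infer(z), adj_obs.float())
            loss_e.backward()
            opt_e.step()
        with torch.no_grad():
            # Heuristic blend of observed and inferred edges (illustrative choice).
            adj = 0.5 * adj_obs.float() + 0.5 * infer(z)
        # M-step: train the GCN on the refined topology for node classification.
        for _ in range(m_epochs):
            opt_m.zero_grad()
            logits = gcn2(F.relu(gcn1(x, adj)), adj)
            loss_m = F.cross_entropy(logits[train_mask], labels[train_mask])
            loss_m.backward()
            opt_m.step()
    return gcn2(F.relu(gcn1(x, adj)), adj).argmax(dim=1)  # predicted classes
```

On a citation benchmark such as Cora, x would be the node-feature matrix, adj_obs the binary observed adjacency, and train_mask the labeled-node indices; the sketch only illustrates the alternating structure, not the paper's variational objective.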

Keywords: topology; graph convolutional; topology optimization; classification; node classification

Journal Title: IEEE Transactions on Signal and Information Processing over Networks
Year Published: 2023
