
Optimizing Graph Neural Network With Multiaspect Hilbert-Schmidt Independence Criterion.


The graph neural network (GNN) has demonstrated superior performance in various data mining tasks and has been widely applied across diverse fields. The core of a GNN lies in its aggregation and combination functions, and mainstream GNN research focuses on enhancing these functions. However, GNNs face a common challenge: useless features contained in neighbor nodes may be integrated into the target node during the aggregation process. This leads to poor node embeddings and undermines downstream tasks. To tackle this problem, this article proposes a novel GNN optimization framework, GNN-MHSIC, which introduces the nonparametric dependence measure known as the Hilbert-Schmidt independence criterion (HSIC) under the guidance of the information bottleneck principle. HSIC is used to guide information propagation among the layers of a GNN from multiaspect views. GNN-MHSIC pursues three main objectives: 1) minimizing the HSIC between the input features and the propagation layers; 2) maximizing the HSIC between the propagation layers and the ground truth; and 3) minimizing the HSIC between the propagation layers themselves. With this multiaspect design, GNN-MHSIC minimizes the propagation of redundant information while preserving information relevant to the target node. We prove finite upper and lower bounds for GNN-MHSIC theoretically and evaluate it experimentally with four classic GNN models, including the graph convolutional network, the graph attention network (GAT), the heterogeneous GAT, and the heterogeneous graph (HG) propagation network, on three widely used HGs. The results illustrate the usefulness and performance of GNN-MHSIC.
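The three HSIC objectives described above can be sketched numerically with the standard biased empirical HSIC estimator, HSIC(X, Y) ≈ tr(K H L H)/(n−1)², where K and L are kernel matrices over the two variables and H is the centering matrix. The sketch below is a minimal illustration under assumed choices, not the paper's implementation: the function names, the RBF kernel, and the beta/gamma weights in the combined penalty are all assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2,
    # with H = I - (1/n) * ones the centering matrix.
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def mhsic_penalty(x_in, layers, y_onehot, beta=1.0, gamma=0.1):
    # Illustrative multiaspect objective (beta/gamma weights are assumptions):
    #   minimize HSIC(input features, each propagation layer),
    #   maximize HSIC(each propagation layer, labels),
    #   minimize HSIC between pairs of propagation layers.
    loss = 0.0
    for z in layers:
        loss += hsic(x_in, z) - beta * hsic(z, y_onehot)
    for i in range(len(layers)):
        for j in range(i + 1, len(layers)):
            loss += gamma * hsic(layers[i], layers[j])
    return loss
```

A smaller `mhsic_penalty` value corresponds to layer representations that carry less redundant input information and less cross-layer redundancy while staying dependent on the labels; in a real GNN this quantity would be computed on differentiable layer outputs and minimized alongside the task loss.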

Keywords: neural network; network; gnn; graph neural; gnn mhsic; propagation

Journal Title: IEEE transactions on neural networks and learning systems
Year Published: 2022
