
hpGAT: High-Order Proximity Informed Graph Attention Network

Graph neural networks (GNNs) have recently made remarkable breakthroughs in learning from graph-structured data. However, most existing GNNs limit the receptive field of each node at every layer to its directly connected (one-hop) neighbors, disregarding the fact that a large receptive field has proven to be a critical factor in state-of-the-art neural networks. In this paper, we propose a novel approach that defines a variable receptive field for GNNs by incorporating high-order proximity information extracted from the hierarchical topological structure of the input graph. Specifically, multiscale groups obtained from trainable hierarchical semi-nonnegative matrix factorization are used to adjust the weights when aggregating one-hop neighbors. Integrated with the graph attention mechanism over the attributes of neighboring nodes, the learnable parameters within the aggregation process are optimized in an end-to-end manner. Extensive experiments show that the proposed method (hpGAT) outperforms state-of-the-art methods and demonstrate the importance of exploiting high-order proximity when handling noisy information in the local neighborhood.
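The sketch below is not the authors' released code; it is a minimal illustration of the idea the abstract describes: a one-hop graph-attention layer whose attention coefficients are re-weighted by a high-order proximity score computed from soft group memberships (a stand-in for the hierarchical semi-NMF factors). The class name `HighOrderGATLayer` and the `groups` input are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HighOrderGATLayer(nn.Module):
    """One-hop attention aggregation modulated by high-order proximity (sketch)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # feature transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scorer

    def forward(self, x, adj, groups):
        # x:      (N, in_dim)      node attributes
        # adj:    (N, N)           one-hop adjacency with self-loops
        # groups: (N, K)           non-negative multiscale group memberships
        h = self.W(x)                                      # (N, out_dim)
        N = h.size(0)
        # Pairwise attention logits from concatenated node features.
        hi = h.unsqueeze(1).expand(N, N, -1)
        hj = h.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        # High-order proximity: similarity of group memberships.
        proximity = groups @ groups.t()                    # (N, N)
        # Modulate attribute-based attention by structural proximity,
        # restricted to one-hop neighbors.
        e = e + torch.log(proximity + 1e-8)
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=1)                    # aggregation weights
        return F.elu(alpha @ h)


if __name__ == "__main__":
    N, in_dim, out_dim, K = 5, 8, 16, 3
    x = torch.randn(N, in_dim)
    adj = (torch.rand(N, N) > 0.5).float()
    adj.fill_diagonal_(1.0)                                # keep self-loops
    groups = torch.rand(N, K)                              # non-negative memberships
    layer = HighOrderGATLayer(in_dim, out_dim)
    print(layer(x, adj, groups).shape)                     # torch.Size([5, 16])
```

In the paper the group memberships are themselves trainable (learned via hierarchical semi-NMF and optimized end-to-end); here they are passed in as a fixed tensor purely to keep the sketch self-contained.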

Keywords: high-order proximity; graph attention

Journal Title: IEEE Access
Year Published: 2019
