Graph Attention Networks With Local Structure Awareness for Knowledge Graph Completion

Graph neural networks have proven very effective for representation learning on knowledge graphs. Recent methods, such as SACN and CompGCN, have achieved state-of-the-art results in knowledge graph completion. However, previous efforts mostly rely on localized first-order approximations of spectral graph convolutions or on first-order neighborhoods, ignoring abundant local structures such as cycles and stars. As a result, the diverse semantic information underlying these structures is not well captured, leaving room for better knowledge representations that ultimately benefit knowledge graph completion (KGC). In this work, we propose LSA-GAT, a graph attention network with a novel neighborhood aggregation strategy for knowledge graph completion. The model takes these local structures into account and derives representations that cover both semantic and structural information. Moreover, LSA-GAT is combined with a CNN-based decoder to form an encoder-decoder framework with a carefully designed training process. Experimental results show that the proposed LSA-GAT significantly outperforms current state-of-the-art methods on the FB15k-237 and WN18RR datasets.
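
The abstract describes the architecture only at a high level. As a rough illustration of the kind of encoder-decoder pipeline it outlines (a graph-attention encoder feeding a CNN-based, ConvE-style decoder), here is a minimal PyTorch sketch. The dense-adjacency attention, all layer sizes, and the toy setup are illustrative assumptions; in particular, LSA-GAT's local-structure-aware aggregation over cycles and stars is not reproduced here.

```python
# Minimal, illustrative sketch of a graph-attention encoder plus a ConvE-style
# CNN decoder for knowledge graph completion. Everything below (dense-adjacency
# attention, layer sizes, toy data) is an assumption for illustration only and
# does not reproduce LSA-GAT's local-structure-aware aggregation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionEncoder(nn.Module):
    """One dense graph-attention layer over an (N x N) entity adjacency matrix."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.proj(x)                                    # (N, d)
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)                # h_i broadcast over rows
        hj = h.unsqueeze(0).expand(n, n, -1)                # h_j broadcast over columns
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1))).squeeze(-1)
        e = e.masked_fill(adj == 0, float("-inf"))          # attend only to real neighbors
        alpha = torch.nan_to_num(torch.softmax(e, dim=-1))  # isolated nodes get zero weight
        return F.elu(alpha @ h)                             # aggregated entity embeddings


class ConvDecoder(nn.Module):
    """ConvE-style decoder: reshape head/relation embeddings into 2D grids,
    convolve them jointly, and score the result against every entity embedding."""

    def __init__(self, dim: int, height: int = 10, width: int = 20):
        super().__init__()
        assert height * width == dim
        self.height, self.width = height, width
        self.conv = nn.Conv2d(1, 32, kernel_size=3, padding=1)
        self.fc = nn.Linear(32 * 2 * height * width, dim)

    def forward(self, head: torch.Tensor, rel: torch.Tensor,
                all_entities: torch.Tensor) -> torch.Tensor:
        b = head.size(0)
        grid = torch.cat([head.view(b, 1, self.height, self.width),
                          rel.view(b, 1, self.height, self.width)], dim=2)
        feat = F.relu(self.conv(grid)).view(b, -1)
        query = self.fc(feat)                               # (B, d) query vector
        return query @ all_entities.t()                     # (B, N) scores over all tails


# Hypothetical toy usage (entity/relation counts and adjacency are made up).
num_entities, num_relations, dim = 50, 8, 200
ent = nn.Embedding(num_entities, dim)
rel = nn.Embedding(num_relations, dim)
adj = (torch.rand(num_entities, num_entities) < 0.1).float()

encoder = GraphAttentionEncoder(dim, dim)
decoder = ConvDecoder(dim)

ent_enc = encoder(ent.weight, adj)                          # structure-aware entity embeddings
heads = torch.tensor([0, 1])
rels = torch.tensor([2, 3])
scores = decoder(ent_enc[heads], rel(rels), ent_enc)        # (2, num_entities) tail scores
# A real training loop would apply binary cross-entropy over these scores,
# as is standard for ConvE-style decoders.
```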

Keywords: knowledge graph; graph attention; graph completion; graph; knowledge

Journal Title: IEEE Access
Year Published: 2020
