
A Simple Yet Effective Layered Loss for Pre-Training of Network Embedding


Pre-training of network embedding aims to encode unlabeled node proximity into a low-dimensional space, where nodes lie close to their neighbors and far from negative samples. In recent years, Graph Neural Networks have shown groundbreaking performance in semi-supervised learning on node classification and link prediction tasks. However, because of their inherent information-aggregation pattern, almost all of these methods yield inferior embeddings when pre-training on unlabeled nodes: the margins between a target node and its multi-hop neighbors become hard to distinguish during message aggregation. To address this problem, we propose a simple yet effective layered loss, combined with a graph attention network and dubbed LlossNet, for pre-training. We treat the proximity of a target node and its two-hop neighbors as a unit (called a unit graph), in which the target node is required to be closer to its direct neighbors than to its two-hop neighbors. As such, LlossNet is able to preserve the margins between nodes in the learned embedding space. Experimental results on various downstream tasks, including classification and clustering, demonstrate the effectiveness of our method in learning discriminative node representations.
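The abstract does not give the exact form of the layered loss, so the sketch below is only a rough illustration of the stated idea: within a unit graph, the target node should end up closer to each direct neighbor than to any of its two-hop neighbors. It assumes a Euclidean, triplet-style margin formulation in PyTorch; the function name layered_margin_loss, the margin value, and the shape of the embedding matrix (e.g. produced by a GAT encoder) are assumptions for the sake of the example, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def layered_margin_loss(z, target, one_hop, two_hop, margin=1.0):
    """Margin-style loss over one 'unit graph' (assumed formulation).

    z       : (num_nodes, dim) node embeddings, e.g. from a GAT encoder
    target  : int index of the target node
    one_hop : LongTensor of direct-neighbor indices
    two_hop : LongTensor of two-hop-neighbor indices
    """
    anchor = z[target]                              # (dim,)
    d_pos = torch.norm(anchor - z[one_hop], dim=1)  # distances to 1-hop neighbors
    d_neg = torch.norm(anchor - z[two_hop], dim=1)  # distances to 2-hop neighbors
    # Every (1-hop, 2-hop) pair should satisfy d_pos + margin <= d_neg.
    losses = F.relu(d_pos.unsqueeze(1) - d_neg.unsqueeze(0) + margin)
    return losses.mean()
```

In a full pre-training setup, a loss of this kind would presumably be summed over the unit graphs of all target nodes and optimized jointly with the graph attention encoder.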

Keywords: network; pre-training; network embedding

Journal Title: IEEE Transactions on Network Science and Engineering
Year Published: 2022


