
Signed Network Representation by Preserving Multi-Order Signed Proximity

Signed network representation is a key problem for signed network data. Previous studies have shown that expressive node representations can be learned by preserving multi-order signed proximity (SP). However, multi-order SP cannot be perfectly encoded by the limited samples extracted from random walks, which reduces effectiveness. To encode multi-order SP perfectly, we integrate the informativeness of infinitely many samples to construct high-level summaries of multi-order SP without explicit sampling. Based on these summaries, we propose SPMF, a method that obtains node representations through low-rank matrix approximation. We further establish the rationality of SPMF theoretically by examining its relationship with a powerful representation learning architecture. On sign inference and link prediction tasks over several real-world datasets, SPMF is empirically competitive with state-of-the-art methods. In addition, we design two tricks to improve the scalability of SPMF: one filters out less informative summaries, and the other is inspired by kernel techniques. Both empirically improve scalability while preserving effectiveness. The code for our methods is publicly available.
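To make the abstract's pipeline concrete, here is a minimal, illustrative sketch of the general idea: build a multi-order proximity summary of a signed network and obtain low-dimensional node representations via low-rank matrix approximation (truncated SVD). The toy adjacency matrix, the decaying sum of matrix powers used as the "summary", and the parameter values are all assumptions for illustration; they are not SPMF's actual summaries, which the paper constructs without explicit sampling.

```python
import numpy as np

# Toy signed adjacency matrix (hypothetical 4-node network):
# +1 = positive link, -1 = negative link, 0 = no link.
A = np.array([
    [ 0,  1, -1,  0],
    [ 1,  0,  1, -1],
    [-1,  1,  0,  1],
    [ 0, -1,  1,  0],
], dtype=float)

# A crude stand-in for a multi-order signed proximity summary:
# a decaying sum of powers of A (order k captures k-hop signed paths).
order, decay = 3, 0.5
M = sum((decay ** k) * np.linalg.matrix_power(A, k)
        for k in range(1, order + 1))

# Low-rank approximation: the top-d singular directions of the
# summary matrix yield d-dimensional node representations.
d = 2
U, S, Vt = np.linalg.svd(M)
embeddings = U[:, :d] * np.sqrt(S[:d])

print(embeddings.shape)  # (4, 2)
```

For real signed networks, the dense `matrix_power` and full SVD above would be replaced by sparse operations and a truncated solver (e.g. `scipy.sparse.linalg.svds`), which is in the spirit of the scalability tricks the abstract mentions.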

Keywords: network representation; multi-order; signed network; order

Journal Title: IEEE Transactions on Knowledge and Data Engineering
Year Published: 2023
