Graph representation learning has re-emerged as a fascinating research topic due to the successful application of graph convolutional networks (GCNs) to graphs, inspiring various downstream tasks such as node classification and link prediction. Nevertheless, existing GCN-based methods for graph representation learning mainly focus on static graphs. Although some methods consider the dynamic characteristics of networks, the global structure information, which helps a node gain worthwhile features from distant but valuable nodes, has not received enough attention. Moreover, these methods generally update node features by averaging the features of neighboring nodes, which may not effectively account for the importance of different neighbors during aggregation. In this article, we propose a novel representation learning method for dynamic graphs based on GCNs, called DGCN. More specifically, a long short-term memory (LSTM) network is used to update the weight parameters of the GCN, capturing global structure information across all time steps of a dynamic graph. In addition, a new Dice similarity is proposed to address the problem that the influence of directed neighbors is hard to observe, and it is further used to guide the aggregation. We evaluate the proposed method on node clustering and link prediction, and the experimental results show that DGCN generally outperforms the baseline methods.
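The two ingredients the abstract names, an LSTM that evolves the GCN weights across graph snapshots and a Dice similarity over neighbor sets, can be illustrated with a minimal numpy sketch. All names, shapes, and initializations below are our own assumptions for illustration; this is not the paper's DGCN implementation.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dice_similarity(A, u, v):
    # Dice coefficient on neighbor sets: 2|N(u) ∩ N(v)| / (|N(u)| + |N(v)|)
    Nu, Nv = A[u] > 0, A[v] > 0
    denom = Nu.sum() + Nv.sum()
    return 2.0 * (Nu & Nv).sum() / denom if denom else 0.0

class WeightLSTM:
    """Hypothetical LSTM cell whose input and hidden state are the GCN
    weight matrix itself, so the weights evolve from snapshot to snapshot."""
    def __init__(self, dim, rng):
        # Four gates (input, forget, output, candidate), each [W; W] -> dim
        self.Wg = rng.standard_normal((4, 2 * dim, dim)) * 0.1
        self.bg = np.zeros((4, dim))

    def step(self, W, c):
        z = np.concatenate([W, W], axis=-1)        # treat W as both x and h
        i = sigmoid(z @ self.Wg[0] + self.bg[0])   # input gate
        f = sigmoid(z @ self.Wg[1] + self.bg[1])   # forget gate
        o = sigmoid(z @ self.Wg[2] + self.bg[2])   # output gate
        g = np.tanh(z @ self.Wg[3] + self.bg[3])   # candidate state
        c = f * c + i * g
        return o * np.tanh(c), c                   # new weights, new cell

def gcn_layer(A_norm, H, W):
    # One graph convolution: ReLU(Â H W)
    return np.maximum(A_norm @ H @ W, 0.0)

rng = np.random.default_rng(0)
n, d = 5, 4                                  # 5 nodes, 4 features per node
W = rng.standard_normal((d, d)) * 0.1
c = np.zeros_like(W)
lstm = WeightLSTM(d, rng)

for t in range(3):                           # three snapshots of a dynamic graph
    A = (rng.random((n, n)) < 0.4).astype(float)
    A = np.triu(A, 1); A = A + A.T           # random undirected snapshot
    H = rng.standard_normal((n, d))
    W, c = lstm.step(W, c)                   # LSTM evolves the GCN weights
    H_out = gcn_layer(normalize_adj(A), H, W)

print(H_out.shape)                           # per-node embeddings: (5, 4)
print(0.0 <= dice_similarity(A, 0, 1) <= 1.0)
```

In this reading, sharing one LSTM across all time steps is what lets information from earlier snapshots shape the convolution weights at later ones; the Dice score would replace uniform averaging by weighting each neighbor's contribution during aggregation.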
               