Emergent Mind

Link Prediction in Real-World Multiplex Networks via Layer Reconstruction Method

(1906.09422)
Published Jun 22, 2019 in physics.soc-ph , cs.SI , and physics.data-an

Abstract

A large body of research on the link prediction problem is devoted to finding missing links in single-layer (simplex) networks. The proposed link prediction methods compute a similarity measure between unconnected node pairs based on the observed structure of the network. However, extending the notion of similarity to multiplex networks poses a two-fold challenge. The layers of real-world multiplex networks do not share the same organization, yet their organizations are not entirely different. First, it must be determined how similar the layers of a multiplex network are. Second, it must be determined how similar layers can contribute to the link prediction task on a target layer with missing links. Eigenvectors are known to reflect the structural features of networks well. Therefore, two layers of a multiplex network are structurally similar if they share similar eigenvectors. Experiments show that the layers of real-world multiplex networks are similar with respect to structural features, and that the degree of similarity far exceeds that of their randomized counterparts. Furthermore, it is shown that missing links are highly predictable if their addition or removal does not significantly change the network's structural features. Otherwise, if the change is significant, a similar copy of the structural features may help. Based on this concept, the Layer Reconstruction Method (LRM) finds the best reconstruction of the observed structure of the target layer from the structural features of other, similar layers. Experiments on real multiplex networks from different disciplines show that this method benefits from information redundancy in the networks and keeps link prediction performance robust even under a high fraction of missing links.
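The abstract gives no pseudocode, but the core idea — comparing layers via their leading eigenvectors and scoring candidate links by reconstructing the target layer with a similar layer's eigenvectors — can be sketched roughly. The function names, the choice of `numpy.linalg.eigh`, the number of eigenvectors `k`, and the absolute-cosine similarity measure below are all illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def layer_eigvecs(A, k):
    """Leading-|eigenvalue| k eigenpairs of a symmetric adjacency matrix."""
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(-np.abs(vals))[:k]
    return vals[order], vecs[:, order]

def layer_similarity(A1, A2, k=2):
    """Mean absolute cosine similarity between the leading-k eigenvectors
    of two layers (eigenvector signs are arbitrary, hence the abs).
    Illustrative stand-in for the paper's layer-similarity notion."""
    _, V1 = layer_eigvecs(A1, k)
    _, V2 = layer_eigvecs(A2, k)
    return float(np.mean(np.abs(np.sum(V1 * V2, axis=0))))

def reconstruct_with_aux_eigvecs(A_target, A_aux, k=2):
    """Low-rank reconstruction of the target layer using a similar
    auxiliary layer's eigenvectors; entries for unconnected node pairs
    can then serve as link-prediction scores (sketch only)."""
    vals, _ = layer_eigvecs(A_target, k)
    _, V_aux = layer_eigvecs(A_aux, k)
    return V_aux @ np.diag(vals) @ V_aux.T
```

As a sanity check, comparing a layer with itself yields similarity 1.0, and a full-rank reconstruction from its own eigenvectors recovers the adjacency matrix exactly; in practice the auxiliary layer differs from the target, and the reconstruction quality depends on how similar their eigenvectors are.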
