
Network structural perturbation against interlayer link prediction

(2205.09079)
Published May 18, 2022 in physics.soc-ph and cs.SI

Abstract

Interlayer link prediction aims to match the same entities across the layers of a multiplex network. Existing studies seek more accurate, efficient, or general predictions based on network structure, attribute characteristics, or a combination of the two. Few of them analyze the effects of intralayer links; that is, few study the backbone structures that preserve predictive accuracy while relying on a smaller number of intralayer links. Such analysis can reveal which types of intralayer links matter most for correct prediction: are there intralayer links whose presence leads to worse predictive performance than their absence, and how can the prediction algorithms be attacked at minimum cost? To this end, two kinds of network structural perturbation methods are proposed. For the scenario where the structural information of the whole network is completely known, we offer a global perturbation strategy that assigns different perturbation weights to different types of intralayer links and then removes a predetermined proportion of intralayer links according to these weights. In contrast, if this information cannot be obtained all at once, we design a biased random-walk procedure, a local perturbation strategy, to carry out the perturbation. Four interlayer link prediction algorithms are evaluated on a variety of real-world and artificial perturbed multiplex networks. We find that the intralayer links connected to small-degree nodes have the most significant impact on prediction accuracy, while the intralayer links connected to large-degree nodes may even harm interlayer link prediction.
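The two perturbation strategies described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the specific edge weight `1/(deg(u)*deg(v))` and the `1/deg` walk bias are assumptions chosen only to favor edges incident to small-degree nodes, in line with the paper's finding that such links matter most.

```python
import random

def global_perturbation(edges, fraction):
    """Global strategy (sketch): rank intralayer edges by a weight and
    remove the top `fraction`.  The weight 1/(deg(u)*deg(v)) is an
    assumed example favoring edges that touch small-degree nodes."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    ranked = sorted(edges, key=lambda e: 1.0 / (deg[e[0]] * deg[e[1]]),
                    reverse=True)
    k = int(fraction * len(edges))          # predetermined proportion
    removed = ranked[:k]
    kept = [e for e in edges if e not in set(removed)]
    return kept, removed

def local_perturbation(edges, start, steps, rng=random):
    """Local strategy (sketch): a biased random walk that deletes each
    edge it traverses, preferring small-degree neighbors.  The 1/deg
    bias is an assumption for illustration."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    deg = {n: len(ns) for n, ns in adj.items()}
    removed, node = set(), start
    for _ in range(steps):
        nbrs = [n for n in adj[node] if frozenset((node, n)) not in removed]
        if not nbrs:
            break                            # walk is stuck; stop early
        weights = [1.0 / deg[n] for n in nbrs]
        nxt = rng.choices(nbrs, weights=weights, k=1)[0]
        removed.add(frozenset((node, nxt)))  # delete the traversed edge
        node = nxt
    return removed
```

For example, on a five-edge toy layer, `global_perturbation(edges, 0.4)` deletes the two edges with the highest weight, i.e. those attached to the lowest-degree endpoints, after which an interlayer prediction algorithm would be re-run on the perturbed layer to measure the accuracy drop.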
