
Relating modularity maximization and stochastic block models in multilayer networks

(arXiv:1804.01964)
Published Apr 5, 2018 in cs.SI, math.PR, physics.data-an, and physics.soc-ph

Abstract

Characterizing large-scale organization in networks, including multilayer networks, is one of the most prominent topics in network science and is important for many applications. One type of mesoscale feature is community structure, in which sets of nodes are densely connected internally but sparsely connected to other dense sets of nodes. Two of the most popular approaches for community detection are to maximize an objective function called "modularity" and to perform statistical inference using stochastic block models. Generalizing work by Newman on monolayer networks (Physical Review E 94, 052315), we show in multilayer networks that maximizing modularity is equivalent, under certain conditions, to maximizing the posterior probability of community assignments under a suitably chosen stochastic block model. We derive versions of this equivalence for various types of multilayer structure, including temporal, multiplex, and multilevel networks. We consider cases in which the key parameters are constant, as well as ones in which they vary across layers; in the latter case, this yields a novel, layer-weighted version of the modularity function. Our results also help address a longstanding difficulty of multilayer modularity-maximization algorithms, which require the specification of two sets of tuning parameters that have been difficult to choose in practice. We show how to perform this parameter selection in a statistically grounded way, and we demonstrate the effectiveness of our approach on both synthetic and empirical networks.
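For readers who want the objects behind these statements, here is a sketch using standard notation from the cited literature (Mucha et al., Science 328, 876, 2010, and Newman, Physical Review E 94, 052315); the notation is drawn from those references as an assumption, not extracted from this paper's full text. Multilayer modularity maximization seeks community assignments $g_{is}$ (node $i$ in layer $s$) that maximize

\[
Q = \frac{1}{2\mu} \sum_{ijsr} \left[ \left( A_{ijs} - \gamma_s \frac{k_{is} k_{js}}{2 m_s} \right) \delta_{sr} + \delta_{ij}\, C_{jsr} \right] \delta(g_{is}, g_{jr}),
\]

where the intralayer resolution parameters $\gamma_s$ and the interlayer coupling parameters $C_{jsr}$ are the two sets of tuning parameters the abstract refers to. Newman's monolayer equivalence, which the paper generalizes, ties the resolution parameter to the fitted within- and between-community edge rates $\omega_{\mathrm{in}}$ and $\omega_{\mathrm{out}}$ of a degree-corrected planted-partition stochastic block model:

\[
\gamma = \frac{\omega_{\mathrm{in}} - \omega_{\mathrm{out}}}{\ln \omega_{\mathrm{in}} - \ln \omega_{\mathrm{out}}},
\]

which is the kind of statistically grounded parameter choice the abstract describes extending to the multilayer setting.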
