
Learning low-rank latent mesoscale structures in networks (2102.06984v5)

Published 13 Feb 2021 in cs.SI, cs.LG, math.OC, physics.soc-ph, and stat.ML

Abstract: It is common to use networks to encode the architecture of interactions between entities in complex systems in the physical, biological, social, and information sciences. To study the large-scale behavior of complex systems, it is useful to examine mesoscale structures in networks as building blocks that influence such behavior. We present a new approach for describing low-rank mesoscale structures in networks, and we illustrate our approach using several synthetic network models and empirical friendship, collaboration, and protein--protein interaction (PPI) networks. We find that these networks possess a relatively small number of 'latent motifs' that together can successfully approximate most subgraphs of a network at a fixed mesoscale. We use an algorithm for 'network dictionary learning' (NDL), which combines a network-sampling method and nonnegative matrix factorization, to learn the latent motifs of a given network. The ability to encode a network using a set of latent motifs has a wide variety of applications to network-analysis tasks, such as comparison, denoising, and edge inference. Additionally, using a new network denoising and reconstruction (NDR) algorithm, we demonstrate how to denoise a corrupted network by using only the latent motifs that one learns directly from the corrupted network.
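The NDL pipeline described above combines two ingredients: sampling subgraphs at a fixed mesoscale and factorizing them with nonnegative matrix factorization (NMF). The sketch below is a simplified illustration of that idea, not the authors' implementation: it samples k-node subgraph "patches" along plain random walks (a stand-in for the paper's MCMC motif-sampling scheme), vectorizes their adjacency matrices, and learns a small nonnegative dictionary via multiplicative-update NMF. The toy two-block graph, the patch size, and the rank are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: two dense 10-node blocks joined by a bridge edge
# (a hypothetical example, not a network from the paper).
n = 20
A = np.zeros((n, n))
for block in (range(0, 10), range(10, 20)):
    for i in block:
        for j in block:
            if i != j and rng.random() < 0.6:
                A[i, j] = A[j, i] = 1.0
A[0, 10] = A[10, 0] = 1.0

def sample_patches(A, k=5, n_samples=200):
    """Sample k-node subgraphs along random walks and vectorize their
    adjacency matrices (a simplified stand-in for MCMC motif sampling)."""
    n = A.shape[0]
    patches = []
    for _ in range(n_samples):
        nodes = [int(rng.integers(n))]
        while len(nodes) < k:
            nbrs = np.flatnonzero(A[nodes[-1]])
            nodes.append(int(rng.choice(nbrs)) if nbrs.size else int(rng.integers(n)))
        idx = np.array(nodes)
        patches.append(A[np.ix_(idx, idx)].ravel())
    return np.array(patches).T  # shape (k*k, n_samples)

def nmf(X, r=4, n_iter=300, eps=1e-9):
    """Plain multiplicative-update NMF: X ~ W @ H with W, H >= 0.
    Columns of W play the role of the learned 'latent motifs'."""
    m, s = X.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, s)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

X = sample_patches(A, k=5)
W, H = nmf(X, r=4)
print(W.shape)  # (25, 4): four latent 5x5 motifs
```

Each column of `W`, reshaped to 5x5, is a nonnegative template whose combinations approximate the sampled subgraph patches; the paper's NDR denoising step then reconstructs the network from such a dictionary.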

Authors (4)
  1. Hanbaek Lyu (47 papers)
  2. Yacoub H. Kureh (4 papers)
  3. Joshua Vendrow (13 papers)
  4. Mason A. Porter (210 papers)
Citations (6)