
Deep Overlapping Community Search via Subspace Embedding

(2404.14692)
Published Apr 23, 2024 in cs.SI, cs.DB, and physics.soc-ph

Abstract

Community search (CS) aims to identify a set of nodes based on a specified query, leveraging structural cohesiveness and attribute homogeneity. This task has various applications, ranging from fraud detection to recommender systems. In contrast to algorithm-based approaches, graph neural network (GNN) based methods define communities using ground-truth labels, leveraging prior knowledge to explore patterns from graph structures and node features. However, existing solutions face three major limitations: 1) GNN-based models primarily focus on the disjoint community structure, disregarding the fact that nodes can belong to multiple communities. 2) These model structures suffer from low-order awareness and severe efficiency issues. 3) The identified community is subject to the free-rider and boundary effects. In this paper, we propose Simplified Multi-hop Attention Networks (SMN), which consist of three designs. First, we introduce a subspace community embedding technique called Sparse Subspace Filter (SSF). SSF enables the projection of community embeddings into distinct vector subspaces, accommodating the nature of overlapping and nesting community structures. In addition, we propose a lightweight model structure and a hop-wise attention mechanism to capture high-order patterns while improving model efficiency. Furthermore, two search algorithms are developed to minimize the latent space's community radius, addressing the challenges of free-rider and boundary effects. To the best of our knowledge, this is the first learning-based study of overlapping community search. Extensive experiments validate the superior performance of SMN compared with the state-of-the-art approaches. SMN achieves a 14.73% improvement in F1-score and up to 3 orders of magnitude acceleration in model efficiency.
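The abstract does not include implementation details, but the hop-wise attention idea can be illustrated with a minimal sketch: precompute normalized multi-hop propagations of node features once, then learn per-hop attention weights that combine them into node embeddings, avoiding stacked message-passing layers at query time. The class name HopWiseAttention, the identity adjacency, and the dimensions below are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class HopWiseAttention(nn.Module):
    """Combine precomputed multi-hop node features with learned per-hop attention."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)        # shared feature transform
        self.att = nn.Linear(hid_dim, 1, bias=False) # scores each hop representation

    def forward(self, hop_feats):
        # hop_feats: list of [N, in_dim] tensors; hop_feats[k] holds A_hat^k @ X
        h = torch.stack([self.lin(x) for x in hop_feats], dim=1)  # [N, K+1, hid_dim]
        scores = self.att(torch.tanh(h))                          # [N, K+1, 1]
        alpha = torch.softmax(scores, dim=1)                      # attention over hops
        return (alpha * h).sum(dim=1)                             # [N, hid_dim] embeddings

# Toy usage: hop features are precomputed once, so no per-layer message passing is needed later.
N, d, K = 5, 8, 2
A_hat = torch.eye(N)                      # placeholder for a normalized adjacency matrix
X = torch.randn(N, d)
hop_feats = [X]
for _ in range(K):
    hop_feats.append(A_hat @ hop_feats[-1])
emb = HopWiseAttention(d, 16)(hop_feats)  # -> node embeddings of shape [5, 16]
```

Under the same assumptions, one plausible reading of the radius-minimizing search is that candidate nodes are scored by the distance of their embeddings to the query community's representation and admitted only while the community's radius in latent space stays small, which is one way to counter the free-rider and boundary effects the abstract mentions.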
