Approximate Spectral Clustering: Efficiency and Guarantees (1509.09188v5)
Abstract: Approximate Spectral Clustering (ASC) is a popular and successful heuristic for partitioning the nodes of a graph $G$ into clusters for which the ratio of outside connections to the volume (sum of degrees) is small. ASC consists of the following two subroutines: i) compute an approximate Spectral Embedding via the Power method; and ii) partition the resulting vector set with an approximate $k$-means clustering algorithm. The resulting $k$-means partition naturally induces a $k$-way node partition of $G$. We give a comprehensive analysis of ASC, building on the work of Peng et al.~(SICOMP'17), Boutsidis et al.~(ICML'15) and Ostrovsky et al.~(JACM'13). We show that ASC i) runs efficiently, and ii) yields a good approximation of an optimal $k$-way node partition of $G$. Moreover, we strengthen the quality guarantees of a structural result of Peng et al. by a factor of $k$, and simultaneously weaken the eigenvalue gap assumption. Further, we show that ASC finds a $k$-way node partition of $G$ with the strengthened quality guarantees.
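The two-stage pipeline described in the abstract (a power-method spectral embedding followed by approximate $k$-means) can be sketched in a few lines of code. The sketch below is an illustration only, not the authors' exact algorithm: the choice of the lazy random-walk matrix, the number of power iterations, and the use of scikit-learn's KMeans as the approximate $k$-means step are all assumptions made for this example.

```python
import numpy as np
from sklearn.cluster import KMeans

def approximate_spectral_clustering(A, k, power_iters=50, seed=0):
    """Minimal ASC sketch: power-method spectral embedding + k-means.

    A: symmetric adjacency matrix as a dense numpy array (assumption of
    this sketch); k: desired number of clusters.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    # Symmetric lazy random-walk matrix M = 1/2 (I + D^{-1/2} A D^{-1/2});
    # its top-k eigenvectors span the same space as the bottom-k
    # eigenvectors of the normalized Laplacian.
    M = 0.5 * (np.eye(n) + d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :])

    # Block power (subspace) iteration approximating the top-k eigenspace of M.
    Y = rng.standard_normal((n, k))
    for _ in range(power_iters):
        Y, _ = np.linalg.qr(M @ Y)  # re-orthogonalize to keep the block well-conditioned

    # Rows of Y form the approximate spectral embedding; cluster them with k-means.
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(Y)
    return labels

# Example usage on a toy graph with two dense blocks joined by one edge.
if __name__ == "__main__":
    B = np.ones((5, 5)) - np.eye(5)
    A = np.block([[B, np.zeros((5, 5))], [np.zeros((5, 5)), B]])
    A[4, 5] = A[5, 4] = 1.0
    print(approximate_spectral_clustering(A, k=2))
```

The returned labels induce the $k$-way node partition of $G$ discussed in the abstract; for large sparse graphs one would replace the dense matrix with a sparse representation, but the structure of the two subroutines stays the same.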