Noisy k-means++ Revisited (2307.13685v1)
Abstract: The $k$-means++ algorithm by Arthur and Vassilvitskii [SODA 2007] is a classical and time-tested algorithm for the $k$-means problem. Besides being very practical, the algorithm also has good theoretical guarantees: its solution is $O(\log k)$-approximate, in expectation. In a recent work, Bhattacharya, Eube, Röglin, and Schmidt [ESA 2020] considered the following question: does the algorithm retain its guarantees if we allow for a slight adversarial noise in the sampling probability distributions used by the algorithm? This is motivated, e.g., by the fact that computations with real numbers in $k$-means++ implementations are inexact. Surprisingly, the analysis under this scenario becomes substantially more difficult, and the authors were able to prove only a weaker approximation guarantee of $O(\log^2 k)$. In this paper, we close the gap by providing a tight $O(\log k)$ approximation guarantee for the $k$-means++ algorithm with noise.
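To make the noise model concrete, here is a minimal Python sketch of $k$-means++ seeding in which each $D^2$ sampling probability may be perturbed multiplicatively before a center is drawn. The function and parameter names (`noisy_kmeanspp_seeding`, `eps`) are illustrative, not from the paper, and the noise is drawn randomly here rather than chosen adversarially as in the paper's analysis.

```python
import numpy as np

def noisy_kmeanspp_seeding(X, k, eps, rng=None):
    """Sketch of k-means++ seeding with multiplicative noise on the
    D^2 sampling distribution. Each exact probability is scaled by a
    factor in [1 - eps, 1 + eps] and the result is renormalized.
    (Names and the random-noise choice are illustrative assumptions;
    the paper considers adversarial perturbations.)"""
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    centers = [X[rng.integers(n)]]  # first center: uniformly at random
    for _ in range(k - 1):
        # squared distance of each point to its nearest chosen center
        diffs = X[:, None, :] - np.asarray(centers)[None, :, :]
        d2 = np.min((diffs ** 2).sum(axis=-1), axis=1)
        p = d2 / d2.sum()  # exact D^2 distribution
        # perturb each probability by a factor in [1 - eps, 1 + eps]
        q = p * rng.uniform(1 - eps, 1 + eps, size=n)
        q /= q.sum()  # renormalize to a valid distribution
        centers.append(X[rng.choice(n, p=q)])
    return np.asarray(centers)

# usage: draw 5 noisy seeds from random 2-D data
X = np.random.default_rng(0).normal(size=(200, 2))
C = noisy_kmeanspp_seeding(X, k=5, eps=0.1)
```

The paper's result says that, even under such perturbations, the expected cost of the seeding remains $O(\log k)$ times the optimum, matching the noiseless guarantee.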