Emergent Mind

The Price of Hierarchical Clustering

(2205.01417)
Published May 3, 2022 in cs.DS

Abstract

Hierarchical clustering is a popular tool for understanding the hereditary properties of a data set. A hierarchical clustering is a sequence of clusterings that starts with the trivial clustering, in which every data point forms its own cluster, and then successively merges two existing clusters until all points lie in a single cluster. A hierarchical clustering achieves an approximation factor of $\alpha$ if the cost of each $k$-clustering in the hierarchy is at most $\alpha$ times the cost of an optimal $k$-clustering. We study as cost functions the maximum (discrete) radius of any cluster (the $k$-center problem) and the maximum diameter of any cluster (the $k$-diameter problem). In general, the optimal clusterings do not form a hierarchy, so an approximation factor of $1$ cannot be achieved. We call the smallest approximation factor that can be achieved on every instance the price of hierarchy. For the $k$-diameter problem we improve the upper bound on the price of hierarchy to $3+2\sqrt{2}\approx 5.83$. Moreover, we significantly improve the lower bounds for $k$-center and $k$-diameter, proving a price of hierarchy of exactly $4$ and $3+2\sqrt{2}$, respectively.
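The notions above can be made concrete with a small sketch. The following Python code builds a hierarchy greedily by always merging the two clusters whose union has the smallest diameter (a complete-linkage heuristic, chosen here for illustration; it is not the algorithm from the paper), then compares the diameter cost of each $k$-clustering in the hierarchy against a brute-force optimal $k$-diameter clustering on a tiny instance. All names and the example point set are illustrative assumptions.

```python
from itertools import combinations

def diameter(cluster, dist):
    # Maximum pairwise distance within one cluster (0 for singletons).
    return max((dist(a, b) for a, b in combinations(cluster, 2)), default=0.0)

def cost(clustering, dist):
    # k-diameter cost: the largest diameter over all clusters.
    return max(diameter(c, dist) for c in clustering)

def greedy_hierarchy(points, dist):
    # Start from singletons; repeatedly merge the pair of clusters whose
    # union has the smallest diameter. Record the k-clustering at every level.
    clusters = [frozenset([p]) for p in points]
    levels = {len(clusters): list(clusters)}
    while len(clusters) > 1:
        a, b = min(combinations(clusters, 2),
                   key=lambda pair: diameter(pair[0] | pair[1], dist))
        clusters = [c for c in clusters if c not in (a, b)] + [a | b]
        levels[len(clusters)] = list(clusters)
    return levels

def optimal_k_diameter(points, k, dist):
    # Brute force over all partitions into at most k parts (tiny n only).
    best = float("inf")
    def rec(i, parts):
        nonlocal best
        if i == len(points):
            if parts:
                best = min(best, cost([frozenset(p) for p in parts], dist))
            return
        for p in parts:                      # put point i into an existing part
            p.append(points[i]); rec(i + 1, parts); p.pop()
        if len(parts) < k:                   # or open a new part
            parts.append([points[i]]); rec(i + 1, parts); parts.pop()
    rec(0, [])
    return best

points = [0.0, 1.0, 2.0, 4.5]
dist = lambda a, b: abs(a - b)
levels = greedy_hierarchy(points, dist)
for k in range(1, len(points)):  # skip k = n, where both costs are 0
    hier = cost(levels[k], dist)
    opt = optimal_k_diameter(points, k, dist)
    print(f"k={k}: hierarchy diameter {hier}, optimal {opt}, ratio {hier/opt:.2f}")
```

On this instance the greedy hierarchy happens to match the optimal cost at every level; the paper's point is that on worst-case instances no hierarchy can beat a factor of $3+2\sqrt{2}$ for the diameter objective.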
