Emergent Mind

Abstract

The Sparsest Cut is a fundamental optimization problem that has been extensively studied. For planar inputs the problem is in $P$ and can be solved in $\tilde{O}(n^3)$ time if all vertex weights are $1$. Despite a significant amount of effort, the best algorithms date back to the early 90's and can only achieve an $O(\log n)$-approximation in $\tilde{O}(n)$ time or a constant-factor approximation in $\tilde{O}(n^2)$ time [Rao, STOC '92]. Our main result is an $\Omega(n^{2-\epsilon})$ lower bound for Sparsest Cut even in planar graphs with unit vertex weights, under the $(\min,+)$-Convolution conjecture, showing that approximations are inevitable in the near-linear time regime. To complement the lower bound, we provide a constant-factor approximation in near-linear time, improving upon the 25-year-old result of Rao in both time and accuracy. Our lower bound answers a repeatedly raised challenge by being the first fine-grained lower bound for a natural planar graph problem in $P$.

Moreover, we prove near-quadratic lower bounds under SETH for variants of the Closest Pair problem in planar graphs, and use them to show that the popular Average-Linkage procedure for Hierarchical Clustering cannot be simulated in truly subquadratic time.

Finally, we prove an $\Omega(n/\log{n})$ lower bound on the number of communication rounds required to compute the weighted diameter of a network in the CONGEST model, even when the underlying graph is planar and all nodes are $D=4$ hops away from each other. This is the first poly($n$) + $\omega(D)$ lower bound in the planar-distributed setting, and it complements the recent poly$(D, \log{n})$ upper bounds of Li and Parter [STOC 2019] for (exact) unweighted diameter and for ($1+\epsilon$)-approximate weighted diameter.
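For concreteness, the unit-weight Sparsest Cut objective and the $(\min,+)$-convolution operation underlying the conjectured lower bound can be sketched as follows. This is a brute-force illustration of the definitions only, not the paper's algorithms or reductions; the function names are ours:

```python
from itertools import combinations

def uniform_sparsity(n, edges, S):
    """Sparsity of the cut (S, V \\ S) with unit vertex weights:
    number of crossing edges divided by |S| * |V \\ S|."""
    S = set(S)
    crossing = sum(1 for u, v in edges if (u in S) != (v in S))
    return crossing / (len(S) * (n - len(S)))

def brute_force_sparsest_cut(n, edges):
    """Exponential-time baseline: minimize sparsity over all nontrivial cuts."""
    return min(
        uniform_sparsity(n, edges, S)
        for k in range(1, n)
        for S in combinations(range(n), k)
    )

def min_plus_convolution(a, b):
    """(min,+)-convolution: c[k] = min over i + j == k of a[i] + b[j].
    The conjecture asserts this requires essentially quadratic time."""
    c = [float("inf")] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] = min(c[i + j], ai + bj)
    return c
```

For example, on a 3-vertex path the sparsest cut isolates an endpoint, cutting one edge against a product of $1 \cdot 2$ vertices, for sparsity $1/2$.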
