A directed isoperimetric inequality with application to Bregman near neighbor lower bounds (1404.1191v2)

Published 4 Apr 2014 in cs.CG and cs.CC

Abstract: Bregman divergences $D_\phi$ are a class of divergences parametrized by a convex function $\phi$ and include well-known distance functions like $\ell_2^2$ and the Kullback-Leibler divergence. There has been extensive research on algorithms for problems like clustering and near neighbor search with respect to Bregman divergences; in all cases, the algorithms depend not just on the data size $n$ and dimensionality $d$, but also on a structure constant $\mu \ge 1$ that depends solely on $\phi$ and can grow without bound independently of $n$ and $d$. In this paper, we provide the first evidence that this dependence on $\mu$ might be intrinsic. We focus on the problem of approximate near neighbor search for Bregman divergences. We show that under the cell probe model, any non-adaptive data structure (like locality-sensitive hashing) for $c$-approximate near-neighbor search that admits $r$ probes must use space $\Omega(n^{1 + \frac{\mu}{cr}})$. In contrast, for LSH under $\ell_1$ the best bound is $\Omega(n^{1+\frac{1}{cr}})$. Our new tool is a directed variant of the standard Boolean noise operator. We show that a generalization of the Bonami-Beckner hypercontractivity inequality exists "in expectation" or upon restriction to certain subsets of the Hamming cube, and that this is sufficient to prove the desired isoperimetric inequality that we use in our data structure lower bound. We also present a structural result reducing the Hamming cube to a Bregman cube. This structure allows us to obtain lower bounds for problems under Bregman divergences from their $\ell_1$ analog. In particular, we get a (weaker) lower bound for approximate near neighbor search of the form $\Omega(n^{1 + \frac{1}{cr}})$ for an $r$-query non-adaptive data structure, and new cell probe lower bounds for a number of other near neighbor questions in Bregman space.
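For concrete intuition (this illustration is ours, not from the paper): a Bregman divergence is defined as $D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y), x - y \rangle$ for a strictly convex, differentiable $\phi$, and it is asymmetric in general, which is why "directed" tools are natural in this setting. The minimal Python sketch below shows how the two choices of $\phi$ named in the abstract recover squared Euclidean distance and the Kullback-Leibler divergence; the function names are hypothetical, not from the paper.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(x) = ||x||_2^2 recovers the squared Euclidean distance ||x - y||^2.
phi_sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x

# phi(x) = sum_i x_i log x_i (negative entropy) recovers the
# Kullback-Leibler divergence when x and y are probability distributions.
phi_ent = lambda x: np.sum(x * np.log(x))
grad_ent = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.3, 0.5])
y = np.array([0.3, 0.3, 0.4])

print(bregman_divergence(phi_sq, grad_sq, x, y))   # equals ||x - y||^2
print(bregman_divergence(phi_ent, grad_ent, x, y)) # equals KL(x || y)
print(bregman_divergence(phi_ent, grad_ent, y, x)) # differs: D is not symmetric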

Citations (16)
