Recovery from Non-Decomposable Distance Oracles (2209.05676v2)

Published 13 Sep 2022 in cs.DS, cs.CC, cs.IT, and math.IT

Abstract: A line of work has looked at the problem of recovering an input from distance queries. In this setting, there is an unknown sequence $s \in \{0,1\}^{\leq n}$, and one chooses a set of queries $y \in \{0,1\}^{\mathcal{O}(n)}$ and receives $d(s,y)$ for a distance function $d$. The goal is to make as few queries as possible to recover $s$. Although this problem is well-studied for decomposable distances, i.e., distances of the form $d(s,y) = \sum_{i=1}^n f(s_i, y_i)$ for some function $f$, which includes the important cases of Hamming distance, $\ell_p$-norms, and $M$-estimators, to the best of our knowledge this problem has not been studied for non-decomposable distances, for which there are important special cases such as edit distance, dynamic time warping (DTW), Frechet distance, earth mover's distance, and so on. We initiate the study and develop a general framework for such distances. Interestingly, for some distances such as DTW or Frechet, exact recovery of the sequence $s$ is provably impossible, and so we show that by allowing the characters in $y$ to be drawn from a slightly larger alphabet this becomes possible. In a number of cases we obtain optimal or near-optimal query complexity. We also study the role of adaptivity for a number of different distance functions. One motivation for understanding non-adaptivity is that the query sequence can be fixed, and the distances of the input to the queries provide a non-linear embedding of the input, which can be used in downstream applications involving, e.g., neural networks for natural language processing.
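
To make the query model concrete, here is a minimal sketch of the simplest decomposable case: recovering $s \in \{0,1\}^n$ from Hamming-distance queries with $n + 1$ non-adaptive queries (the all-zeros string plus each unit vector). The oracle interface and the assumption that the length $n$ is known are illustrative simplifications, not details taken from the paper.

```python
# Minimal sketch (assumed setup, not the paper's construction):
# recover s in {0,1}^n from a Hamming-distance oracle using a fixed,
# non-adaptive set of n + 1 queries.

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def recover_from_hamming_oracle(oracle, n):
    # Fixed query set: the all-zeros string and the n unit vectors e_i.
    zeros = [0] * n
    d0 = oracle(zeros)            # number of ones in s
    s = []
    for i in range(n):
        e_i = zeros.copy()
        e_i[i] = 1
        # Flipping position i changes the distance by -1 if s_i = 1, +1 if s_i = 0.
        s.append(1 if oracle(e_i) < d0 else 0)
    return s

secret = [1, 0, 1, 1, 0, 0, 1]
oracle = lambda y: hamming(secret, y)
assert recover_from_hamming_oracle(oracle, len(secret)) == secret
```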

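One way to see how exact recovery can fail for a non-decomposable distance over the same alphabet, sketched here independently of the paper's proofs: under the discrete Frechet distance, which takes the maximum cost along a monotone coupling, a sequence and its run-length expansion are indistinguishable by any binary query. The pair of sequences and the brute-force check below are illustrative choices, not taken from the paper.

```python
# Hedged illustration: discrete Frechet distance cannot separate a sequence
# from its run-length expansion, so binary queries alone cannot recover s exactly.
from functools import lru_cache
from itertools import product

def discrete_frechet(x, y):
    @lru_cache(maxsize=None)
    def d(i, j):
        cost = abs(x[i] - y[j])
        if i == 0 and j == 0:
            return cost
        if i == 0:
            return max(cost, d(0, j - 1))
        if j == 0:
            return max(cost, d(i - 1, 0))
        return max(cost, min(d(i - 1, j), d(i, j - 1), d(i - 1, j - 1)))
    return d(len(x) - 1, len(y) - 1)

s1, s2 = (0, 1), (0, 0, 1, 1)   # s2 is a run-length expansion of s1
for length in range(1, 6):
    for y in product((0, 1), repeat=length):
        assert discrete_frechet(s1, y) == discrete_frechet(s2, y)
print("every binary query of length <= 5 gives the same answer for s1 and s2")
```
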
Citations (3)
