Maximum Selection and Sorting with Adversarial Comparators and an Application to Density Estimation (1606.02786v1)

Published 9 Jun 2016 in cs.DS, cs.IT, and math.IT

Abstract: We study maximum selection and sorting of $n$ numbers using pairwise comparators that output the larger of their two inputs if the inputs are more than a given threshold apart, and output an adversarially chosen input otherwise. We consider two adversarial models: a non-adaptive adversary that decides on the outcomes in advance based solely on the inputs, and an adaptive adversary that can decide on the outcome of each query depending on previous queries and outcomes. Against the non-adaptive adversary, we derive a maximum-selection algorithm that uses at most $2n$ comparisons in expectation, and a sorting algorithm that uses at most $2n \ln n$ comparisons in expectation. These numbers are within small constant factors of the best possible. Against the adaptive adversary, we propose a maximum-selection algorithm that uses $\Theta(n\log(1/\epsilon))$ comparisons to output a correct answer with probability at least $1-\epsilon$. The existence of this algorithm affirmatively resolves an open problem of Ajtai, Feldman, Hassidim, and Nelson. Our study was motivated by a density-estimation problem where, given samples from an unknown underlying distribution, we would like to find a distribution in a known class of $n$ candidate distributions that is close to the underlying distribution in $\ell_1$ distance. Scheffé's algorithm outputs a distribution at an $\ell_1$ distance at most 9 times the minimum and runs in time $\Theta(n^2\log n)$. Using maximum selection, we propose an algorithm with the same approximation guarantee but a running time of $\Theta(n\log n)$.
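
The comparator model in the abstract is easy to simulate, which helps build intuition for why near-tie comparisons are problematic. The Python sketch below is only an illustration under assumed names: `make_nonadaptive_comparator` models a non-adaptive adversary that pre-commits its answer for each near-tie pair, and `sequential_max` is a deliberately naive single-pass selection, not the paper's randomized algorithm, included only to show how adversarial near-tie answers can make the returned value drift.

```python
# Illustrative sketch only: the comparator simulates the adversarial model
# described in the abstract, and `sequential_max` is a naive selection
# procedure, not the paper's algorithm. All names here are hypothetical.
import random


def make_nonadaptive_comparator(threshold, rng):
    """Comparator that returns the larger input when the inputs are more than
    `threshold` apart, and otherwise returns an answer fixed in advance as a
    function of the (unordered) input pair, mimicking a non-adaptive adversary."""
    fixed_choices = {}  # pre-committed answers for near-tie pairs

    def compare(x, y):
        if abs(x - y) > threshold:
            return max(x, y)
        key = (min(x, y), max(x, y))
        if key not in fixed_choices:
            # The adversary commits to an arbitrary answer for this pair once,
            # independent of the query sequence; modeled here by a coin flip.
            fixed_choices[key] = rng.choice(key)
        return fixed_choices[key]

    return compare


def sequential_max(values, compare):
    """Naive single-pass selection: keep whatever the comparator returns.
    Each near-tie comparison can pull the running winner down by up to
    `threshold`, so errors may accumulate over the pass; the paper's
    randomized algorithm avoids this while using at most 2n comparisons
    in expectation against the non-adaptive adversary."""
    current = values[0]
    for v in values[1:]:
        current = compare(current, v)
    return current


if __name__ == "__main__":
    rng = random.Random(0)
    data = [rng.uniform(0.0, 10.0) for _ in range(100)]
    compare = make_nonadaptive_comparator(threshold=0.5, rng=rng)
    print("true max:", max(data), "selected:", sequential_max(data, compare))
```

Running the sketch shows the selected value falling below the true maximum when many near-tie pairs are answered adversarially; the abstract's guarantees (at most $2n$ expected comparisons non-adaptively, and $\Theta(n\log(1/\epsilon))$ comparisons adaptively) refer to the paper's algorithms, which this sketch does not implement.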
