
Contextual Search via Intrinsic Volumes

(1804.03195)
Published Apr 9, 2018 in cs.DS, cs.LG, and math.MG

Abstract

We study the problem of contextual search, a multidimensional generalization of binary search that captures many problems in contextual decision-making. In contextual search, a learner is trying to learn the value of a hidden vector $v \in [0,1]^d$. Every round the learner is provided an adversarially-chosen context $u_t \in \mathbb{R}^d$, submits a guess $p_t$ for the value of $\langle u_t, v\rangle$, learns whether $p_t < \langle u_t, v\rangle$, and incurs loss $\ell(\langle u_t, v\rangle, p_t)$ (for some loss function $\ell$). The learner's goal is to minimize their total loss over the course of $T$ rounds. We present an algorithm for the contextual search problem for the symmetric loss function $\ell(\theta, p) = |\theta - p|$ that achieves $O_d(1)$ total loss. We present a new algorithm for the dynamic pricing problem (which can be realized as a special case of the contextual search problem) that achieves $O_d(\log \log T)$ total loss, improving on the previous best known upper bounds of $O_d(\log T)$ and matching the known lower bounds (up to a polynomial dependence on $d$). Both algorithms make significant use of ideas from the field of integral geometry, most notably the notion of intrinsic volumes of a convex set. To the best of our knowledge this is the first application of intrinsic volumes to algorithm design.
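To make the round-by-round protocol in the abstract concrete, here is a minimal Python sketch of the interaction: the learner maintains a knowledge polytope inside $[0,1]^d$, guesses a value for $\langle u_t, v\rangle$, receives the binary feedback, and cuts the polytope accordingly. The midpoint guessing rule below is a naive placeholder used only for illustration; the paper's algorithm instead chooses the guess that balances intrinsic volumes of the knowledge set, which is not implemented here. The example assumes numpy and scipy are available.

```python
# Sketch of the contextual search protocol (not the paper's intrinsic-volume
# algorithm): the learner keeps a knowledge set K = {v : A v <= b} inside
# [0,1]^d and halves the feasible range of <u_t, v> each round.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
d, T = 3, 50
v_star = rng.uniform(0.0, 1.0, size=d)       # hidden vector v in [0,1]^d

A, b = np.empty((0, d)), np.empty(0)          # halfspace cuts collected so far
total_loss = 0.0

for t in range(T):
    u = rng.normal(size=d)                    # context (adversarial in the paper)
    u /= np.linalg.norm(u)

    # Feasible range of <u, v> over the current knowledge set.
    kwargs = dict(A_ub=A if len(b) else None, b_ub=b if len(b) else None,
                  bounds=[(0.0, 1.0)] * d, method="highs")
    lo = linprog(u, **kwargs).fun
    hi = -linprog(-u, **kwargs).fun

    p = 0.5 * (lo + hi)                       # naive midpoint guess (placeholder)
    theta = float(u @ v_star)
    total_loss += abs(theta - p)              # symmetric loss |theta - p|

    # Binary feedback: is p < <u, v>?  Cut the knowledge set accordingly.
    if p < theta:
        A, b = np.vstack([A, -u]), np.append(b, -p)   # keep <u, v> >= p
    else:
        A, b = np.vstack([A, u]), np.append(b, p)     # keep <u, v> <= p

print(f"total symmetric loss after {T} rounds: {total_loss:.4f}")
```

This baseline only illustrates the feedback model; achieving the $O_d(1)$ total-loss guarantee requires the intrinsic-volume-based guessing rule developed in the paper.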
