
Approximating Operator Norms via Generalized Krivine Rounding (1804.03644v2)

Published 10 Apr 2018 in cs.DS and math.FA

Abstract: We consider the $(\ell_p,\ell_r)$-Grothendieck problem, which seeks to maximize the bilinear form $y^\top A x$ for an input matrix $A$ over vectors $x,y$ with $\|x\|_p=\|y\|_r=1$. The problem is equivalent to computing the $p \to r^*$ operator norm of $A$. The case $p=r=\infty$ corresponds to the classical Grothendieck problem. Our main result is an algorithm for arbitrary $p,r \ge 2$ with approximation ratio $(1+\epsilon_0)/(\sinh^{-1}(1)\cdot \gamma_{p^*}\,\gamma_{r^*})$ for some fixed $\epsilon_0 \le 0.00863$. Comparing this with Krivine's approximation ratio of $(\pi/2)/\sinh^{-1}(1)$ for the original Grothendieck problem, our guarantee is off from the best known hardness factor of $(\gamma_{p^*}\gamma_{r^*})^{-1}$ for the problem by a factor similar to Krivine's defect. Our approximation follows by bounding the value of the natural vector relaxation for the problem, which is convex when $p,r \ge 2$. We give a generalization of random hyperplane rounding and relate the performance of this rounding to certain hypergeometric functions, which prescribe necessary transformations to the vector solution before the rounding is applied. Unlike Krivine's rounding, where the relevant hypergeometric function was $\arcsin$, we have to study a family of hypergeometric functions. The bulk of our technical work then involves methods from complex analysis to gain detailed information about the Taylor series coefficients of the inverses of these hypergeometric functions, which then dictate our approximation factor. Our result also implies improved bounds for "factorization through $\ell_2^n$" of operators from $\ell_p^n$ to $\ell_q^m$ (when $p\geq 2 \geq q$); such bounds are of significant interest in functional analysis, and our work provides modest supplementary evidence for an intriguing parallel between factorizability and constant-factor approximability.
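To make the rounding step concrete, here is a minimal sketch of classical random hyperplane rounding for the $p=r=\infty$ (bilinear Grothendieck) case: unit vectors from a vector relaxation are rounded to signs via a random Gaussian direction. This is only the baseline that the paper generalizes; the paper's rounding additionally applies transformations to the vector solution prescribed by hypergeometric functions, which this sketch omits. The function name and the repeated-trials loop are illustrative, not from the paper.

```python
import numpy as np

def hyperplane_round(U, V, A, trials=1000, seed=0):
    """Round unit vectors U[i], V[j] (rows of U, V) to signs x, y in
    {-1, +1} using a random Gaussian direction g, keeping the best
    bilinear value y^T A x over several trials.

    Illustrative sketch of classical random hyperplane rounding
    (the p = r = infinity case), not the paper's generalized rounding.
    """
    rng = np.random.default_rng(seed)
    d = U.shape[1]
    best = -np.inf
    for _ in range(trials):
        g = rng.standard_normal(d)       # random hyperplane normal
        x = np.sign(U @ g)               # x_i = sgn(<u_i, g>)
        y = np.sign(V @ g)               # y_j = sgn(<v_j, g>)
        best = max(best, y @ A @ x)      # bilinear value y^T A x
    return best
```

For instance, with identity relaxation vectors `U = V = np.eye(2)` and `A = [[1, -1], [-1, 1]]`, almost every trial set contains a sign pattern achieving the optimum value 4.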

Citations (7)
