$E_\gamma$-Resolvability (1511.07829v2)

Published 24 Nov 2015 in cs.IT and math.IT

Abstract: The conventional channel resolvability refers to the minimum rate needed for an input process to approximate the channel output distribution in total variation distance. In this paper we study $E_{\gamma}$-resolvability, in which total variation is replaced by the more general $E_{\gamma}$ distance. A general one-shot achievability bound for the precision of such an approximation is developed. Let $Q_{\sf X|U}$ be a random transformation, $n$ be an integer, and $E\in(0,+\infty)$. We show that in the asymptotic setting where $\gamma=\exp(nE)$, a (nonnegative) randomness rate above $\inf_{Q_{\sf U}:\,D(Q_{\sf X}\|\pi_{\sf X})\le E}\{D(Q_{\sf X}\|\pi_{\sf X})+I(Q_{\sf U},Q_{\sf X|U})-E\}$ is sufficient to approximate the output distribution $\pi_{\sf X}^{\otimes n}$ using the channel $Q_{\sf X|U}^{\otimes n}$, where $Q_{\sf U}\to Q_{\sf X|U}\to Q_{\sf X}$, and is also necessary in the case of finite $\mathcal{U}$ and $\mathcal{X}$. In particular, a randomness rate of $\inf_{Q_{\sf U}}I(Q_{\sf U},Q_{\sf X|U})-E$ is always sufficient. We also study the convergence of the approximation error under the high-probability criterion in the case of random codebooks. Moreover, by developing simple bounds relating $E_{\gamma}$ and other distance measures, we are able to determine the exact linear growth rate of the approximation errors measured in relative entropy and smooth Rényi divergences for a fixed input randomness rate. The new resolvability result is then used to derive 1) a one-shot upper bound on the probability of excess distortion in lossy compression, which is exponentially tight in the i.i.d. setting, 2) a one-shot version of the mutual covering lemma, and 3) a lower bound on the size of the eavesdropper list needed to include the actual message and a lower bound on the eavesdropper false-alarm probability in the wiretap channel problem, which is (asymptotically) ensemble-tight.
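The abstract does not restate the definition of the $E_\gamma$ distance it builds on. As a rough orientation, a common definition in the resolvability literature is $E_\gamma(P\|Q)=\max_{A}\,[P(A)-\gamma Q(A)]=\sum_x \max(P(x)-\gamma Q(x),\,0)$ for $\gamma\ge 1$, which reduces to total variation distance at $\gamma=1$. The following Python sketch (a minimal illustration under that assumed definition; the function name and example distributions are ours, not from the paper) computes this quantity over a finite alphabet:

```python
def e_gamma(p, q, gamma):
    """E_gamma distance between two finite distributions given as dicts.

    Uses the identity E_gamma(P||Q) = sum_x max(P(x) - gamma*Q(x), 0):
    the optimal event A is {x : P(x) > gamma * Q(x)}, so the maximum
    over events collapses to a sum of positive parts.
    """
    support = set(p) | set(q)
    return sum(max(p.get(x, 0.0) - gamma * q.get(x, 0.0), 0.0) for x in support)

p = {"a": 0.8, "b": 0.1, "c": 0.1}
q = {"a": 0.2, "b": 0.4, "c": 0.4}

print(e_gamma(p, q, 1.0))  # 0.6 -- at gamma = 1 this is the total variation distance
print(e_gamma(p, q, 2.0))  # 0.4 -- larger gamma gives a weaker (smaller) distance
```

Note how the distance is nonincreasing in $\gamma$; this is what makes the paper's choice $\gamma=\exp(nE)$ a strictly weaker approximation criterion than total variation, which in turn is why randomness rates below the conventional resolvability (by the amount $E$) can suffice.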

Citations (12)