
Rényi Resolvability and Its Applications to the Wiretap Channel (1707.00810v3)

Published 4 Jul 2017 in cs.IT, cs.CR, and math.IT

Abstract: The conventional channel resolvability problem refers to the determination of the minimum rate required for an input process so that the output distribution approximates a target distribution in either the total variation distance or the relative entropy. In contrast to previous works, in this paper, we use the (normalized or unnormalized) Rényi divergence (with the Rényi parameter in $[0,2]\cup\{\infty\}$) to measure the level of approximation. We also provide asymptotic expressions for the normalized Rényi divergence when the Rényi parameter is larger than or equal to $1$, as well as (lower and upper) bounds for the case when the same parameter is smaller than $1$. We characterize the Rényi resolvability, which is defined as the minimum rate required to ensure that the Rényi divergence vanishes asymptotically. The Rényi resolvabilities are the same for both the normalized and unnormalized divergence cases. In addition, when the Rényi parameter is smaller than $1$, consistent with the traditional case where the Rényi parameter is equal to $1$, the Rényi resolvability equals the minimum mutual information over all input distributions that induce the target output distribution. When the Rényi parameter is larger than $1$, the Rényi resolvability is, in general, larger than the mutual information. The optimal Rényi divergence is proven to vanish at least exponentially fast in both of these cases, as long as the code rate is larger than the Rényi resolvability. The optimal exponential rate of decay for i.i.d. random codes is also characterized exactly. We apply these results to the wiretap channel, and completely characterize the optimal tradeoff between the rates of the secret and non-secret messages when the leakage measure is given by the (unnormalized) Rényi divergence.
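As context for the abstract's approximation measure, the following is a minimal illustrative sketch (not from the paper) of the Rényi divergence $D_\alpha(P\|Q)$ for finite distributions, including the $\alpha \to 1$ limit (relative entropy / KL divergence) and the $\alpha = \infty$ case (max-divergence). The function name and interface are hypothetical.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats for finite distributions.

    For alpha not in {1, inf}:
        D_alpha = log( sum_x p(x)^alpha * q(x)^(1 - alpha) ) / (alpha - 1).
    alpha -> 1 recovers the relative entropy (KL divergence);
    alpha = inf gives the max-divergence log max_x p(x)/q(x).
    Assumes q(x) > 0 wherever p(x) > 0 (absolute continuity).
    """
    # Restrict to the support of P; terms with p(x) = 0 contribute nothing.
    support = [(px, qx) for px, qx in zip(p, q) if px > 0]
    if alpha == 1:
        return sum(px * math.log(px / qx) for px, qx in support)
    if math.isinf(alpha):
        return math.log(max(px / qx for px, qx in support))
    return math.log(sum(px ** alpha * qx ** (1 - alpha)
                        for px, qx in support)) / (alpha - 1)
```

The divergence is zero when $P = Q$ and is nondecreasing in $\alpha$, which is why the paper's $\alpha > 1$ criteria are stricter than the traditional relative-entropy ($\alpha = 1$) one.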

Citations (42)
