Relative entropies and their use in quantum information theory

(1611.08802)
Published Nov 27, 2016 in quant-ph, cs.IT, math-ph, math.IT, and math.MP

Abstract

This dissertation investigates relative entropies, also called generalized divergences, and how they can be used to characterize information-theoretic tasks in quantum information theory. The main goal is to further refine characterizations of the optimal rates for quantum source coding, state redistribution, and measurement compression with quantum side information via second-order asymptotic expansions and strong converse theorems. The dissertation consists of a mathematical and an information-theoretic part. In the mathematical part, we focus on the $\alpha$-sandwiched R\'enyi divergence ($\alpha$-SRD). We first investigate the limit $\alpha\to 0$ to determine whether this recovers the well-known $0$-R\'enyi relative entropy. We then prove various new results for entropic quantities derived from the $\alpha$-SRD, including dimension bounds and useful bounds in terms of the fidelity between two quantum states. Furthermore, we derive a necessary and sufficient algebraic condition for equality in the data processing inequality (viz. monotonicity under quantum operations) for the $\alpha$-SRD, and give applications to entropic bounds. In the information-theoretic part, we first derive the second-order asymptotics of visible quantum source coding using a mixed source. For the achievability part, we develop universal quantum source codes achieving a given second-order rate for a memoryless source. As a corollary of the main result, we obtain the second-order asymptotics of quantum source coding using a single memoryless source. We then prove strong converse theorems for state redistribution (with or without feedback) and measurement compression with quantum side information. The key ingredients in proving these theorems are the aforementioned fidelity bounds on R\'enyi entropic quantities derived from the $\alpha$-SRD.
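For reference, the central quantity of the mathematical part, the $\alpha$-sandwiched R\'enyi divergence, has the standard definition in the literature (for $\alpha\in(0,1)\cup(1,\infty)$, assuming $\mathrm{supp}\,\rho\subseteq\mathrm{supp}\,\sigma$ when $\alpha>1$)
$$\widetilde{D}_\alpha(\rho\|\sigma)=\frac{1}{\alpha-1}\log\mathrm{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right],$$
which converges to the Umegaki relative entropy $D(\rho\|\sigma)=\mathrm{Tr}[\rho(\log\rho-\log\sigma)]$ as $\alpha\to 1$ and satisfies $\widetilde{D}_{1/2}(\rho\|\sigma)=-2\log F(\rho,\sigma)$, where $F$ denotes the fidelity; relations of this kind underlie the fidelity bounds mentioned in the abstract.

The short Python sketch below evaluates this formula numerically for a pair of density matrices via eigendecompositions. It is an illustrative sketch of the standard definition only, not code from the dissertation, and the example states are hypothetical.

```python
import numpy as np

def sandwiched_renyi_divergence(rho, sigma, alpha, tol=1e-12):
    """Numerically evaluate the alpha-sandwiched Renyi divergence
    (1/(alpha-1)) * log Tr[(sigma^((1-alpha)/(2*alpha)) rho sigma^((1-alpha)/(2*alpha)))^alpha].

    rho, sigma: density matrices (Hermitian, positive semidefinite, unit trace).
    Uses the natural logarithm; divide by np.log(2) for bits.
    Powers of sigma are taken on its support (pseudo-power), so the result
    assumes supp(rho) is contained in supp(sigma) when alpha > 1.
    """
    exponent = (1.0 - alpha) / (2.0 * alpha)
    # sigma^exponent via eigendecomposition, restricted to the support of sigma.
    w, v = np.linalg.eigh(sigma)
    w_pow = np.zeros_like(w)
    pos = w > tol
    w_pow[pos] = w[pos] ** exponent
    sigma_pow = (v * w_pow) @ v.conj().T
    # The "sandwiched" operator sigma^e rho sigma^e is Hermitian and PSD.
    sandwich = sigma_pow @ rho @ sigma_pow
    ev = np.linalg.eigvalsh(sandwich)
    ev = ev[ev > tol]
    return float(np.log(np.sum(ev ** alpha)) / (alpha - 1.0))

if __name__ == "__main__":
    # Hypothetical qubit states, chosen for illustration only.
    rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
    sigma = np.eye(2, dtype=complex) / 2
    for a in (0.5, 0.999, 2.0):
        print(f"alpha = {a}: {sandwiched_renyi_divergence(rho, sigma, a):.6f}")
    # Sanity checks: at alpha = 1/2 the value equals -2*log F(rho, sigma);
    # as alpha -> 1 it approaches Tr[rho (log rho - log sigma)].
```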
