
Abstract

The paper considers the input-constrained binary erasure channel (BEC) with causal, noiseless feedback. The channel input sequence respects the $(d,\infty)$-runlength limited (RLL) constraint, i.e., any pair of successive $1$s must be separated by at least $d$ $0$s. We derive upper and lower bounds on the feedback capacity of this channel, for all $d\geq 1$, given by: $\max\limits_{\delta \in [0,\frac{1}{d+1}]}R(\delta) \leq C_{\text{fb}}^{(d,\infty)}(\epsilon) \leq \max\limits_{\delta \in [0,\frac{1}{1+d\epsilon}]}R(\delta)$, where the function $R(\delta) = \frac{h_b(\delta)}{d\delta + \frac{1}{1-\epsilon}}$, with $\epsilon\in [0,1]$ denoting the channel erasure probability, and $h_b(\cdot)$ being the binary entropy function. We note that our bounds are tight for $d=1$ (see Sabag et al. (2016)), and, in addition, we demonstrate that for $d=2$, the feedback capacity equals the capacity with non-causal knowledge of erasures, for $\epsilon \in \left[0,1-\frac{1}{2\log(3/2)}\right]$. For $d>1$, our bounds differ from the non-causal capacities (which serve as upper bounds on the feedback capacity) derived in Peled et al. (2019) only in the domains of maximization. The approach in this paper follows Sabag et al. (2017), deriving single-letter bounds on the feedback capacity based on output distributions supported on a finite $Q$-graph, which is a directed graph with edges labelled by output symbols.
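Since both bounds are one-dimensional maximizations of the same closed-form function $R(\delta)$, they are straightforward to evaluate numerically. Below is a minimal Python sketch (not from the paper): it assumes $h_b$ is the binary entropy in bits (base-2 logarithm) and approximates each maximum by a grid search over the stated domain; the function names and grid resolution are illustrative choices, not part of the paper.

```python
import numpy as np

def h_b(delta):
    """Binary entropy in bits; endpoints clipped so h_b(0), h_b(1) -> 0."""
    delta = np.clip(delta, 1e-12, 1 - 1e-12)
    return -delta * np.log2(delta) - (1 - delta) * np.log2(1 - delta)

def R(delta, d, eps):
    """Rate function from the abstract: h_b(delta) / (d*delta + 1/(1-eps))."""
    return h_b(delta) / (d * delta + 1.0 / (1.0 - eps))

def capacity_bounds(d, eps, grid_points=200_000):
    """Grid-search the two single-letter bounds (eps must be < 1).

    Lower bound: max of R over delta in [0, 1/(d+1)].
    Upper bound: max of R over delta in [0, 1/(1+d*eps)].
    """
    lower = R(np.linspace(0.0, 1.0 / (d + 1), grid_points), d, eps).max()
    upper = R(np.linspace(0.0, 1.0 / (1 + d * eps), grid_points), d, eps).max()
    return lower, upper

if __name__ == "__main__":
    for d, eps in [(1, 0.3), (2, 0.1), (2, 0.3)]:
        lo, up = capacity_bounds(d, eps)
        print(f"d={d}, eps={eps}: {lo:.6f} <= C_fb <= {up:.6f}")
```

For $d=1$, the two printed values should coincide, since the maximizer of $R$ lies inside both domains; this is consistent with the tightness of the bounds noted above.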
