
I am reading the paper "Generating random bits from an arbitrary source: fundamental limits" by Vembu and Verdu. The paper is written in the language of information theory, but I need to translate one of its quantities into computer-science terms.

The inf-entropy rate $\underline{H}(X)$ of an arbitrary source is defined (Definition 3 in the paper) as the largest real number $\alpha$ (possibly $\infty$) that satisfies $$\lim_{n\to\infty}P\left\{x^n:~\frac{1}{n}\log\frac{1}{P_{X^n}(x^n)}\leq \alpha-\epsilon\right\}=0$$ for all $\epsilon>0$.
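To make the definition concrete, here is a minimal simulation sketch (my own illustration, not from the paper), assuming a hypothetical i.i.d. Bernoulli(0.9) source: it estimates the probability appearing in Definition 3 with $\alpha$ set to the Shannon entropy $H(X)$, and the estimate shrinks toward zero as $n$ grows.

```python
# A minimal sketch (my own, not from the paper), assuming a hypothetical
# i.i.d. Bernoulli(0.9) source: estimate the probability in Definition 3,
# Pr{ (1/n) log 1/P_{X^n}(x^n) <= alpha - eps }, at alpha = H(X).
# For an i.i.d. source this probability should vanish as n grows.
import numpy as np

rng = np.random.default_rng(0)
p = 0.9                                             # Pr[X = 1]
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))    # Shannon entropy, bits/symbol
eps, trials = 0.05, 5_000

for n in [100, 1_000, 10_000]:
    k = (rng.random((trials, n)) < p).sum(axis=1)   # number of 1s in each string
    # normalized self-information (1/n) log2 1/P(x^n) of each sampled string
    z = (k * -np.log2(p) + (n - k) * -np.log2(1 - p)) / n
    print(n, (z <= H - eps).mean())                 # estimated probability -> 0
```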

Comparing this quantity with the definition of min-entropy, $H_{\infty}(X)=-\log\max_{x\in\mathcal{X}}P_X(x)$, we can say that for i.i.d. sources $\underline{H}(X)=H_{\infty}(X)$.

On the other hand, for all stationary and ergodic sources the Shannon-McMillan theorem gives $\underline{H}(X)=\lim_{n\to\infty}\frac{1}{n}H(X^n)$, as stated in expression (2) of the paper.

Since i.i.d. sources are a special class of stationary and ergodic sources, for which $\lim_{n\to\infty}\frac{1}{n}H(X^n)=H(X)$, the above would imply that for any i.i.d. source $H_{\infty}(X)=H(X)$, which is clearly not true in general (it holds only when the source is uniform; see the numerical check below). So there must be a mistake somewhere in the above argument. Could you please point out where the mistake is?
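For instance, here is the kind of counterexample I have in mind, a quick computation for a hypothetical Bernoulli(0.9) source: $H(X)\approx 0.469$ bits while $H_{\infty}(X)=-\log_2 0.9\approx 0.152$ bits.

```python
# A quick sanity check (hypothetical Bernoulli(0.9) source) that min-entropy
# and Shannon entropy disagree whenever the distribution is not uniform.
import math

p = 0.9
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # ~0.469 bits
H_inf = -math.log2(max(p, 1 - p))                     # ~0.152 bits
print(H, H_inf)  # H_inf < H, so H_inf(X) = H(X) fails here
```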

  • What is the min-entropy and what is $H_\infty$? Could you also indicate the beginning and the end of the "above argument" you are referring to? Commented Mar 19, 2014 at 19:41

1 Answer

The proper definition of the inf-entropy rate is
$$
\underline{H}(\mathbf{X})=\text{p-}\liminf_{n\to\infty}\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)},
$$
where $\mathbf{X}=(X_1,X_2,\dots)$ and the limit inferior in probability is
$$
\text{p-}\liminf_{n\to\infty}Z_n=\sup\left\{\beta\;\middle|\;\lim_{n\to\infty}\Pr(Z_n<\beta)=0\right\}.
$$

Now for an i.i.d. source $\mathbf{X}$ we have
$$
\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}=\frac{1}{n}\sum_{i=1}^{n}\log\frac{1}{P_X(X_i)}\geq \log\frac{1}{\max_{x\in\mathcal{X}}P_X(x)}=H_{\infty}(X),
$$
where $H_{\infty}(X)$ is the Rényi entropy of $X$ of order $\infty$ (the min-entropy). This means that
$$
\lim_{n\to\infty}\Pr\left\{\frac{1}{n}\log\frac{1}{P_{X^n}(X^n)}<H_{\infty}(X)\right\}=0.
$$
Based on the definition of the inf-entropy rate, this only lets you conclude
$$
H_{\infty}(X)\leq \underline{H}(\mathbf{X}).
$$
In fact, if $\mathbf{X}$ is i.i.d., the left-hand side of the inequality above converges in probability to $H(X)$ by the weak law of large numbers, so, as you mentioned yourself,
$$
\underline{H}(\mathbf{X})=H(X).
$$
So the mistake is that you did not take the $\sup$ in the definition of the inf-entropy rate: the vanishing-probability condition holds for $\alpha=H_{\infty}(X)$, but it also holds for every $\alpha$ up to $H(X)$.
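Here is a minimal numerical sketch of this point (my own, using the same hypothetical Bernoulli(0.9) source as in the question): the normalized self-information never falls below $H_{\infty}(X)$ for any realization, yet it concentrates at $H(X)$, so its $\text{p-}\liminf$ is $H(X)$, strictly above $H_{\infty}(X)$.

```python
# A minimal sketch (assuming a hypothetical i.i.d. Bernoulli(0.9) source):
# Z_n = (1/n) log2 1/P_{X^n}(X^n) never falls below H_inf(X), but it
# concentrates at H(X), so p-liminf Z_n = H(X) > H_inf(X).
import numpy as np

rng = np.random.default_rng(1)
p, n, trials = 0.9, 5_000, 5_000
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))   # ~0.469 bits
H_inf = -np.log2(max(p, 1 - p))                    # ~0.152 bits

k = (rng.random((trials, n)) < p).sum(axis=1)      # number of 1s per string
z = (k * -np.log2(p) + (n - k) * -np.log2(1 - p)) / n
print(z.min() >= H_inf)      # pointwise lower bound: always True
print(z.mean(), H)           # Z_n clusters around H(X), not H_inf(X)
```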
