
Entropy and Prefixes

Paul C. Shields
The Annals of Probability
Vol. 20, No. 1 (Jan., 1992), pp. 403-409
Stable URL: http://www.jstor.org/stable/2244563
Page Count: 7

Abstract

Grassberger suggested an interesting entropy estimator, namely, $n \log n / \sum_{i=1}^{n} L_i^n$, where $L_i^n$ is the length of the shortest prefix of $x_i, x_{i+1}, \ldots$ which is not a prefix of any other $x_j, x_{j+1}, \ldots$, for $j \le n$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean-type result is proved for the general case: given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the $L_i^n / \log n$ will be within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
