Entropy and Prefixes

Paul C. Shields
The Annals of Probability
Vol. 20, No. 1 (Jan., 1992), pp. 403-409
Stable URL: http://www.jstor.org/stable/2244563
Page Count: 7


Abstract

Grassberger suggested an interesting entropy estimator, namely, $n \log n / \sum_{i=1}^{n} L_i^n$, where $L_i^n$ is the length of the shortest prefix of $x_i, x_{i+1}, \ldots$ which is not a prefix of any other $x_j, x_{j+1}, \ldots$, for $j \le n$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean type result is proved for the general case, namely, given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the $L_i^n / \log n$ will be within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
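The estimator described in the abstract can be illustrated with a short, naive sketch (not taken from the paper): for each starting position $i$, grow the prefix of $x_i, x_{i+1}, \ldots$ until no other starting position shares it, then combine the lengths as $n \log n / \sum_i L_i^n$. The function name and the $O(n^2)$ brute-force search are illustrative assumptions, not the paper's construction.

```python
import math

def grassberger_entropy_estimate(x):
    """Naive sketch of the Grassberger estimator n*log(n) / sum_i L_i^n,
    where L_i^n is the length of the shortest prefix of x[i:] that is
    not a prefix of any other x[j:], j != i.  O(n^2 * max L) brute force;
    assumes the sample x is long enough for every such prefix to exist.
    """
    n = len(x)
    total = 0
    for i in range(n):
        L = 1
        # Grow the prefix until no other starting position shares it.
        # (Python slices truncate at the end of x, so the loop terminates.)
        while any(x[j:j + L] == x[i:i + L] for j in range(n) if j != i):
            L += 1
        total += L
    return n * math.log(n) / total
```

For example, on the short string `"aab"` the three prefix lengths are 2, 2, and 1, so the estimate is $3 \log 3 / 5$. On realistic data one would apply this to a long sample from the process, since the result concerns the limit as $n \to \infty$.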
