Entropy and Prefixes

Paul C. Shields
The Annals of Probability
Vol. 20, No. 1 (Jan., 1992), pp. 403-409
Stable URL: http://www.jstor.org/stable/2244563
Page Count: 7

Abstract

Grassberger suggested an interesting entropy estimator, namely, $n \log n / \sum_{i=1}^{n} L_i^n$, where $L_i^n$ is the length of the shortest prefix of $x_i, x_{i+1}, \ldots$ which is not a prefix of any other $x_j, x_{j+1}, \ldots$, for $j \leq n$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean-type result is proved for the general case, namely, given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the $L_i^n / \log n$ will be within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
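
As an illustration of the estimator described in the abstract, the following is a minimal Python sketch, not taken from the paper: the brute-force search, the function names, and the choice of natural logarithm (so the estimate is in nats) are assumptions made here for concreteness.

    import math
    import random

    def prefix_lengths(x, n):
        # L[i] = length of the shortest prefix of x[i:] that is not a
        # prefix of any other x[j:], j < n, j != i -- a 0-indexed
        # version of the abstract's L_i^n (naive O(n^2 * L) scan).
        L = []
        for i in range(n):
            l = 1
            while any(x[j:j + l] == x[i:i + l] for j in range(n) if j != i):
                l += 1
            L.append(l)
        return L

    def grassberger_estimate(x, n):
        # Grassberger's estimator n log n / sum_i L_i^n; natural log
        # is an assumption (the abstract does not fix the base).
        return n * math.log(n) / sum(prefix_lengths(x, n))

    # Usage: a fair-coin sequence has entropy log 2 ~ 0.693 nats per
    # symbol, so the printed estimate should be near that value.
    random.seed(0)
    x = ''.join(random.choice('01') for _ in range(2000))
    print(grassberger_estimate(x, 500))

Per the abstract, consistency holds for Markov chains (the i.i.d. coin sequence above is one), which is why this toy example is expected to land near $\log 2$; for a general ergodic process the paper shows the estimator can fail to converge.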
