Entropy and Prefixes
Paul C. Shields
The Annals of Probability
Vol. 20, No. 1 (Jan., 1992), pp. 403-409
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2244563
Page Count: 7
Grassberger suggested an interesting entropy estimator, namely, $n \log n / \sum_{i=1}^{n} L_i^n$, where $L_i^n$ is the length of the shortest prefix of $x_i, x_{i+1}, \ldots$ which is not a prefix of any other $x_j, x_{j+1}, \ldots$, for $j \leq n$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean-type result is proved for the general case, namely, given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the $L_i^n / \log n$ will be within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
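For concreteness, here is a minimal Python sketch of how the estimator could be computed from a finite sample. It is not from the paper: the function names shortest_unique_prefix_lengths and grassberger_estimate are my own, suffixes are truncated at the end of the sample rather than treated as infinite, and logarithms are natural, so the estimate is in nats.

```python
import math
import random


def shortest_unique_prefix_lengths(x):
    """For each start position i, return L_i^n: the length of the shortest
    prefix of x[i:] that is not also a prefix of x[j:] for any other j <= n.

    Finite-sample caveat: the paper works with conceptually infinite
    sequences; here suffixes are simply truncated at the end of x.
    """
    n = len(x)
    lengths = []
    for i in range(n):
        L = 1
        # Grow the prefix until no other starting position shares it.
        # This terminates because distinct suffixes of a finite string
        # have distinct lengths and hence eventually differ.
        while any(x[j:j + L] == x[i:i + L] for j in range(n) if j != i):
            L += 1
        lengths.append(L)
    return lengths


def grassberger_estimate(x):
    """Grassberger's estimator n * log(n) / sum_i L_i^n, in nats."""
    n = len(x)
    return n * math.log(n) / sum(shortest_unique_prefix_lengths(x))


if __name__ == "__main__":
    # I.i.d. fair coin flips form a Markov chain, so by the paper's
    # consistency result the estimate should be roughly the true entropy
    # rate H = ln 2 ≈ 0.693 nats (convergence is slow, on the order of
    # log n, so expect only rough agreement at this sample size).
    random.seed(0)
    x = "".join(random.choice("01") for _ in range(500))
    print(grassberger_estimate(x))
```

This brute-force scan is quadratic in the sample length; it is meant only to make the definition of $L_i^n$ concrete, not to be efficient (a suffix-tree approach would compute the same lengths in near-linear time).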