Grassberger suggested an interesting entropy estimator, namely, $n \log n / \sum_{i=1}^{n} L_i^n$, where $L_i^n$ is the length of the shortest prefix of $x_i, x_{i+1}, \ldots$ which is not a prefix of any other $x_j, x_{j+1}, \ldots$, for $j \leq n$. We show that this estimator is not consistent for the general ergodic process, although it is consistent for Markov chains. A weaker trimmed-mean type result is proved for the general case, namely, given $\varepsilon > 0$, eventually almost surely all but an $\varepsilon$ fraction of the $L_i^n / \log n$ will be within $\varepsilon$ of $1/H$. A related Hausdorff dimension conjecture is shown to be false.
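The estimator above can be illustrated with a minimal sketch. This is not the paper's construction, only an assumed finite-sample reading of it: for a finite string $x_1, \ldots, x_n$, each $L_i^n$ is computed as one plus the longest match between the suffix starting at $i$ and any other suffix, with matches truncated at the end of the string. The function names (`match_lengths`, `grassberger_entropy`) are illustrative, not from the source.

```python
import math

def match_lengths(x):
    """For each position i, compute L_i: the length of the shortest
    prefix of x[i:] that is not a prefix of x[j:] for any other j <= n.
    Equivalently, one plus the longest match of x[i:] against any
    other suffix. Matches are truncated at the end of the finite
    sample, a simplification relative to the infinite-sequence setting."""
    n = len(x)
    L = []
    for i in range(n):
        longest = 0  # longest common prefix of x[i:] with any other suffix
        for j in range(n):
            if j == i:
                continue
            k = 0
            while i + k < n and j + k < n and x[i + k] == x[j + k]:
                k += 1
            longest = max(longest, k)
        L.append(longest + 1)
    return L

def grassberger_entropy(x):
    """Grassberger's estimator n * log(n) / sum_i L_i (in nats)."""
    n = len(x)
    return n * math.log(n) / sum(match_lengths(x))
```

For example, on the string `"abcabd"` the match lengths are `[3, 2, 1, 3, 2, 1]` (the suffix `"abcabd"` shares the prefix `"ab"` with `"abd"`, so $L_1 = 3$, and so on), giving the estimate $6 \log 6 / 12$. The naive double loop is $O(n^2)$ per position in the worst case; a suffix-tree implementation would be faster but obscures the definition.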
The Annals of Probability © 1992 Institute of Mathematical Statistics