Entropy and the Consistent Estimation of Joint Distributions
Katalin Marton and Paul C. Shields
The Annals of Probability
Vol. 22, No. 2 (Apr., 1994), pp. 960-977
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2244900
Page Count: 18
The kth-order joint distribution for an ergodic finite-alphabet process can be estimated from a sample path of length n by sliding a window of length k along the sample path and counting frequencies of k-blocks. In this paper the problem of consistent estimation when k = k(n) grows as a function of n is addressed. It is shown that the variational distance between the true k(n)-block distribution and the empirical k(n)-block distribution goes to 0 almost surely for the class of weak Bernoulli processes, provided k(n) ≤ (log n)/(H + ε), where H is the entropy of the process. The weak Bernoulli class includes the i.i.d. processes, the aperiodic Markov chains and functions thereof, and the aperiodic renewal processes. A similar result is also shown to hold for functions of irreducible Markov chains. This work sharpens prior results obtained for more general classes of processes by Ornstein and Weiss and by Ornstein and Shields, which used the d̄-distance rather than the variational distance.
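The sliding-window estimator and the variational distance described in the abstract can be sketched in a few lines of Python. This is an illustrative implementation, not taken from the paper; the function names are hypothetical, and the variational distance here follows the common total-variation convention with a factor of 1/2.

```python
from collections import Counter

def empirical_k_block_distribution(x, k):
    """Slide a width-k window along the sample path x and count k-blocks.

    Returns the empirical distribution over the n - k + 1 observed blocks.
    """
    n = len(x)
    counts = Counter(tuple(x[i:i + k]) for i in range(n - k + 1))
    total = n - k + 1
    return {block: c / total for block, c in counts.items()}

def variational_distance(p, q):
    """Variational (total variation) distance between two block distributions,
    using the convention (1/2) * sum |p(b) - q(b)|."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(b, 0.0) - q.get(b, 0.0)) for b in support)
```

For example, on the path 0,1,0,1,0,1 with k = 2, the five windows give empirical probabilities 3/5 for the block (0,1) and 2/5 for (1,0). The paper's consistency result says that, for weak Bernoulli processes, the variational distance between this empirical k(n)-block distribution and the true one vanishes almost surely when k(n) ≤ (log n)/(H + ε).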
The Annals of Probability © 1994 Institute of Mathematical Statistics