
Approximations for the Entropy for Functions of Markov Chains

John J. Birch
The Annals of Mathematical Statistics
Vol. 33, No. 3 (Sep., 1962), pp. 930-938
Stable URL: http://www.jstor.org/stable/2237870
Page Count: 9


Abstract

If $\{Y_n\}$ is a stationary ergodic Markov process taking on values in a finite set $\{1, 2, \cdots, A\}$, then its entropy can be calculated directly. If $\phi$ is a function defined on $1, 2, \cdots, A$, with values $1, 2, \cdots, D$, no comparable formula is available for the entropy of the process $\{X_n = \phi(Y_n)\}$. However, the entropy of this functional process can be approximated by the monotonic functions $\bar{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1)$ and $\underline{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1, Y_0)$, the conditional entropies. Furthermore, if the underlying Markov process $\{Y_n\}$ has strictly positive transition probabilities, these two approximations converge exponentially to the entropy $H$, where the convergence is given by $0 \leq \bar{G}_n - H \leq B\rho^{n-1}$ and $0 \leq H - \underline{G}_n \leq B\rho^{n-1}$, with $0 < \rho < 1$, $\rho$ being independent of the function $\phi$.
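The bounding scheme described in the abstract can be illustrated numerically. The sketch below is not from the paper: the transition matrix, the function $\phi$, and all variable names are illustrative assumptions. It computes the upper bound $\bar{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1)$ and the lower bound $\underline{G}_n = h(X_n \mid X_{n-1}, \cdots, X_1, Y_0)$ by brute-force enumeration of paths of a small chain with strictly positive transition probabilities, so both sequences can be seen squeezing the entropy of the functional process from above and below as $n$ grows.

```python
# Minimal numerical sketch (illustrative, not from the paper) of the two
# conditional-entropy approximations:
#   upper bound  G_bar_n   = h(X_n | X_{n-1}, ..., X_1)
#   lower bound  G_under_n = h(X_n | X_{n-1}, ..., X_1, Y_0)
# where {Y_n} is a stationary Markov chain and X_n = phi(Y_n).
import itertools
import numpy as np

# Strictly positive transition matrix (assumed example) and its stationary law.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])
A = P.shape[0]
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# A non-injective function phi collapsing states {0, 1, 2} onto {0, 1}.
phi = np.array([0, 0, 1])

def path_prob(y):
    """P(Y_0 = y[0], ..., Y_n = y[-1]) under the stationary chain."""
    p = pi[y[0]]
    for a, b in zip(y, y[1:]):
        p *= P[a, b]
    return p

def H(dist):
    """Base-2 entropy of a dict of probabilities (the base only rescales)."""
    ps = np.array([v for v in dist.values() if v > 0])
    return -np.sum(ps * np.log2(ps))

def drop_last(dist):
    """Marginalize out the last coordinate of each key."""
    out = {}
    for k, v in dist.items():
        out[k[:-1]] = out.get(k[:-1], 0.0) + v
    return out

def entropy_bounds(n):
    """Return (G_bar_n, G_under_n) by enumerating all Y-paths of length n+1."""
    joint_x, joint_yx = {}, {}
    for y in itertools.product(range(A), repeat=n + 1):   # y = (Y_0, ..., Y_n)
        p = path_prob(y)
        x = tuple(phi[list(y[1:])])                        # (X_1, ..., X_n)
        joint_x[x] = joint_x.get(x, 0.0) + p
        joint_yx[(y[0],) + x] = joint_yx.get((y[0],) + x, 0.0) + p
    # h(X_n | X_{n-1},...,X_1)      = H(X_1..X_n) - H(X_1..X_{n-1})
    g_bar = H(joint_x) - H(drop_last(joint_x))
    # h(X_n | X_{n-1},...,X_1, Y_0) = H(Y_0,X_1..X_n) - H(Y_0,X_1..X_{n-1})
    g_under = H(joint_yx) - H(drop_last(joint_yx))
    return g_bar, g_under

for n in range(1, 6):
    g_bar, g_under = entropy_bounds(n)
    print(f"n={n}: G_under={g_under:.6f} <= H <= G_bar={g_bar:.6f}")
```

Running the sketch prints a nested sequence of intervals $[\underline{G}_n, \bar{G}_n]$ whose width shrinks with $n$, consistent with the exponential convergence claimed in the abstract for chains with strictly positive transition probabilities.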
