Information and Sufficient Sub-fields

S. G. Ghurye
The Annals of Mathematical Statistics
Vol. 39, No. 6 (Dec., 1968), pp. 2056-2066
Stable URL: http://www.jstor.org/stable/2239302
Page Count: 11

Abstract

This paper is the result of an attempt to clarify and improve some results in the theory of statistical information. The term information is used to denote different things in different contexts. First of all, there is Shannon's information, -∑ p_i log p_i, defined for probability distributions on a finite sample space; this measures, in an esthetically satisfactory way, the entropy or amount of uncertainty in a distribution. Then there is Wiener's information, ∫ f(x) log f(x) dx, defined for an absolutely continuous distribution on the line (or in n-space); it was introduced by Wiener, with an acknowledgment to von Neumann, as a "reasonable measure" of the amount of information, having the property of being "the negative of the quantity usually defined as entropy in similar situations" ([10], p. 76). Finally, there is the "information of one probability distribution P with respect to another Q," commonly known as Kullback-Leibler information. On a finite sample space, this has the form ∑ p_i log(p_i/q_i) = -∑ p_i log q_i - (-∑ p_i log p_i), and thus has some relationship to entropy; note that the second term on the right, which is the entropy of {p_i}, is the minimum of the first term, -∑ p_i log q_i, over all distributions {q_i}.

An interesting idea due to Gelfand, Kolmogorov and Yaglom [3] establishes a connection between the Kullback-Leibler information for a finite probability space and that for an arbitrary space: if P, Q are probability measures on a measurable space (Ω, F) with P ≪ Q, then the supremum of ∑_i log[P(A_i)/Q(A_i)] P(A_i) over all finite measurable partitions {A_i, i = 1, ⋯, n} of Ω is ∫_Ω log(dP/dQ) dP. The only published proof of this result seems to be that due to Kallianpur [5], which uses martingale theory. In Section 1, we shall obtain a rather simple direct proof of this result (Theorem 1.1) and extend it to the case where Q is any σ-finite measure (Theorem 1.2). Wiener's information is then seen to be the supremum of ∑ log[P(A_i)/Q(A_i)] P(A_i) over countable partitions, with Q taken to be Lebesgue measure.

Section 2 will be concerned with Kullback-Leibler information. We shall define conditional information relative to a sub-field, establish a relation between this conditional information and sufficiency of the sub-field (Theorem 2.2), and also show that this conditional information equals the difference between the information contained in the field and that in the sub-field (Theorem 2.3). These are extensions of results obtained by Kullback and Leibler in a somewhat limited context.
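
The partition characterization described above is easy to check numerically. The following minimal sketch is not part of the paper: the two normal distributions, the [-10, 10] binning scheme, and the helper name partition_sum are illustrative assumptions. It evaluates ∑ P(A_i) log[P(A_i)/Q(A_i)] on increasingly fine finite partitions of the line and compares the sums with the closed-form value of ∫ log(dP/dQ) dP for two Gaussians.

    import numpy as np
    from scipy.stats import norm

    # Illustrative choice (not from the paper): P = N(0, 1), Q = N(1, 1.5^2).
    mu_p, s_p = 0.0, 1.0
    mu_q, s_q = 1.0, 1.5

    # Closed-form value of the integral  ∫ log(dP/dQ) dP  for two normals.
    kl_exact = np.log(s_q / s_p) + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2) - 0.5

    def partition_sum(n_bins, lo=-10.0, hi=10.0):
        """Sum P(A_i) log(P(A_i)/Q(A_i)) over a finite partition of the line:
        n_bins equal-width intervals on [lo, hi] plus the two unbounded tails."""
        edges = np.concatenate(([-np.inf], np.linspace(lo, hi, n_bins + 1), [np.inf]))
        p = np.diff(norm.cdf(edges, mu_p, s_p))
        q = np.diff(norm.cdf(edges, mu_q, s_q))
        mask = p > 0  # cells with numerically zero P-probability contribute nothing
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    for n in (2, 8, 32, 128, 512):
        print(f"{n:4d} cells: {partition_sum(n):.6f}   (integral: {kl_exact:.6f})")

As the partition is refined, the sums increase toward the integral value without exceeding it, which is the supremum statement of the Gelfand-Kolmogorov-Yaglom result in concrete form.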
