Capturing the Intangible Concept of Information
Ehsan S. Soofi
Journal of the American Statistical Association
Vol. 89, No. 428 (Dec., 1994), pp. 1243-1254
Stable URL: http://www.jstor.org/stable/2290988
Page Count: 12
The purpose of this article is to discuss the intricacies of quantifying information in some statistical problems. The aim is to develop a general appreciation for the meanings of information functions rather than their mathematical use. This theme integrates fundamental aspects of the contributions of Kullback, Lindley, and Jaynes and bridges chaos to probability modeling. A synopsis of information-theoretic statistics is presented in the form of a pyramid with Shannon at the vertex and a triangular base that signifies three distinct variants of quantifying information: discrimination information (Kullback), mutual information (Lindley), and maximum entropy information (Jaynes). Examples of capturing information by the maximum entropy (ME) method are discussed. It is shown that the ME approach produces a general class of logit models capable of capturing various forms of sample and nonsample information. Diagnostics for quantifying information captured by the ME logit models are given, and decomposition of information into orthogonal components is presented. Basic geometry is used to display information graphically in a simple example. An overview of quantifying information in chaotic systems is presented, and a discrimination information diagnostic for studying chaotic data is introduced. Finally, some brief comments about future research are given.
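The abstract's pyramid rests on Shannon entropy, with Kullback's discrimination information and Jaynes's maximum entropy as two of its base vertices. A minimal sketch in Python (function names are my own, not from the article) illustrating both: Kullback's discrimination information between two discrete distributions, and the fact that, with no constraints beyond summing to one, the maximum-entropy distribution on a finite support is the uniform one.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def discrimination(p, q):
    """Kullback's discrimination information K(p : q) = sum p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The uniform distribution maximizes entropy on a 4-point support.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform))   # log(4) ~ 1.386, the maximum on 4 points
print(entropy(skewed))    # strictly smaller
# Discrimination against the uniform equals the entropy shortfall:
# K(p : uniform) = log(4) - H(p)
print(discrimination(skewed, uniform))
```

The last identity, K(p : uniform) = log n − H(p), is one way to see why maximizing entropy is the same as minimizing discrimination from the uniform reference distribution.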
© 1994 American Statistical Association