An Application of Information Theory to Multivariate Analysis

S. Kullback
The Annals of Mathematical Statistics
Vol. 23, No. 1 (Mar., 1952), pp. 88-102
Stable URL: http://www.jstor.org/stable/2236403
Page Count: 15

Abstract

The problem considered is that of finding the "best" linear function for discriminating between two multivariate normal populations, π1 and π2, without limitation to the case of equal covariance matrices. The "best" linear function is found by maximizing the divergence, J'(1, 2), between the distributions of the linear function. Comparison with the divergence, J(1, 2), between π1 and π2 offers a measure of the discriminating efficiency of the linear function, since J(1, 2) ≥ J'(1, 2). The divergence, a special case of which is Mahalanobis's Generalized Distance, is defined in terms of a measure of information which is essentially that of Shannon and Wiener. Appropriate assumptions about π1 and π2 lead to discriminant analysis (Sections 4, 7), principal components (Section 5), and canonical correlations (Section 6).
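The relationship J(1, 2) ≥ J'(1, 2) described in the abstract can be illustrated numerically. The sketch below is not Kullback's derivation (the paper obtains the maximizing linear function analytically); it simply evaluates the symmetric divergence between two multivariate normal populations with unequal covariance matrices, grid-searches over projection directions for the best univariate divergence J', and checks the inequality. The formula for J between normals is the standard symmetrized Kullback–Leibler divergence; the example populations are hypothetical.

```python
import numpy as np

def divergence(m1, S1, m2, S2):
    """Symmetric divergence J(1, 2) between multivariate normal
    populations N(m1, S1) and N(m2, S2): the sum of the two
    Kullback-Leibler divergences (log-determinant terms cancel)."""
    S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
    d = (m1 - m2).reshape(-1, 1)
    k = len(m1)
    trace_term = 0.5 * np.trace(S1i @ S2 + S2i @ S1 - 2 * np.eye(k))
    mean_term = 0.5 * (d.T @ (S1i + S2i) @ d).item()
    return trace_term + mean_term

def divergence_1d(a, m1, S1, m2, S2):
    """J'(1, 2) between the two (univariate normal) distributions
    of the linear function y = a'x under the two populations."""
    mu1, mu2 = a @ m1, a @ m2
    v1, v2 = a @ S1 @ a, a @ S2 @ a
    return 0.5 * (v1 / v2 + v2 / v1 - 2) + 0.5 * (mu1 - mu2) ** 2 * (1 / v1 + 1 / v2)

# Two hypothetical bivariate populations with unequal covariance matrices.
m1, S1 = np.array([0.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
m2, S2 = np.array([1.0, 2.0]), np.array([[1.0, -0.2], [-0.2, 1.5]])

J_full = divergence(m1, S1, m2, S2)

# Crude search over unit directions for the "best" linear function
# (the paper solves this maximization in closed form instead).
angles = np.linspace(0.0, np.pi, 1000)
J_best = max(divergence_1d(np.array([np.cos(t), np.sin(t)]), m1, S1, m2, S2)
             for t in angles)

# No linear function can exceed the divergence between the full
# populations: J(1, 2) >= J'(1, 2).
assert J_full >= J_best
```

The ratio J'(1, 2)/J(1, 2) from such a computation is exactly the "measure of the discriminating efficiency of the linear function" that the abstract mentions.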
