Information Gain and a General Measure of Correlation
John T. Kent
Vol. 70, No. 1 (Apr., 1983), pp. 163-173
Stable URL: http://www.jstor.org/stable/2335954
Page Count: 11
Given a parametric model of dependence between two random quantities, X and Y, the notion of information gain can be used to define a measure of correlation. This definition of correlation generalizes both the usual product-moment correlation coefficient for the bivariate normal model and the multiple correlation coefficient in the standard linear regression model. The use of this information-based correlation in a descriptive statistical analysis is examined and several examples are given.
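The claimed reduction to the product-moment coefficient can be illustrated with a known closed-form identity: for a bivariate normal pair with correlation ρ, the information gain between the dependence model and the independence model equals the mutual information I(X; Y) = -½ log(1 - ρ²), so the transformation 1 - exp(-2I) recovers ρ². The sketch below (function name hypothetical; the exact normalization used in the paper is not reproduced here) demonstrates this identity numerically:

```python
import math

def info_correlation_bvn(rho):
    """Bivariate normal illustration: the information gain (mutual
    information) is I = -0.5 * log(1 - rho**2); inverting via
    rho_I = sqrt(1 - exp(-2 * I)) recovers |rho|, so the
    information-based correlation agrees with the usual
    product-moment coefficient in this model."""
    info_gain = -0.5 * math.log(1.0 - rho ** 2)   # I(X; Y) for bivariate normal
    return math.sqrt(1.0 - math.exp(-2.0 * info_gain))

print(info_correlation_bvn(0.6))  # → 0.6, up to floating-point error
```

Because the two formulas are exact inverses of each other in the bivariate normal case, the round trip returns |ρ| for any ρ in (-1, 1); the generalization to other parametric models is what the paper develops.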
Biometrika © 1983 Biometrika Trust