On a Measure of Association
S. D. Silvey
The Annals of Mathematical Statistics
Vol. 35, No. 3 (Sep., 1964), pp. 1157-1166
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2238245
Page Count: 10
The problem of obtaining a satisfactory measure of association between two random variables is closely allied to that of obtaining a measure of the amount of information about one contained in the other. For the more closely associated are the random variables, the more information about one ought to be given by an observation on the other, and vice versa. It is not, therefore, surprising to find that there have been several suggestions for basing coefficients of association on the now celebrated measure of information introduced by Shannon in the context of communication theory. (See Bell for certain of these and for references to others.) Now Shannon's measure of information was based on the notion of entropy, which seems to be much more meaningful for finite probability spaces than it is for infinite spaces, and while Gel'fand and Yaglom have suggested a generalisation of Shannon's measure for infinite spaces, there remain difficulties, as indicated by Bell, about deriving from it coefficients of association or dependence between random variables taking infinite sets of values. In the present paper, by adopting a slightly different attitude to information from that of communication theory, we shall obtain a general measure of information which yields a fairly natural coefficient of dependence between two continuous random variables or, more generally, between two non-atomic measures. The next section provides the motivation for the introduction of this measure of information and a general definition is given in Section 3. In Section 4 we discuss some of the properties of this measure regarded as a coefficient of association along the lines suggested by Rényi. Finally, in Section 5, we indicate the relevance of this measure to estimation theory.
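As a point of orientation for the finite case the abstract contrasts with its own setting: on a finite probability space, Shannon's mutual information I(X; Y) is itself a natural measure of dependence, vanishing exactly when X and Y are independent. The sketch below (the function name and example distributions are my own, not taken from the paper, which instead develops a measure suited to non-atomic measures) computes it for a joint distribution given as a probability table.

```python
import math

def mutual_information(joint):
    """Shannon mutual information I(X; Y), in nats, for a finite joint
    distribution given as a 2-D list of probabilities summing to 1."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    info = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                # Each cell contributes p(x,y) * log( p(x,y) / (p(x) p(y)) ).
                info += p * math.log(p / (px[i] * py[j]))
    return info

# Independent variables: every cell equals the product of its
# marginals, so I(X; Y) = 0.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

# Perfectly dependent variables: observing X determines Y,
# so I(X; Y) attains its maximum, log 2.
dependent = [[0.5, 0.0],
             [0.0, 0.5]]
```

For infinite (e.g. continuous) spaces this entropy-based construction runs into the difficulties the abstract mentions, which is what motivates the paper's alternative measure.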
The Annals of Mathematical Statistics © 1964 Institute of Mathematical Statistics