On a Measure of Association

S. D. Silvey
The Annals of Mathematical Statistics
Vol. 35, No. 3 (Sep., 1964), pp. 1157-1166
Stable URL: http://www.jstor.org/stable/2238245
Page Count: 10

Abstract

The problem of obtaining a satisfactory measure of association between two random variables is closely allied to that of obtaining a measure of the amount of information about one contained in the other. For the more closely associated are the random variables the more information about one ought to be given by an observation on the other and vice versa. It is not, therefore, surprising to find that there have been several suggestions for basing coefficients of association on the now celebrated measure of information introduced by Shannon [9] in the context of communication theory. (See Bell [1] for certain of these and for references to others). Now Shannon's measure of information was based on the notion of entropy which seems to be much more meaningful for finite probability spaces than it is for infinite spaces, and while Gel'fand and Yaglom [2] have suggested a generalisation of Shannon's measure for infinite spaces, there remain difficulties, as indicated by Bell [1], about deriving from it coefficients of association or dependence between random variables taking infinite sets of values. In the present paper, by adopting a slightly different attitude to information from that of communication theory, we shall obtain a general measure of information which yields a fairly natural coefficient of dependence between two continuous random variables or, more generally, between two non-atomic measures. The next section provides the motivation for the introduction of this measure of information and a general definition is given in Section 3. In Section 4 we discuss some of the properties of this measure regarded as a coefficient of association along the lines suggested by Rényi [8]. Finally, in Section 5, we indicate the relevance of this measure to estimation theory.
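The abstract contrasts Silvey's proposal with the earlier Shannon-based route. As background, the following sketch shows that earlier route in the simplest setting: discrete mutual information, normalised into a [0, 1) coefficient of dependence via the Linfoot-style map r = sqrt(1 − exp(−2I)). This is an illustration of the Shannon approach the abstract refers to, not Silvey's own measure (which is built for continuous random variables and non-atomic measures); the function names and the choice of normalisation are this sketch's, not the paper's.

```python
import math

def mutual_information(joint):
    """Shannon mutual information I(X;Y) in nats, for a discrete joint
    distribution given as a matrix joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (px[i] * py[j]))
    return mi

def dependence_coefficient(joint):
    """Map I in [0, inf) onto [0, 1): r = sqrt(1 - exp(-2 I)).
    For a bivariate normal this recovers |rho| (Linfoot's coefficient)."""
    return math.sqrt(1.0 - math.exp(-2.0 * mutual_information(joint)))

# Independent variables: the joint factorises, so I = 0 and r = 0.
indep = [[0.25, 0.25],
         [0.25, 0.25]]

# Fully dependent variables: X determines Y, so I = log 2
# and r = sqrt(1 - 1/4) = sqrt(3)/2.
dep = [[0.5, 0.0],
       [0.0, 0.5]]
```

The difficulty the abstract points to is precisely that this discrete construction does not extend cleanly to random variables taking infinite sets of values, which motivates the paper's alternative definition of information.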
