
Mutual Dependence of Random Variables and Maximum Discretized Entropy

Carlo Bertoluzza and Bruno Forte
The Annals of Probability
Vol. 13, No. 2 (May, 1985), pp. 630-637
Stable URL: http://www.jstor.org/stable/2243816
Page Count: 8

Abstract

In connection with a random vector (X, Y) in the unit square Q and a pair (m, n) of positive integers, we consider all discretizations of the continuous probability distribution of (X, Y) that are obtained by an m × n Cartesian decomposition of Q. We prove that Y is a (continuous and invertible) function of X if and only if, for each m, n, the maximum entropy of the finite distributions equals log(m + n - 1).
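
The characterization can be checked numerically in a simple case. The sketch below is not from the paper: it assumes X uniform on [0, 1] and Y = X (the simplest continuous and invertible dependence), and the helpers diag_cell_masses and entropy are ad hoc names. It discretizes the distribution of (X, Y) by an m × n Cartesian decomposition of Q, computes the entropy of the cell masses, and exhibits a decomposition whose entropy attains the bound log(m + n - 1).

import numpy as np

def diag_cell_masses(x_cuts, y_cuts):
    # For X ~ Uniform(0, 1) and Y = X, all mass lies on the diagonal y = x,
    # so the probability of the cell [x_i, x_{i+1}] x [y_j, y_{j+1}] is the
    # length of the overlap of the two intervals.
    masses = []
    for i in range(len(x_cuts) - 1):
        for j in range(len(y_cuts) - 1):
            overlap = min(x_cuts[i + 1], y_cuts[j + 1]) - max(x_cuts[i], y_cuts[j])
            if overlap > 0:
                masses.append(overlap)
    return np.array(masses)

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

m, n = 3, 4
bound = np.log(m + n - 1)

# A uniform m x n grid stays below the bound.
print(entropy(diag_cell_masses(np.linspace(0, 1, m + 1),
                               np.linspace(0, 1, n + 1))), bound)

# A tailored decomposition attains it: split the m + n - 2 interior cut points
# k / (m + n - 1) between the two axes, so the combined partition of [0, 1]
# has m + n - 1 equal pieces and the graph y = x meets exactly m + n - 1
# cells, each of mass 1 / (m + n - 1).
cuts = np.arange(1, m + n - 1) / (m + n - 1)
x_cuts = np.concatenate(([0.0], cuts[:m - 1], [1.0]))
y_cuts = np.concatenate(([0.0], cuts[m - 1:], [1.0]))
print(entropy(diag_cell_masses(x_cuts, y_cuts)), bound)  # equal up to rounding

The same overlap-counting argument suggests why log(m + n - 1) is the ceiling here: the graph of a continuous invertible function meets at most m + n - 1 cells of any m × n Cartesian decomposition, and the entropy of a distribution supported on m + n - 1 cells is at most log(m + n - 1).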
