Majorization, Randomness and Dependence for Multivariate Distributions
The Annals of Probability
Vol. 15, No. 3 (Jul., 1987), pp. 1217-1225
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2244051
Page Count: 9
Topics: Density, Randomness, Mathematical inequalities, Lebesgue measures, Mathematical theorems, Entropy, Random variables, Mathematical minima, Correlation coefficients, Statistical theories
The preorder relation of Hardy, Littlewood and Pólya (1929), Day (1973) and Chong (1974, 1976) is applied to multivariate probability densities. This preorder, which is called majorization here, can be interpreted as an ordering of randomness. When used to compare multivariate densities with the same marginal densities, it can be interpreted as an ordering of dependence or conditional dependence. Results in Hickey (1983, 1984) and Joe (1985) are generalized. A relative entropy function is proposed as a measure of dependence or conditional dependence for multivariate densities with the same marginals.
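The abstract's two central ideas can be illustrated in the discrete case. The paper's own constructions for densities are not reproduced here; the sketch below is an assumption-laden analogue for finite probability vectors: majorization is checked via partial sums of the decreasingly sorted vectors (the classical Hardy–Littlewood–Pólya criterion), and dependence is measured by the relative entropy (Kullback–Leibler divergence) between a joint distribution and the product of its marginals.

```python
import numpy as np

def is_majorized_by(p, q):
    """True if p is majorized by q, i.e. p is 'more random' than q.

    Classical criterion: every partial sum of the decreasingly sorted
    entries of p is at most the corresponding partial sum for q,
    with equal totals.
    """
    p_cum = np.cumsum(np.sort(np.asarray(p, float))[::-1])
    q_cum = np.cumsum(np.sort(np.asarray(q, float))[::-1])
    return bool(np.all(p_cum <= q_cum + 1e-12)) and np.isclose(p_cum[-1], q_cum[-1])

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q), with the convention 0 log 0 = 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Randomness ordering: the uniform distribution is majorized by any
# other distribution on the same support.
uniform = np.array([0.25, 0.25, 0.25, 0.25])
peaked  = np.array([0.4, 0.3, 0.2, 0.1])
print(is_majorized_by(uniform, peaked))   # True: uniform is more random
print(is_majorized_by(peaked, uniform))   # False

# Dependence for fixed marginals: D(joint || product of marginals)
# is zero iff the joint distribution is independent.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
indep = np.outer(joint.sum(axis=1), joint.sum(axis=0))
print(relative_entropy(joint.ravel(), indep.ravel()))  # positive: dependent
```

For the bivariate example, both distributions share the same uniform marginals, so the relative entropy compares dependence structure only, matching the abstract's "same marginals" setting.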
The Annals of Probability © 1987 Institute of Mathematical Statistics