Information Inequalities and Concentration of Measure
The Annals of Probability
Vol. 25, No. 2 (Apr., 1997), pp. 927-939
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2959616
Page Count: 13
Topics: Mathematical inequalities, Entropy, Logical proofs, Transportation
We derive inequalities of the form Δ(P, Q) ≤ H(P|R) + H(Q|R) which hold for every choice of probability measures P, Q, R, where H(P|R) denotes the relative entropy of P with respect to R and Δ(P, Q) stands for a coupling-type "distance" between P and Q. Using the chain rule for relative entropies and then specializing to Q with a given support, we recover some of Talagrand's concentration-of-measure inequalities for product spaces.
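For readers unfamiliar with the quantities in the abstract, the two ingredients can be sketched as follows. This is a standard rendering of the definitions (relative entropy and its chain rule on a product space), not an excerpt from the paper itself; the notation for the conditional marginals is assumed for illustration:

```latex
% Relative entropy (Kullback--Leibler divergence) of P with respect to R,
% defined when P is absolutely continuous with respect to R:
H(P \mid R) \;=\; \int \log \frac{dP}{dR}\, dP .

% Chain rule on a product space X = X_1 \times \cdots \times X_n:
% with P_i(\,\cdot \mid x_1,\dots,x_{i-1}) and R_i(\,\cdot \mid x_1,\dots,x_{i-1})
% denoting the conditional distributions of the i-th coordinate,
H(P \mid R)
  \;=\; \sum_{i=1}^{n} \int
    H\bigl( P_i(\,\cdot \mid x_1,\dots,x_{i-1}) \,\big|\,
            R_i(\,\cdot \mid x_1,\dots,x_{i-1}) \bigr)\, dP(x) .
```

The chain rule is what lets a relative-entropy bound on the whole product space be assembled coordinate by coordinate, which is the mechanism behind the reduction to Talagrand-style concentration inequalities mentioned in the abstract.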
The Annals of Probability © 1997 Institute of Mathematical Statistics