Certain Inequalities in Information Theory and the Cramér-Rao Inequality
The Annals of Mathematical Statistics
Vol. 25, No. 4 (Dec., 1954), pp. 745-751
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2236658
Page Count: 7
Topics: Estimators, Statistical variance, Tensors, Mathematical minima, Statistical estimation, Information theory, Probabilities, Unbiased estimators, Point estimators
The Cramér-Rao inequality provides, under certain regularity conditions, a lower bound for the variance of an estimator. Various generalizations, extensions, and improvements in the bound have been made by Barankin, Bhattacharyya, Chapman and Robbins, Fraser and Guttman, Kiefer, and Wolfowitz, among others. Further consideration of certain inequality properties of a measure of information, discussed by Kullback and Leibler, yields a greater lower bound for the information measure (formula (4.11)) and leads to a result which may be considered a generalization of the Cramér-Rao inequality, the latter following as a special case. The results are used to define discrimination efficiency and estimation efficiency at a point in parameter space.
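For reference, the classical Cramér-Rao bound that the abstract generalizes can be stated as follows (a standard formulation, not reproduced from the paper itself): if $T(X)$ is an estimator of $\tau(\theta)$ with bias-free expectation $E_\theta[T] = \tau(\theta)$, and the density $f(x;\theta)$ satisfies the usual regularity conditions, then

```latex
\operatorname{Var}_\theta(T) \;\ge\; \frac{\left[\tau'(\theta)\right]^2}{I(\theta)},
\qquad
I(\theta) \;=\; E_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right],
```

where $I(\theta)$ is the Fisher information. For an unbiased estimator of $\theta$ itself ($\tau(\theta)=\theta$), the bound reduces to $\operatorname{Var}_\theta(T) \ge 1/I(\theta)$; the paper's result (formula (4.11)) recovers this as a special case of an inequality on the Kullback-Leibler information measure.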
The Annals of Mathematical Statistics © 1954 Institute of Mathematical Statistics