Information and the Likelihood Function in Exponential Families
Robert E. McCulloch
The American Statistician
Vol. 42, No. 1 (Feb., 1988), pp. 73-75
Stable URL: http://www.jstor.org/stable/2685266
Page Count: 3
Topics: Maximum likelihood estimation, Maximum likelihood estimators, Mathematical functions, Covariance, Statistics, Mathematical vectors, Random variables, Information theory, Interior points, Density
A very strong relationship between the directed Kullback divergence and the likelihood function emerges simply and naturally in the context of exponential families. The directed Kullback divergence is an information-theoretic measure of the distance from one distribution to another.
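As a concrete illustration (not taken from the paper itself), the directed Kullback divergence K(f, g) = E_f[log f(X) − log g(X)] is the expected log-likelihood ratio when the data are generated from f. A minimal Python sketch, assuming two normal densities with a common standard deviation, compares the closed form against a Monte Carlo estimate:

```python
import random

def kl_normal(mu1, mu2, sigma):
    # Closed-form directed Kullback divergence K(f, g) for two normal
    # densities f = N(mu1, sigma^2) and g = N(mu2, sigma^2):
    # K(f, g) = (mu1 - mu2)^2 / (2 * sigma^2)
    return (mu1 - mu2) ** 2 / (2 * sigma ** 2)

def kl_monte_carlo(mu1, mu2, sigma, n=200_000, seed=0):
    # Estimate K(f, g) = E_f[log f(X) - log g(X)] by sampling X ~ f
    # and averaging the log-likelihood ratio. The normalizing
    # constants of f and g cancel since both share sigma.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu1, sigma)
        log_f = -((x - mu1) ** 2) / (2 * sigma ** 2)
        log_g = -((x - mu2) ** 2) / (2 * sigma ** 2)
        total += log_f - log_g
    return total / n

exact = kl_normal(1.0, 0.0, 1.0)        # 0.5
approx = kl_monte_carlo(1.0, 0.0, 1.0)  # close to 0.5
```

Note the asymmetry: the expectation is taken under f, so K(f, g) generally differs from K(g, f), which is why the divergence is called "directed."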
The American Statistician © 1988 American Statistical Association