
# On Maximum (Information-Theoretic) Entropy Estimation

M. Dutta
Sankhyā: The Indian Journal of Statistics, Series A (1961-2002)
Vol. 28, No. 4 (Dec., 1966), pp. 319-328
Stable URL: http://www.jstor.org/stable/25049432
Page Count: 10

## Abstract

It is known that, as statistical methods, maximum entropy estimation is equivalent to maximum likelihood estimation with an exponential distribution. However, the relation between them is deeper: $\frac{1}{N}$ times the logarithm of the likelihood function of a sample converges in probability, for sufficiently large sample size $N$, to the negative of the (information-theoretic) entropy of the population. Thus, for samples, entropy (information) can also be introduced through the likelihood function. A comparison of this information with Fisher's information is therefore of interest and is made here. Further, it is seen that, as statistical methods, (i) maximum entropy estimation, (ii) maximum likelihood estimation for an exponential distribution, and (iii) specification and estimation by the Gauss principle of the arithmetic mean are equivalent.
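The two claims in the abstract can be written out compactly. The following is an illustrative sketch, not the paper's own derivation or notation; it assumes an i.i.d. sample $x_1, \dots, x_N$ from a discrete population with probability mass function $p$, and a statistic $T$ with constrained mean $\mu$.

```latex
% (1) Convergence of the normalized log-likelihood to minus the entropy.
% For an i.i.d. sample x_1, ..., x_N with pmf p (illustrative notation),
% the weak law of large numbers applied to log p(x_i) gives
\frac{1}{N}\log L_N
  = \frac{1}{N}\sum_{i=1}^{N}\log p(x_i)
  \;\xrightarrow{\;P\;}\;
  \mathbb{E}\!\left[\log p(X)\right]
  = -H(p),
\qquad
H(p) = -\sum_{x} p(x)\log p(x).

% (2) Maximum entropy yields an exponential distribution.
% Maximizing H(p) subject to E[T(X)] = \mu (Lagrange multiplier \lambda,
% normalizer Z) gives the exponential form
p_{\lambda}(x) = \frac{e^{\lambda T(x)}}{Z(\lambda)},
\qquad
Z(\lambda) = \sum_{x} e^{\lambda T(x)}.
```

For the family $p_{\lambda}$, the maximum likelihood equation reduces to moment matching, $\frac{1}{N}\sum_{i} T(x_i) = \mathbb{E}_{p_{\lambda}}[T(X)]$, which is the standard sense in which methods (i) and (ii) coincide.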
