
On Maximum (Information-Theoretic) Entropy Estimation

M. Dutta
Sankhyā: The Indian Journal of Statistics, Series A (1961-2002)
Vol. 28, No. 4 (Dec., 1966), pp. 319-328
Published by: Springer on behalf of the Indian Statistical Institute
Stable URL: http://www.jstor.org/stable/25049432
Page Count: 10

Abstract

It is known that, as statistical methods, maximum entropy estimation is equivalent to maximum likelihood estimation with an exponential distribution. But the relation between them is much deeper. ${\textstyle\frac{1}{N}}$ times the logarithm of the likelihood function of a sample converges in probability to the negative of the (information-theoretic) entropy of the population as the sample size $N$ becomes sufficiently large. Thus, for samples, entropy (information) can also be introduced through the likelihood function. A discussion of this information alongside Fisher's information then appears interesting and is given here. Further, it is seen that, as statistical methods, (i) maximum entropy estimation, (ii) maximum likelihood estimation for an exponential distribution, and (iii) specification and estimation by the Gauss principle of the arithmetic mean are all equivalent.
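
The convergence claimed in the abstract follows from the weak law of large numbers: for i.i.d. samples $X_1,\dots,X_N$ drawn from a density $f$, one has ${\textstyle\frac{1}{N}}\log L = {\textstyle\frac{1}{N}}\sum_i \log f(X_i) \to E[\log f(X)] = -H(f)$, where $H(f)$ is the population entropy. Below is a minimal numerical sketch of this fact (not from the paper; the Gaussian example, seed, and sample sizes are illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm

# Illustrative check (hypothetical example, not the author's code):
# for i.i.d. X_1, ..., X_N ~ f, the law of large numbers gives
#   (1/N) log L = (1/N) * sum_i log f(X_i)  -->  E[log f(X)] = -H(f).

rng = np.random.default_rng(0)
mu, sigma = 0.0, 2.0

# Differential entropy of N(mu, sigma^2) in closed form:
# H = (1/2) log(2 * pi * e * sigma^2)
true_entropy = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

for N in (100, 10_000, 1_000_000):
    x = rng.normal(mu, sigma, size=N)
    # Average log-likelihood (1/N) log L under the true density
    avg_loglik = norm.logpdf(x, loc=mu, scale=sigma).mean()
    print(f"N={N:>9}: (1/N) log L = {avg_loglik:+.4f}, -H = {-true_entropy:+.4f}")
```

As $N$ grows, the printed average log-likelihood approaches $-H$, which is the sense in which the likelihood function introduces the entropy for samples.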
