Entropies and Rates of Convergence for Maximum Likelihood and Bayes Estimation for Mixtures of Normal Densities

Subhashis Ghosal and Aad W. van der Vaart
The Annals of Statistics
Vol. 29, No. 5 (Oct., 2001), pp. 1233-1263
Stable URL: http://www.jstor.org/stable/2699987
Page Count: 31

Abstract

We study the rates of convergence of the maximum likelihood estimator (MLE) and posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of normal distributions with the scale parameter lying between two positive numbers. The true density is also assumed to lie in this class, with the true mixing distribution either compactly supported or having sub-Gaussian tails. We obtain bounds for Hellinger bracketing entropies for this class, and from these bounds we deduce the convergence rates of (sieve) MLEs in Hellinger distance. The rate turns out to be $(\log n)^\kappa/\sqrt{n}$, where $\kappa \geq 1$ is a constant that depends on the type of mixtures and the choice of the sieve. Next, we consider a Dirichlet mixture of normals as a prior on the unknown density. We estimate the prior probability of a certain Kullback-Leibler type neighborhood and then invoke a general theorem that computes the posterior convergence rate in terms of the growth rate of the Hellinger entropy and the concentration rate of the prior. The posterior distribution is also seen to converge at the rate $(\log n)^\kappa/\sqrt{n}$, where $\kappa$ now depends on the tail behavior of the base measure of the Dirichlet process.
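For reference, the objects the abstract refers to can be written out as follows; the notation here is illustrative and may differ from the paper's own. A location mixture of normals with mixing distribution $F$ and scale $\sigma$ is
$p_{F,\sigma}(x) = \int \phi_\sigma(x - z)\, dF(z)$,
where $\phi_\sigma$ is the normal density with mean zero and standard deviation $\sigma$, and $\sigma$ is restricted to an interval $[\underline{\sigma}, \overline{\sigma}]$ with $0 < \underline{\sigma} \leq \overline{\sigma} < \infty$; a location-scale mixture instead integrates over both parameters, $p_F(x) = \int \phi_\sigma(x - z)\, dF(z, \sigma)$. The Hellinger distance between densities $p$ and $q$ is, under one common normalization (some authors include a factor $1/2$),
$h(p, q) = \big( \int (\sqrt{p(x)} - \sqrt{q(x)})^2\, dx \big)^{1/2}$,
and the results summarized above say that both the (sieve) MLE and the posterior distribution concentrate around the true density within a Hellinger distance of the order $(\log n)^\kappa/\sqrt{n}$, i.e., within a logarithmic factor of the parametric rate $1/\sqrt{n}$.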
