# Entropies and Rates of Convergence for Maximum Likelihood and Bayes Estimation for Mixtures of Normal Densities

Subhashis Ghosal and Aad W. van der Vaart
The Annals of Statistics
Vol. 29, No. 5 (Oct., 2001), pp. 1233-1263
Stable URL: http://www.jstor.org/stable/2699987
Page Count: 31


## Abstract

We study the rates of convergence of the maximum likelihood estimator (MLE) and the posterior distribution in density estimation problems, where the densities are location or location-scale mixtures of normal distributions with the scale parameter lying between two positive numbers. The true density is also assumed to lie in this class, with the true mixing distribution either compactly supported or having sub-Gaussian tails. We obtain bounds for Hellinger bracketing entropies for this class, and from these bounds we deduce the convergence rates of (sieve) MLEs in Hellinger distance. The rate turns out to be $(\log n)^{\kappa}/\sqrt{n}$, where $\kappa \ge 1$ is a constant that depends on the type of mixture and the choice of the sieve. Next, we consider a Dirichlet mixture of normals as a prior on the unknown density. We estimate the prior probability of a certain Kullback-Leibler type neighborhood and then invoke a general theorem that computes the posterior convergence rate in terms of the growth rate of the Hellinger entropy and the concentration rate of the prior. The posterior distribution is also seen to converge at the rate $(\log n)^{\kappa}/\sqrt{n}$, where $\kappa$ now depends on the tail behavior of the base measure of the Dirichlet process.
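The Dirichlet mixture of normals prior studied in the paper can be made concrete by simulation. The sketch below is not from the paper; it is a minimal illustration, assuming a standard-normal base measure for the Dirichlet process and a fixed scale `sigma`, of drawing data from a (truncated) Dirichlet process location mixture of normals via the stick-breaking representation. All function and parameter names are hypothetical.

```python
import numpy as np

def sample_dp_mixture(n, alpha=1.0, base_mean=0.0, base_sd=1.0,
                      sigma=0.5, truncation=100, rng=None):
    """Draw n observations from a truncated Dirichlet process
    location mixture of normals via stick-breaking.

    The mixing distribution is F ~ DP(alpha, N(base_mean, base_sd^2));
    the data are x_i ~ N(theta_i, sigma^2) with theta_i ~ F.
    """
    rng = np.random.default_rng(rng)
    # Stick-breaking weights: w_k = v_k * prod_{j<k} (1 - v_j)
    v = rng.beta(1.0, alpha, size=truncation)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    w /= w.sum()  # renormalize the weights lost to truncation
    # Atom locations drawn from the base measure of the DP
    atoms = rng.normal(base_mean, base_sd, size=truncation)
    # Pick a mixture component per observation, then add normal noise
    comps = rng.choice(truncation, size=n, p=w)
    return atoms[comps] + sigma * rng.normal(size=n)

x = sample_dp_mixture(1000, alpha=2.0, rng=0)
```

A larger `alpha` spreads the stick-breaking weights over more atoms, so the sampled density has more effective mixture components; the truncation level only needs to be large enough that the discarded weight is negligible.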
