
Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy

Yannis G. Yatracos
The Annals of Statistics
Vol. 13, No. 2 (Jun., 1985), pp. 768-774
Stable URL: http://www.jstor.org/stable/2241209
Page Count: 7

Abstract

Let (X, A) be a space with a σ-field, and let M = {P_s; s ∈ Θ} be a family of probability measures on A, with Θ arbitrary. Let X_1, ⋯, X_n be i.i.d. observations on P_θ. Define μ_n(A) = (1/n) ∑_{i=1}^n I_A(X_i), the empirical measure indexed by A ∈ A. Assume Θ is totally bounded when metrized by the L_1 distance between measures. Robust minimum distance estimators θ̂_n are constructed for θ, and the resulting rate of convergence is shown to depend naturally on an entropy function for Θ.
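The paper itself contains no code. As an illustration only, the following minimal sketch shows a minimum distance estimator in the spirit of the abstract: the empirical measure μ_n(A) is compared with the model probabilities P_s(A) over a class of sets, and θ̂_n minimizes the resulting sup-distance over a finite net of Θ. The normal location family, the grid over Θ, and the use of half-lines (-∞, t] as the index class are assumptions made for this example; the paper's actual construction of the index class and its entropy-based rate analysis are more delicate.

    # Illustrative sketch only -- the family, the net over Theta, and the
    # index class of half-lines are assumptions, not the paper's construction.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    theta_true = 1.3
    n = 500
    # X_1, ..., X_n i.i.d. observations on P_theta (here: N(theta, 1)).
    x = rng.normal(loc=theta_true, scale=1.0, size=n)

    # Finite net over Theta (the abstract assumes Theta is totally bounded).
    theta_net = np.linspace(-3.0, 3.0, 121)

    # Index class A: half-lines (-inf, t] on a grid of thresholds t.
    thresholds = np.linspace(-5.0, 5.0, 201)

    # Empirical measure mu_n(A) = (1/n) * sum_i I_A(X_i) for each half-line.
    mu_n = (x[:, None] <= thresholds[None, :]).mean(axis=0)

    # Model probabilities P_s(A) for each s in the net and each half-line.
    p_s = norm.cdf(thresholds[None, :], loc=theta_net[:, None], scale=1.0)

    # Minimum distance estimator: minimize sup_A |P_s(A) - mu_n(A)| over s.
    distances = np.abs(p_s - mu_n[None, :]).max(axis=1)
    theta_hat = theta_net[np.argmin(distances)]
    print(f"theta_hat = {theta_hat:.3f}  (true theta = {theta_true})")

With this setup the sup-distance over half-lines is just the Kolmogorov–Smirnov distance between the empirical and model distribution functions, which makes the estimator easy to compute on a grid; the rate at which a finer net over Θ pays off is what the paper ties to Kolmogorov's entropy.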
