A Class of Local Likelihood Methods and Near-Parametric Asymptotics
Shinto Eguchi and John Copas
Journal of the Royal Statistical Society. Series B (Statistical Methodology)
Vol. 60, No. 4 (1998), pp. 709-724
Stable URL: http://www.jstor.org/stable/2985958
Page Count: 16
The local maximum likelihood estimate $\hat\theta_t$ of a parameter in a statistical model f(x,θ) is defined by maximizing a weighted version of the likelihood function which gives more weight to observations in the neighbourhood of t. The paper studies the sense in which $f(t,\hat\theta_t)$ is closer to the true distribution g(t) than the usual estimate $f(t,\hat\theta)$ is. Asymptotic results are presented for the case in which the model misspecification becomes vanishingly small as the sample size tends to ∞. In this setting, the relative entropy risk of the local method is better than that of maximum likelihood. The form of optimum weights for the local likelihood is obtained and illustrated for the normal distribution.
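To illustrate the construction described in the abstract, here is a minimal sketch of a local maximum likelihood estimate for the mean of a normal model, using a simple Gaussian kernel to weight observations near t. The kernel choice and bandwidth `h` are assumptions for illustration only; the paper derives the form of the optimum weights, which this sketch does not implement.

```python
import numpy as np

def local_mle_normal_mean(x, t, h):
    """Local MLE of theta in the model f(x; theta) = N(theta, 1).

    Maximises the weighted log-likelihood
        sum_i w_i(t) * log f(x_i; theta),
    with illustrative Gaussian kernel weights
        w_i(t) = exp(-0.5 * ((x_i - t) / h)**2)
    (an assumption, not the paper's optimum weights).  For the normal
    mean this maximiser has the closed form of a weighted average.
    """
    w = np.exp(-0.5 * ((x - t) / h) ** 2)  # weights favouring x_i near t
    return np.sum(w * x) / np.sum(w)       # weighted-likelihood maximiser

# Usage: as h grows, all weights tend to 1 and the local estimate
# recovers the ordinary (global) maximum likelihood estimate.
x = np.array([0.0, 1.0, 2.0])
theta_local = local_mle_normal_mean(x, t=0.0, h=0.5)
theta_global = local_mle_normal_mean(x, t=0.0, h=1e6)  # ~ np.mean(x)
```

The closed form arises because the weighted normal log-likelihood is quadratic in θ; for models without such a form the weighted likelihood would be maximised numerically.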
Journal of the Royal Statistical Society. Series B (Statistical Methodology) © 1998 Royal Statistical Society