
# Uniformly Minimum Variance Estimation in Location Parameter Families

Lennart Bondesson
The Annals of Statistics
Vol. 3, No. 3 (May, 1975), pp. 637-660
Stable URL: http://www.jstor.org/stable/2958433
Page Count: 24

## Abstract

Let $x_1, \cdots, x_n$ be a sample of size $n$ of a random variable with distribution function $F(x - \theta)$, where $F$ is known but $\theta$ is unknown. In this paper we take a Fourier approach to the problem of the existence of a statistic $g(x_1, \cdots, x_n)$ which is a uniformly minimum variance (UMV) estimator of its own mean value. We mention only some of the results. If $n = 1$, we find a necessary and sufficient condition for an estimator $g(x_1)$ to be, in a restricted sense, UMV. This condition is given in terms of the zeroes of the characteristic function of $F$ and the support of the Fourier transform of $g$. If $n \geq 2$, it is shown that a statistic of the form $g(\bar{x})$, where $\bar{x}$ is the sample mean, cannot be UMV unless $g$ is periodic or $F$ is a normal distribution function. We prove the non-existence of a UMV estimator of $\theta$, provided that the tail of $F$ tends to zero rapidly enough. Finally, it is proved that no polynomial $P(x_1, \cdots, x_n)$ can be a UMV estimator unless $F$ is a normal distribution function.
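The abstract's condition for the $n = 1$ case hinges on where the characteristic function of $F$ vanishes. As a small numerical illustration (not from the paper itself) of why this distinction is non-trivial, the sketch below contrasts two standard location families: the uniform distribution on $(-1, 1)$, whose characteristic function $\sin(t)/t$ has zeroes at every nonzero multiple of $\pi$, and the standard normal, whose characteristic function $e^{-t^2/2}$ never vanishes.

```python
import numpy as np

# Characteristic functions of two location families F(x - theta):
# Uniform(-1, 1): phi(t) = sin(t)/t, vanishing at t = k*pi for k != 0.
# Normal(0, 1):   phi(t) = exp(-t^2/2), strictly positive everywhere.

def phi_uniform(t):
    """Characteristic function of Uniform(-1, 1), with phi(0) = 1."""
    t = np.asarray(t, dtype=float)
    safe = np.where(t == 0, 1.0, t)          # avoid division by zero
    return np.where(t == 0, 1.0, np.sin(safe) / safe)

def phi_normal(t):
    """Characteristic function of the standard normal distribution."""
    return np.exp(-np.asarray(t, dtype=float) ** 2 / 2)

# The uniform ch.f. vanishes at t = pi, 2*pi, 3*pi, ...
print(phi_uniform(np.pi * np.arange(1, 4)))   # all ~0 (up to rounding)
# The normal ch.f. is bounded away from zero on any compact set.
print(phi_normal(np.linspace(-10, 10, 5)))    # all strictly positive
```

In the paper's framework, the presence or absence of such zeroes determines which functions $g$ can satisfy the restricted UMV property, since the condition couples these zeroes to the support of the Fourier transform of $g$.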
