Linear Spaces and Unbiased Estimation--Application to the Mixed Linear Model

Justus Seely
The Annals of Mathematical Statistics
Vol. 41, No. 5 (Oct., 1970), pp. 1735-1748
Stable URL: http://www.jstor.org/stable/2239880
Page Count: 14

Abstract

Exemplification of the theory developed in [9] using a linear space of random variables other than linear combinations of the components of a random vector, and unbiased estimation for the parameters of a mixed linear model using quadratic estimators, are the primary reasons for the considerations in this paper. For a random vector Y with expectation Xβ and covariance matrix ∑ᵢνᵢVᵢ (ν₁, ⋯, νₘ, and β denote the parameters), interest centers upon quadratic estimability for parametric functions of the form ∑ᵢ≤ⱼγᵢⱼβᵢβⱼ + ∑ₖγₖνₖ and procedures for obtaining quadratic estimators for such parametric functions. Special emphasis is given to parametric functions of the form ∑ₖγₖνₖ; unbiased estimation of variance components is the main reason for quadratic estimability considerations regarding parametric functions of this form. Concerning variance component models, Airy, in 1861 (Scheffe [6]), appears to have been the first to introduce a model with more than one source of variation. Such a model is also implied (Scheffe [6]) by Chauvenet in 1863. Fisher [1], [2] reintroduced variance component models and discussed, apparently for the first time, unbiased estimation in such models. Since Fisher's introduction and discussion of unbiased estimation in models with more than one source of variation, a considerable literature has been published on the subject. One of these papers is a description by Henderson [5] which popularized three methods (now known as Henderson's Methods I, II, and III) for obtaining unbiased estimates of variance components. We mention these methods since they seem to be commonly used in the estimation of variance components; for a review as well as a matrix formulation of the methods see Searle [7]. Among the several pieces of work which have dealt with Henderson's methods, only that of Harville [4] seems to have been concerned with consistency of the equations leading to the estimators and with the existence of unbiased (quadratic) estimators under various conditions. Harville, however, treats only a completely random two-way classification model with interaction. One other result which deals with the existence of unbiased quadratic estimators in a completely random model is given by Graybill and Hultquist [3]. In Section 2 the form we assume for a mixed linear model is introduced and the pertinent quantities needed for the application of the results in [9] are obtained. Definitions, terminology, and notation are consistent with the usage in [9]. Section 3 considers parametric functions of the form ∑ᵢ≤ⱼγᵢⱼβᵢβⱼ + ∑ₖγₖνₖ, and Section 4 concerns parametric functions of the form ∑ₖγₖνₖ. One particular method for obtaining unbiased estimators of linear combinations of variance components, given in Section 4, is computationally simpler than the Henderson Method III procedure, which is the most widely used general approach applicable to any mixed linear model. The method described in Section 4 has the added advantage of giving necessary and sufficient conditions for the existence of unbiased quadratic estimators, which is not always the case with Henderson's Method III. In the last section an example is given which illustrates the Henderson Method III procedure from the viewpoint of this paper.
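
As background only, here is a minimal sketch of the unbiasedness condition that underlies the quadratic estimability question described above; it rests on the standard expectation identity for quadratic forms and is not drawn from the paper itself, whose development proceeds through the framework of [9]. For a symmetric matrix A and the quadratic estimator Y′AY, with E[Y] = Xβ and Cov(Y) = ∑ₖνₖVₖ,

    E[Y′AY] = β′X′AXβ + ∑ₖ νₖ tr(AVₖ),

so, provided the νₖ may vary freely, Y′AY is unbiased for ∑ₖγₖνₖ for every β and ν exactly when X′AX = 0 and tr(AVₖ) = γₖ for each k.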
