
Bounds on the Efficiency of Linear Predictions Using an Incorrect Covariance Function

Michael L. Stein
The Annals of Statistics
Vol. 18, No. 3 (Sep., 1990), pp. 1116-1138
Stable URL: http://www.jstor.org/stable/2242045
Page Count: 23

Abstract

Suppose z(·) is a random process defined on a bounded set $R \subset \mathbb{R}^1$ with finite second moments. Consider the behavior of linear predictions based on $z(t_1), \ldots, z(t_n)$, where $t_1, t_2, \ldots$ is a dense sequence of points in R. Stein showed that if the second-order structure used to generate the predictions is incorrect but compatible with the correct second-order structure, the resulting predictions are uniformly asymptotically optimal as $n \to \infty$. In the present paper, a general method is described for obtaining rates of convergence when the covariance function is misspecified but compatible with the correct covariance function. When z(·) is Gaussian, these bounds are related to the entropy distance (the symmetrized Kullback divergence) between the measures for the random field under the actual and presumed covariance functions. Explicit bounds are given when $R = [0, 1]$ and z(·) is stationary with spectral density of the form $f(\lambda) = (a^2 + \lambda^2)^{-p}$, where p is a known positive integer and a is the parameter that is misspecified. More precise results are given in the case $p = 1$. An application of this result implies that equally spaced observations are asymptotically optimal, in the sense used by Sacks and Ylvisaker, in terms of maximizing the Kullback divergence between the actual and presumed models when z(·) is Gaussian.
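
For the case $p = 1$, the spectral density $f(\lambda) = (a^2 + \lambda^2)^{-1}$ corresponds, by Fourier inversion, to the Ornstein-Uhlenbeck covariance $K(h) = (\pi/a) e^{-a|h|}$. The following Python sketch is not from the paper; the parameter choices (true parameter $a = 1$, presumed parameter $a = 2$, and the prediction point $t_0 = \sqrt{2}/2$) are illustrative assumptions. It computes the true mean squared error of the linear predictor built from the misspecified covariance, compares it with the optimal mean squared error, and illustrates the uniform asymptotic optimality the abstract refers to: the efficiency ratio tends to 1 as n grows.

import numpy as np

def ou_cov(s, t, a):
    # Covariance implied by f(lambda) = (a^2 + lambda^2)^(-1):
    # K(h) = (pi / a) * exp(-a * |h|)  (Ornstein-Uhlenbeck form).
    return (np.pi / a) * np.exp(-a * np.abs(s[:, None] - t[None, :]))

def true_mse(t_obs, t0, a_true, a_used):
    # True MSE at t0 of the linear predictor whose weights are optimal
    # under the presumed parameter a_used, evaluated under a_true.
    k_used = ou_cov(t_obs, np.array([t0]), a_used).ravel()
    w = np.linalg.solve(ou_cov(t_obs, t_obs, a_used), k_used)
    K_true = ou_cov(t_obs, t_obs, a_true)
    k_true = ou_cov(t_obs, np.array([t0]), a_true).ravel()
    # Var z(t0) - 2 w'k + w'Kw under the correct covariance.
    return np.pi / a_true - 2.0 * (w @ k_true) + w @ K_true @ w

a_true, a_used = 1.0, 2.0             # a_used is the misspecified parameter
t0 = np.sqrt(2.0) / 2.0               # prediction point, off every grid below
for n in (5, 10, 20, 40, 80):
    t_obs = np.linspace(0.0, 1.0, n)  # equally spaced observations on [0, 1]
    ratio = true_mse(t_obs, t0, a_true, a_used) / true_mse(t_obs, t0, a_true, a_true)
    print(f"n = {n:3d}   MSE(presumed) / MSE(optimal) = {ratio:.6f}")

Because the two Gaussian measures in this family are compatible on [0, 1] (the ratio of the spectral densities tends to 1 at high frequencies), the printed ratio converges to 1 as n increases; the paper's contribution is bounding how fast that convergence occurs.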
