
Investigating Smooth Multiple Regression by the Method of Average Derivatives

Wolfgang Härdle and Thomas M. Stoker
Journal of the American Statistical Association
Vol. 84, No. 408 (Dec., 1989), pp. 986-995
DOI: 10.2307/2290074
Stable URL: http://www.jstor.org/stable/2290074
Page Count: 10

Abstract

Let (x₁,...,x_k, y) be a random vector where y denotes a response on the vector x of predictor variables. In this article we propose a technique, termed average derivative estimation (ADE), for studying the mean response m(x) = E(y | x) through the estimation of the k-vector of average derivatives δ = E(m'). The ADE procedure involves two stages: first estimate δ using an estimator δ̂, and then approximate m(x) by m̂(x) = ĝ(x^Tδ̂), where ĝ is an estimator of the univariate regression of y on x^Tδ̂. We argue that the ADE procedure exhibits several attractive characteristics: data summarization through interpretable coefficients, graphical depiction of the possible nonlinearity between y and x^Tδ̂, and theoretical properties consistent with dimension reduction. We motivate the ADE procedure using examples of models that take the form m(x) = g̃(x^Tβ). In this framework, δ is shown to be proportional to β and m̂(x) recovers m(x) exactly. The focus of the procedure is on the estimator δ̂, which is based on a simple average of kernel smoothers and is shown to be a √N-consistent and asymptotically normal estimator of δ. The estimator ĝ(·) is a standard kernel regression estimator and is shown to have the same properties as the kernel regression of y on x^Tδ. In sum, the estimator δ̂ converges to δ at the rate typically available in parametric estimation problems, and m̂(x) converges to E(y | x^Tδ) at the optimal one-dimensional nonparametric rate. We also give a consistent estimator of the asymptotic covariance matrix of δ̂, to facilitate inference. We discuss the conditions underlying these results, including how √N-consistent estimation of δ requires undersmoothing relative to pointwise multivariate estimation. We also indicate the relationship between the ADE method and projection pursuit regression. For illustration, we apply the ADE method to data on automobile collisions.
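To make the two-stage procedure concrete, the following is a minimal Python sketch of the idea described in the abstract: average the derivatives of a multivariate kernel smoother to obtain δ̂, then kernel-regress y on the scalar index x^Tδ̂. This is an illustrative simplification, not the authors' exact estimator (the paper's δ̂ and its bandwidth conditions differ; here the derivative is taken by finite differences of a Nadaraya–Watson smoother, and all function names and tuning constants are assumptions for the example).

```python
import numpy as np

def nw_regression(x_query, X, y, h):
    """Nadaraya-Watson kernel estimate of E[y | x] at x_query
    (Gaussian product kernel with common bandwidth h)."""
    d = (X - x_query) / h
    w = np.exp(-0.5 * np.sum(d**2, axis=1))
    return np.sum(w * y) / np.sum(w)

def ade_fit(X, y, h=0.5, eps=1e-3):
    """Illustrative two-stage ADE sketch.

    Stage 1: estimate delta = E[m'(x)] by averaging central
             finite-difference derivatives of the kernel smoother
             over the sample points.
    Stage 2: build m_hat(x) = g_hat(x @ delta_hat), where g_hat is a
             univariate kernel regression of y on the index.
    """
    n, k = X.shape
    grads = np.zeros((n, k))
    for i in range(n):
        for j in range(k):
            xp = X[i].copy(); xp[j] += eps
            xm = X[i].copy(); xm[j] -= eps
            grads[i, j] = (nw_regression(xp, X, y, h) -
                           nw_regression(xm, X, y, h)) / (2 * eps)
    delta_hat = grads.mean(axis=0)   # estimate of E[m'(x)]
    z = X @ delta_hat                # one-dimensional index x^T delta_hat

    def m_hat(x_new):
        """Univariate kernel regression of y on the index."""
        dz = (z - x_new @ delta_hat) / h
        w = np.exp(-0.5 * dz**2)
        return np.sum(w * y) / np.sum(w)

    return delta_hat, m_hat
```

Under a single-index model m(x) = g̃(x^Tβ), the recovered δ̂ should be roughly proportional to β (kernel smoothing attenuates each component by a similar factor, so the ratios of coefficients are the informative quantity), illustrating the proportionality claim in the abstract.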
