
Estimating a Regression Function

Sara van de Geer
The Annals of Statistics
Vol. 18, No. 2 (Jun., 1990), pp. 907-924
Stable URL: http://www.jstor.org/stable/2242140
Page Count: 18

Abstract

In this paper, an entropy approach is proposed to establish rates of convergence for estimators of a regression function. General regression problems are considered, with linear regression, splines and isotonic regression as special cases. The estimation methods studied are least squares, least absolute deviations and penalized least squares. Common features of these methods and various regression problems are highlighted.
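The abstract lists least squares over a constrained class of regression functions as one of the estimation methods, with isotonic regression as a special case. As a loose illustration only (not the paper's entropy argument), the sketch below computes the least-squares estimate under a monotonicity constraint via the pool-adjacent-violators algorithm; the data-generating function and noise level are illustrative assumptions.

```python
import numpy as np

def isotonic_ls(y):
    """Least-squares isotonic regression of responses y, assumed already ordered
    by increasing design points, computed with pool-adjacent-violators."""
    blocks = []  # each block is [sum of responses, number of points]
    for v in y:
        blocks.append([float(v), 1])
        # Merge adjacent blocks while their means violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    # Expand each block mean back to one fitted value per observation.
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return np.array(fit)

# Hypothetical usage: a monotone regression function sqrt(x) observed with noise.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 50))
y = np.sqrt(x) + 0.1 * rng.standard_normal(50)
g_hat = isotonic_ls(y)  # least-squares estimate restricted to non-decreasing functions
```

The isotonic fit is piecewise constant; its values are the block means produced by the merging step, which is the projection of the data onto the cone of non-decreasing vectors.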
