Estimating a Regression Function
Sara van de Geer
The Annals of Statistics
Vol. 18, No. 2 (Jun., 1990), pp. 907-924
Published by: Institute of Mathematical Statistics
Stable URL: http://www.jstor.org/stable/2242140
Page Count: 18
Topics: Entropy, Least squares, Estimators, Linear regression, Mathematical functions, Statistical estimation, Estimation methods, Empiricism, Linear transformations
In this paper, an entropy approach is proposed to establish rates of convergence for estimators of a regression function. General regression problems are considered, with linear regression, splines and isotonic regression as special cases. The estimation methods studied are least squares, least absolute deviations and penalized least squares. Common features of these methods and various regression problems are highlighted.
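As a hypothetical illustration of two of the estimation methods the abstract mentions, the sketch below fits a simple linear model by least squares and by least absolute deviations. The data, the model y = 2x + noise, and the use of iteratively reweighted least squares to approximate the LAD fit are all assumptions for this example and are not taken from the paper.

```python
import numpy as np

# Hypothetical example (not from the paper): compare least squares and
# least absolute deviations on a linear model y = 2x + noise with a few
# gross outliers added.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])   # design matrix with intercept
y = 2.0 * x + rng.normal(0, 0.1, n)
y[:5] += 5.0                           # a few gross outliers

# Least squares: minimize the sum of squared residuals (closed form).
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Least absolute deviations: minimize the sum of |residuals|, here
# approximated by iteratively reweighted least squares.
beta_lad = beta_ls.copy()
for _ in range(50):
    w = np.sqrt(1.0 / np.maximum(np.abs(y - X @ beta_lad), 1e-8))
    beta_lad, *_ = np.linalg.lstsq(w[:, None] * X, w * y, rcond=None)
```

Because the LAD criterion penalizes outliers linearly rather than quadratically, `beta_lad` is typically less distorted by the contaminated observations than `beta_ls`.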
The Annals of Statistics © 1990 Institute of Mathematical Statistics