Rates of Convergence of Estimates, Kolmogorov's Entropy and the Dimensionality Reduction Principle in Regression

Theodoros Nicoleris and Yannis G. Yatracos
The Annals of Statistics
Vol. 25, No. 6 (Dec., 1997), pp. 2493-2511
Stable URL: http://www.jstor.org/stable/2959042
Page Count: 19

Abstract

$L_1$-optimal minimum distance estimators are provided for a projection pursuit regression-type function with smooth functional components that are either additive or multiplicative, with or without interactions. The obtained rates of convergence of the estimate to the true parameter depend on Kolmogorov's entropy of the assumed model and confirm Stone's heuristic dimensionality reduction principle. Rates of convergence are also obtained for the error in estimating the derivatives of a regression-type function.
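
For context, Stone's heuristic dimensionality reduction principle, which the abstract says these rates confirm, can be illustrated by the classical nonparametric rates. The following is a sketch using assumed notation not taken from the paper: $n$ denotes the sample size, $d$ the number of covariates, and $p$ the smoothness of the functional components. A fully $p$-smooth regression function of $d$ variables can be estimated at best at the rate $n^{-p/(2p+d)}$, which deteriorates as $d$ grows (the curse of dimensionality), whereas an additive model built from $p$-smooth one-dimensional components can attain the one-dimensional rate:

\[
  \underbrace{n^{-p/(2p+d)}}_{\text{fully $d$-dimensional model}}
  \qquad\text{vs.}\qquad
  \underbrace{n^{-p/(2p+1)}}_{\text{additive model } f(x) = f_1(x_1) + \cdots + f_d(x_d)},
\]

the latter being free of $d$. The abstract's claim is that rates of this lower-dimensional type are achieved by the $L_1$-optimal minimum distance estimators, with the exponent governed by Kolmogorov's entropy of the assumed model.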
