
A Method for Dimension Reduction in Quadratic Classification Problems

Santiago Velilla
Journal of Computational and Graphical Statistics
Vol. 17, No. 3 (Sep., 2008), pp. 572-589
Stable URL: http://www.jstor.org/stable/27594326
Page Count: 18

Abstract

This article presents a dimension-reduction method for quadratic discriminant analysis (QDA). The procedure is inspired by the geometric relation between the subspaces used in sliced inverse regression (SIR) and sliced average variance estimation (SAVE). A new set of directions is constructed to improve on the directions associated with the eigenvectors of the matrices usually considered for dimension reduction in QDA. Illustrative applications to real and simulated data are discussed.
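For readers unfamiliar with SAVE, the eigenvector-based construction that the article builds on can be sketched as follows. This is a generic SAVE computation, not the author's proposed method: after whitening the predictors, class-conditional covariances are compared with the identity, and the leading eigenvectors of the averaged squared differences give candidate directions. All function and variable names here are illustrative.

```python
import numpy as np

def save_directions(X, y, d=2):
    """Sketch of sliced average variance estimation (SAVE) for a
    classification response: directions are the top-d eigenvectors of
    M = sum_k p_k (I - Cov(Z | y = k))^2, with Z the whitened X."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Whitening transform: Z = (X - mu) Sigma^{-1/2}
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    M = np.zeros((p, p))
    for k in np.unique(y):
        Zk = Z[y == k]
        pk = len(Zk) / n
        Dk = np.eye(p) - np.cov(Zk, rowvar=False)
        M += pk * Dk @ Dk          # (I - Cov_k)^2 weighted by class proportion
    w, V = np.linalg.eigh(M)
    # Top-d eigenvectors, mapped back to the original predictor scale
    B = Sigma_inv_sqrt @ V[:, np.argsort(w)[::-1][:d]]
    return B
```

When the classes differ only in variance along one coordinate, the leading SAVE direction recovers that coordinate, which is exactly the kind of structure QDA (but not LDA) exploits.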
