The High-Dimension, Low-Sample-Size Geometric Representation Holds under Mild Conditions
Jeongyoun Ahn, J. S. Marron, Keith M. Muller and Yueh-Yun Chi
Vol. 94, No. 3 (Aug., 2007), pp. 760-766
Stable URL: http://www.jstor.org/stable/20441411
Page Count: 7
High-dimension, low-sample-size datasets have geometric properties that differ from those of traditional low-dimensional data. In an asymptotic study in which the dimension increases while the sample size stays fixed, Hall et al. (2005) showed that the data vectors lie approximately at the vertices of a regular simplex in high-dimensional space. A perhaps unappealing aspect of their result is the underlying assumption, which requires the variables, viewed as a time series, to be almost independent. We establish an equivalent geometric representation under much milder conditions, using asymptotic properties of sample covariance matrices. We discuss implications of the results, such as the use of principal component analysis in high-dimensional space, the extension to nonindependent samples, and the binary classification problem.
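The simplex phenomenon described in the abstract can be illustrated with a small simulation. This is a hedged sketch, not code from the paper: it assumes the simplest setting of i.i.d. standard normal coordinates, under which all pairwise Euclidean distances between a fixed number of points concentrate around the same value (roughly the square root of twice the dimension) as the dimension grows, so the points sit near the vertices of a regular simplex.

```python
import numpy as np

# Sketch of the HDLSS simplex geometry (assumed i.i.d. N(0, 1) coordinates,
# which is the simplest case covered by Hall et al. 2005): with n fixed and
# dimension d large, all pairwise distances are nearly equal (~ sqrt(2 d)),
# so the n points approximate the vertices of a regular simplex.
rng = np.random.default_rng(0)
n, d = 5, 100_000
X = rng.standard_normal((n, d))

# All pairwise Euclidean distances between the n points.
dists = [np.linalg.norm(X[i] - X[j]) for i in range(n) for j in range(i + 1, n)]
mean_dist = float(np.mean(dists))
spread = (max(dists) - min(dists)) / mean_dist

print(f"mean pairwise distance: {mean_dist:.1f}")
print(f"theoretical value sqrt(2d): {np.sqrt(2 * d):.1f}")
print(f"relative spread of distances: {spread:.4f}")  # small => near-regular simplex
```

Increasing `d` shrinks the relative spread further, while decreasing it (say to `d = 10`) makes the distances visibly unequal, which is the contrast between high-dimensional and traditional low-dimensional geometry that the abstract refers to.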
Biometrika © 2007 Biometrika Trust