On L1-Norm Multiclass Support Vector Machines: Methodology and Theory
Lifeng Wang and Xiaotong Shen
Journal of the American Statistical Association
Vol. 102, No. 478 (Jun., 2007), pp. 583-594
Stable URL: http://www.jstor.org/stable/27639888
Page Count: 12
Binary support vector machines (SVMs) have been shown to deliver high performance. In multiclass classification, however, issues remain, particularly with respect to variable selection. One challenging problem is simultaneous classification and variable selection when the number of variables is in the thousands and greatly exceeds the size of the training sample, as often occurs in genomic classification. To meet this challenge, this article proposes a novel multiclass support vector machine that performs classification and variable selection simultaneously through an L1-norm penalized sparse representation. The proposed methodology, together with the regularization solution path developed here, permits variable selection in such a situation. A statistical learning theory is developed to quantify the methodology's generalization error, offering insight into the basic structure of sparse learning while allowing the number of variables to greatly exceed the sample size. The operating characteristics of the methodology are examined on both simulated and benchmark data and compared against several competitors in terms of prediction accuracy. The numerical results suggest that the proposed methodology is highly competitive.
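The mechanism the abstract describes, hinge-loss classification with an L1 penalty that shrinks irrelevant coefficients toward zero, can be illustrated with a minimal sketch. This is not the authors' algorithm or their regularization solution path; it is a hypothetical one-vs-all subgradient-descent variant with soft-thresholding, using made-up data and parameter names, written in pure Python for clarity rather than speed.

```python
def train_l1_multiclass_svm(X, y, n_classes, lam=0.05, lr=0.05, epochs=200):
    """One-vs-all multiclass SVM trained by subgradient descent.

    Each class gets a linear score w_c . x + b_c. The hinge loss pushes
    the correct class's score above +1 and the others below -1; after
    each hinge step, an L1 soft-threshold shrinks every weight toward
    zero, which is what induces sparsity (variable selection).
    """
    d = len(X[0])
    W = [[0.0] * d for _ in range(n_classes)]  # one weight vector per class
    b = [0.0] * n_classes                      # one bias per class
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            for c in range(n_classes):
                t = 1.0 if yi == c else -1.0
                margin = t * (sum(w * x for w, x in zip(W[c], xi)) + b[c])
                if margin < 1.0:  # hinge-loss subgradient step
                    for j in range(d):
                        W[c][j] += lr * t * xi[j]
                    b[c] += lr * t
                # L1 subgradient: soft-threshold each weight toward zero
                for j in range(d):
                    wj = W[c][j]
                    W[c][j] = max(abs(wj) - lr * lam, 0.0) * (1.0 if wj > 0 else -1.0)
    return W, b

def predict(W, b, xi):
    """Assign xi to the class with the largest linear score."""
    scores = [sum(w * x for w, x in zip(Wc, xi)) + bc for Wc, bc in zip(W, b)]
    return max(range(len(scores)), key=scores.__getitem__)
```

On a toy three-class problem whose third feature is uninformative, the soft-thresholding step keeps that feature's weights near zero while the informative features carry the classification, which is the sparsity effect the abstract attributes to the L1 penalty.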
Journal of the American Statistical Association © 2007 American Statistical Association