Abstract

The support vector machine (SVM) is known for its good performance in two-class classification, but its extension to multiclass classification is still an ongoing research issue. In this article, we propose a new approach for classification, called the import vector machine (IVM), which is built on kernel logistic regression (KLR). We show that the IVM not only performs as well as the SVM in two-class classification, but also can naturally be generalized to the multiclass case. Furthermore, the IVM provides an estimate of the underlying probability. Similar to the support points of the SVM, the IVM model uses only a fraction of the training data to index kernel basis functions, typically a much smaller fraction than the SVM. This gives the IVM a potential computational advantage over the SVM.
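
The abstract describes the IVM as a greedy, subset-based form of kernel logistic regression. The sketch below is a minimal illustration of that idea under stated assumptions, not the authors' reference algorithm: it assumes an RBF kernel, a Newton/IRLS solver for the regularized KLR objective, and a simple greedy rule that adds whichever training point ("import point") most reduces the penalized negative log-likelihood. All function names and the stopping tolerance are illustrative.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # RBF kernel matrix between rows of A and rows of B (assumed kernel choice).
    sq = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def fit_klr(K_sub, K_reg, y, lam, n_iter=15):
    # Newton/IRLS for regularized kernel logistic regression restricted to a subset.
    # K_sub: (n, m) kernel between all n training points and the m import points.
    # K_reg: (m, m) kernel among the import points, used in the penalty a' K_reg a.
    m = K_sub.shape[1]
    a = np.zeros(m)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-K_sub @ a))
        W = p * (1.0 - p)
        grad = K_sub.T @ (p - y) + lam * K_reg @ a
        H = K_sub.T @ (K_sub * W[:, None]) + lam * K_reg + 1e-8 * np.eye(m)
        a -= np.linalg.solve(H, grad)
    p = 1.0 / (1.0 + np.exp(-K_sub @ a))
    nll = -np.sum(y * np.log(p + 1e-12) + (1.0 - y) * np.log(1.0 - p + 1e-12))
    return a, nll + 0.5 * lam * a @ K_reg @ a

def import_vector_machine(X, y, lam=1e-2, gamma=1.0, max_points=10, tol=1e-3):
    # Greedy forward selection: at each step add the training point whose inclusion
    # most reduces the penalized negative log-likelihood; stop when the gain becomes
    # negligible (a simplified, assumed stopping rule).
    K = rbf_kernel(X, X, gamma)
    selected, objective = [], np.inf
    for _ in range(max_points):
        best = None
        for j in range(len(y)):
            if j in selected:
                continue
            idx = selected + [j]
            _, obj = fit_klr(K[:, idx], K[np.ix_(idx, idx)], y, lam)
            if best is None or obj < best[1]:
                best = (j, obj)
        if np.isfinite(objective) and objective - best[1] < tol * abs(objective):
            break
        selected.append(best[0])
        objective = best[1]
    a, _ = fit_klr(K[:, selected], K[np.ix_(selected, selected)], y, lam)
    return selected, a

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(150, 2))
    y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(float)  # nonlinear two-class problem
    points, coef = import_vector_machine(X, y, gamma=2.0)
    print(f"selected {len(points)} import points out of {len(y)} training points")

For simplicity the sketch refits the subset model from scratch for each candidate point; the paper's actual procedure uses cheaper approximate updates, which this illustration does not attempt to reproduce.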

Keywords

Support vector machine, Structured support vector machine, Kernel (algebra), Artificial intelligence, Computer science, Binary classification, Logistic regression, Machine learning, Relevance vector machine, Pattern recognition (psychology), Kernel method, Polynomial kernel, Radial basis function kernel, Data mining, Mathematics

Publication Info

Year: 2001
Type: Article
Volume: 14
Pages: 1081-1088
Citations: 136
Access: Closed

Citation Metrics

136 citations (source: OpenAlex)

Cite This

Ji Zhu, Trevor Hastie (2001). Kernel Logistic Regression and the Import Vector Machine. Advances in Neural Information Processing Systems, 14, 1081-1088.