Abstract

A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjusted automatically to match the complexity of the problem. The solution is expressed as a linear combination of supporting patterns. These are the subset of training patterns that are closest to the decision boundary. Bounds on the generalization performance based on the leave-one-out method and the VC-dimension are given. Experimental results on optical character recognition problems demonstrate the good generalization obtained when compared with other learning algorithms.
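The core idea of the abstract can be illustrated in miniature. The sketch below is a hypothetical toy, not the authors' exact procedure (the paper solves the margin-maximization problem as a quadratic program): it runs projected gradient ascent on the hard-margin dual over a tiny linearly separable 2-D dataset, with the bias folded into the kernel as K(x, x') = x·x' + 1 so that the only remaining constraint is alpha_i >= 0. The dataset and all parameter values are illustrative assumptions.

```python
# Hypothetical illustration of maximal-margin training via the dual
# (not the paper's exact algorithm): projected gradient ascent on
# W(alpha) = sum(alpha) - 0.5 * sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j),
# subject to alpha_i >= 0, with the bias absorbed into the kernel.

X = [(2.0, 2.0), (3.0, 3.0), (0.0, 0.0), (1.0, 0.0)]  # toy 2-D patterns
Y = [1.0, 1.0, -1.0, -1.0]                            # class labels

def K(a, b):
    # Linear kernel plus a constant term, which folds the bias into K
    return a[0] * b[0] + a[1] * b[1] + 1.0

n = len(X)
alpha = [0.0] * n
lr = 0.02  # step size, chosen small enough for stable ascent here

for _ in range(10000):
    for i in range(n):
        # Gradient of the dual objective with respect to alpha_i
        g = 1.0 - Y[i] * sum(alpha[j] * Y[j] * K(X[j], X[i]) for j in range(n))
        # Ascent step, then projection onto the constraint alpha_i >= 0
        alpha[i] = max(0.0, alpha[i] + lr * g)

def f(x):
    # The decision function is a linear combination of the training
    # patterns; only the supporting patterns (alpha_i > 0) contribute.
    return sum(alpha[j] * Y[j] * K(X[j], x) for j in range(n))

support = [i for i in range(n) if alpha[i] > 1e-6]
print([round(a, 3) for a in alpha], support)
```

On this toy set the pattern far from the boundary ends up with alpha exactly zero, while the supporting patterns sit at margin y_i f(x_i) = 1 — matching the abstract's point that the solution depends only on the training patterns closest to the decision boundary.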

Keywords

Margin (machine learning), Decision boundary, Generalization, Perceptron, Computer science, Algorithm, Dimension (graph theory), Artificial intelligence, Boundary (topology), Variety (cybernetics), Pattern recognition (psychology), Machine learning, Mathematics, Artificial neural network, Support vector machine

Publication Info

Year
1992
Type
article
Pages
144-152
Citations
11404
Access
Closed

Cite This

Bernhard E. Boser, Isabelle Guyon, Vladimir Vapnik (1992). A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory (COLT '92), 144-152. https://doi.org/10.1145/130385.130401

Identifiers

DOI
10.1145/130385.130401