Abstract

Kernel-based linear-threshold algorithms, such as support vector machines and Perceptron-like algorithms, are among the best available techniques for solving pattern classification problems. In this paper, we describe an extension of the classical Perceptron algorithm, called the second-order Perceptron, and analyze its performance within the mistake bound model of on-line learning. Our algorithm exploits second-order data information, and the mistake bound we prove for it is the best known for (efficient) kernel-based linear-threshold classifiers to date. This bound, which strictly generalizes the well-known Perceptron bound, is expressed in terms of the eigenvalues of the empirical data correlation matrix and depends on a parameter controlling the sensitivity of the algorithm to the distribution of these eigenvalues. Since the optimal setting of this parameter is not known a priori, we also analyze two variants of the second-order Perceptron algorithm: one that adaptively sets the value of the parameter in terms of the number of mistakes made so far, and one that is parameterless, based on pseudoinverses.
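The abstract describes a mistake-driven, Perceptron-like update that whitens instances through the empirical correlation matrix of past mistaken examples. The following minimal NumPy sketch illustrates that idea: it keeps the correlation matrix of the instances on which mistakes were made and a parameter `a` controlling the sensitivity mentioned above. It is an illustrative reconstruction from the abstract's description, not the paper's exact pseudocode, and it uses a direct linear solve rather than the incremental or kernelized updates an efficient implementation would use.

```python
import numpy as np

class SecondOrderPerceptron:
    """Sketch of a second-order Perceptron-style learner.

    Mistake-driven, like the classical Perceptron: the state is
    updated only on trials where the prediction is wrong.
    """

    def __init__(self, dim, a=1.0):
        self.a = a                      # sensitivity parameter (a > 0)
        self.M = np.zeros((dim, dim))   # sum of x x^T over past mistakes
        self.v = np.zeros(dim)          # sum of y * x over past mistakes

    def predict(self, x):
        # Whitened prediction: the current instance is folded into the
        # (regularized) empirical correlation matrix before solving.
        A = self.a * np.eye(len(x)) + self.M + np.outer(x, x)
        w = np.linalg.solve(A, self.v)
        return 1 if w @ x >= 0 else -1

    def update(self, x, y):
        # Call only after a mistake (label y is +1 or -1).
        self.M += np.outer(x, x)
        self.v += y * x
```

As `a` grows, the matrix `A` approaches a multiple of the identity and the prediction approaches the first-order Perceptron's sign of a plain inner product; small `a` makes the algorithm more sensitive to the eigenstructure of the data, which is the trade-off the paper's mistake bound quantifies.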

Keywords

Perceptron, Algorithm, A priori and a posteriori, Mathematics, Upper and lower bounds, Multilayer perceptron, Kernel (algebra), Artificial intelligence, Computer science, Artificial neural network, Discrete mathematics

Publication Info

Year: 2005
Type: article
Volume: 34
Issue: 3
Pages: 640-668
Citations: 198 (OpenAlex)
Access: Closed

Cite This

Nicolò Cesa‐Bianchi, Alex Conconi, Claudio Gentile (2005). A Second-Order Perceptron Algorithm. SIAM Journal on Computing, 34(3), 640-668. https://doi.org/10.1137/s0097539703432542

Identifiers

DOI
10.1137/s0097539703432542