Abstract

Although the detection of invariant structure in a given set of input patterns is vital to many recognition tasks, connectionist learning rules tend to focus on directions of high variance (principal components). The prediction paradigm is often used to reconcile this dichotomy; here we suggest a more direct approach to invariant learning based on an anti-Hebbian learning rule. An unsupervised two-layer network implementing this method in a competitive setting learns to extract coherent depth information from random-dot stereograms.
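The paper itself is not reproduced on this page, but the principle the abstract refers to can be illustrated with a minimal sketch: where a Hebbian rule strengthens weights along correlated (high-variance) input directions, an anti-Hebbian rule weakens them, so a normalized anti-Hebbian unit drifts toward the low-variance, i.e. invariant, direction of its input. The data, learning rate, and normalization scheme below are hypothetical choices for illustration, not the network described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: strong variation along axis 0, almost none along axis 1.
# An anti-Hebbian unit should converge toward the low-variance (invariant) axis.
X = rng.normal(size=(1000, 2)) * np.array([3.0, 0.1])

w = rng.normal(size=2)
w /= np.linalg.norm(w)  # start from a random unit vector

eta = 0.01  # learning rate (illustrative value)
for x in X:
    y = w @ x
    w -= eta * y * x            # anti-Hebbian update: -eta * (post * pre)
    w /= np.linalg.norm(w)      # renormalize to keep |w| = 1

# w now points (up to sign) close to the invariant axis [0, 1]
print(w)
```

A plain Hebbian rule (`w += eta * y * x`) with the same normalization would instead converge to the principal component along axis 0; flipping the sign is what turns a variance-maximizing feature detector into an invariance detector.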

Keywords

Hebbian theory, Competitive learning, Connectionism, Computer science, Artificial intelligence, Unsupervised learning, Leabra, Invariant (physics), Set (abstract data type), Variance (accounting), Focus (optics), Feature learning, Artificial neural network, Machine learning, Pattern recognition (psychology), Mathematics, Generalization error, Wake-sleep algorithm

Publication Info

Year: 1991
Type: article
Volume: 4
Pages: 1017-1024
Citations: 26
Access: Closed

Citation Metrics

26 citations (OpenAlex)

Cite This

Nicol N. Schraudolph, Terrence J. Sejnowski (1991). Competitive Anti-Hebbian Learning of Invariants. Advances in Neural Information Processing Systems, 4, 1017-1024.