Abstract

Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and the necessary and sufficient conditions are provided for feasible learnability.
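As a quick gloss on the abstract (not part of the original text): a concept class C shatters a finite set S when its members cut out every subset of S, and the VC dimension of C is the largest cardinality of a shattered set. The display below sketches this definition together with the sufficient sample-size bound commonly attributed to this paper; the constants and logarithm base are shown for illustration and should be checked against the paper itself.

% A concept class C over a domain X shatters a finite S \subseteq X
% if {c \cap S : c \in C} realizes every subset of S; the VC dimension
% is the size of the largest shattered set:
\mathrm{VCdim}(C) = \max \{\, |S| : S \subseteq X \text{ finite},\ \{c \cap S : c \in C\} = 2^{S} \,\}.

% Sufficient sample size for distribution-free (\epsilon,\delta)-learning of a
% class of VC dimension d, in the form commonly attributed to this paper
% (constants indicative, not quoted verbatim):
m \;\ge\; \max\!\left( \frac{4}{\epsilon}\,\log\frac{2}{\delta},\ \frac{8d}{\epsilon}\,\log\frac{13}{\epsilon} \right).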

Keywords

Learnability, VC dimension, Dimension (graph theory), Mathematics, Simple (philosophy), Closure (psychology), Euclidean space, Computer science, Class (philosophy), Theoretical computer science, Discrete mathematics, Artificial intelligence, Combinatorics

Publication Info

Year: 1989
Type: article
Volume: 36
Issue: 4
Pages: 929-965
Citations: 1830
Access: Closed

Citation Metrics

1830 citations (source: OpenAlex)

Cite This

Anselm Blumer, Andrzej Ehrenfeucht, David Haussler et al. (1989). Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 36(4), 929-965. https://doi.org/10.1145/76359.76371

Identifiers

DOI: 10.1145/76359.76371