Abstract

Learning from hints is a generalization of learning from examples that allows for a variety of information about the unknown function to be used in the learning process. In this paper, we use the VC dimension, an established tool for analyzing learning from examples, to analyze learning from hints. In particular, we show how the VC dimension is affected by the introduction of a hint. We also derive a new quantity that defines a VC dimension for the hint itself. This quantity is used to estimate the number of examples needed to "absorb" the hint. We carry out the analysis for two types of hints, invariances and catalysts. We also describe how the same method can be applied to other types of hints.
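
As background for the sample-complexity estimate mentioned in the abstract, the standard VC-based bound from learning theory may be helpful; this is the well-known general result in standard PAC notation (symbols m, ε, δ, d_VC are not from the paper), not the hint-specific quantity derived in the full text. It states that a hypothesis class of VC dimension d_VC can be learned to accuracy ε with confidence 1−δ from a number of examples on the order of

\[
m \;=\; O\!\left(\frac{1}{\epsilon}\left(d_{\mathrm{VC}}\,\log\frac{1}{\epsilon} \;+\; \log\frac{1}{\delta}\right)\right).
\]

A hint that lowers the effective VC dimension therefore lowers this estimate; the paper quantifies that effect and, in addition, assigns an analogous dimension to the hint itself, which is used to estimate the number of examples needed to absorb the hint.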

Keywords

Dimension (graph theory), Generalization, VC dimension, Variety (cybernetics), Computer science, Function (biology), Carry (investment), Process (computing), Theoretical computer science, Machine learning, Artificial intelligence, Mathematics, Combinatorics, Programming language

Related Publications

Statistical Learning Theory

A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis...

1999, Technometrics, 26,913 citations

Publication Info

Year: 1993
Type: Article
Volume: 5
Issue: 2
Pages: 278-288
Citations: 81
Access: Closed

Citation Metrics

81 citations (OpenAlex)

Cite This

Yaser S. Abu‐Mostafa (1993). Hints and the VC Dimension. Neural Computation, 5(2), 278-288. https://doi.org/10.1162/neco.1993.5.2.278

Identifiers

DOI
10.1162/neco.1993.5.2.278