Abstract

We compare discriminative and generative learning as typified by logistic regression and naive Bayes. We show, contrary to a widely held belief that discriminative classifiers are almost always to be preferred, that there can often be two distinct regimes of performance as the training set size is increased, one in which each algorithm does better. This stems from the observation, borne out in repeated experiments, that while discriminative learning has lower asymptotic error, a generative classifier may also approach its (higher) asymptotic error much faster.
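The abstract's claim can be illustrated with a small learning-curve experiment. The sketch below is not from the paper: the synthetic two-Gaussian data, the helper names (`make_data`, `fit_gnb`, `fit_logreg`), and the training sizes are all assumptions made for illustration. It trains a Gaussian naive Bayes model (generative) and a logistic regression fit by gradient descent (discriminative) on increasing amounts of data and reports test accuracy for each.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, d=2, sep=2.0):
    # Hypothetical synthetic task: balanced binary labels, class-conditional
    # unit-variance Gaussians whose means differ by `sep` in every feature.
    y = np.tile([0, 1], n // 2)
    X = rng.normal(size=(n, d)) + sep * y[:, None]
    return X, y

def fit_gnb(X, y):
    # Generative model: class priors, per-class means, pooled per-feature variance.
    mu = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    var = X.var(axis=0)
    prior = np.array([(y == c).mean() for c in (0, 1)])
    return mu, var, prior

def predict_gnb(model, X):
    mu, var, prior = model
    # Pick the class with the larger log posterior under a diagonal Gaussian.
    ll = np.stack([
        -0.5 * (((X - mu[c]) ** 2) / var).sum(axis=1) + np.log(prior[c])
        for c in (0, 1)
    ])
    return ll.argmax(axis=0)

def fit_logreg(X, y, lr=0.1, steps=500):
    # Discriminative model: batch gradient descent on the logistic loss.
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_logreg(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(int)

Xte, yte = make_data(5000)  # large held-out set to estimate test error
for n in (10, 50, 500):
    Xtr, ytr = make_data(n)
    acc_g = (predict_gnb(fit_gnb(Xtr, ytr), Xte) == yte).mean()
    acc_d = (predict_logreg(fit_logreg(Xtr, ytr), Xte) == yte).mean()
    print(f"n={n:4d}  naive Bayes acc={acc_g:.3f}  logistic acc={acc_d:.3f}")
```

On a run of this sketch one would look for the pattern the paper describes: at small `n` the generative model tends to be closer to its asymptote, while at large `n` logistic regression catches up or overtakes it; exact numbers depend on the seed and data distribution.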

Keywords

Discriminative model, Artificial intelligence, Naive Bayes classifier, Logistic regression, Generative grammar, Classifier (UML), Pattern recognition (psychology), Bayes error rate, Computer science, Machine learning, Bayes' theorem, Generative model, Probabilistic classification, Regression, Mathematics, Bayes classifier, Statistics, Support vector machine, Bayesian probability

Publication Info

Year
2001
Type
article
Volume
14
Pages
841-848
Citations
1881
Access
Closed

Citation Metrics

1881 citations (OpenAlex)
Cite This

Andrew Y. Ng, Michael I. Jordan (2001). On Discriminative vs. Generative Classifiers: A comparison of logistic regression and naive Bayes. Advances in Neural Information Processing Systems, 14, 841-848.