Abstract

Many intelligent systems are designed to sift through a mass of evidence and arrive at a decision. Certain pieces of evidence may be given more weight than others, and this may affect the final decision significantly. When more than one intelligent agent is available to make a decision, we can form a committee of experts. By combining the different opinions of these experts, the committee approach can sometimes outperform any individual expert. In this paper, we show how to exploit randomized learning algorithms in order to develop committees of experts. By using the majority vote of these experts to make decisions, we are able to improve the performance of the original learning algorithm. More precisely, we have developed a randomized decision tree induction algorithm, which generates different decision trees every time it is run. Each tree represents a different expert decision-maker. We combine these trees using a majority voting scheme in order to overcome small errors that appear in individual trees. We have tested our idea with several real data sets, and found that accuracy consistently improved when compared to the decision made by a single expert. We have developed some analytical results that explain why this effect occurs. Our experiments also show that the majority voting technique outperforms at least some alternative strategies for exploiting randomization.
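To make the voting idea concrete, here is a minimal sketch, not the authors' original algorithm (which the abstract does not fully specify). It assumes scikit-learn is available and uses DecisionTreeClassifier with splitter="random" as a stand-in for the paper's randomized induction; the dataset and committee size are arbitrary choices for illustration.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Arbitrary benchmark data; the paper reports results on several real data sets.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Induce K different trees by rerunning a randomized learner with different seeds;
# each tree plays the role of one expert on the committee.
K = 15
experts = [
    DecisionTreeClassifier(splitter="random", random_state=seed).fit(X_train, y_train)
    for seed in range(K)
]

# Majority vote: every expert predicts each test case, and the most common label wins.
votes = np.stack([tree.predict(X_test) for tree in experts])   # shape (K, n_test)
committee_pred = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

print("single tree accuracy: %.3f" % np.mean(experts[0].predict(X_test) == y_test))
print("committee accuracy:   %.3f" % np.mean(committee_pred == y_test))

An odd committee size (K = 15 here) avoids ties on two-class problems, the usual design choice for majority voting.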

Keywords

Decision tree · Voting · Artificial intelligence · Machine learning · Computer science · Exploit · Majority rule · Data mining · Political science

Related Publications

Boosting Decision Trees

We introduce a constructive, incremental learning system for regression problems that models data by means of locally linear experts. In contrast to other approaches, the expert...

1995 · Neural Information Processing Systems · 231 citations

Best-first Decision Tree Learning

Decision trees are potentially powerful predictors and explicitly represent the structure of a dataset. Standard decision tree learners such as C4.5 expand nodes in depth-first ...

2007 · Research Commons (University of Waikato) · 229 citations

Bagging, Boosting, and C4.5

Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the predictive power of classifier learning systems. Both form a set of classifiers that ar...

1996 · National Conference on Artificial Int... · 1262 citations
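For contrast with the seed-randomization sketch above, here is a minimal, hypothetical sketch of bagging in the style of Breiman: diversity among committee members comes from bootstrap resampling of the training data rather than from a randomized splitter. It again assumes scikit-learn and an arbitrary dataset.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
K = 15
bagged = []
for _ in range(K):
    # Bootstrap sample: draw n training cases with replacement.
    idx = rng.integers(0, len(X_train), size=len(X_train))
    bagged.append(DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx]))

# Combine the bagged trees by the same majority-vote rule.
votes = np.stack([tree.predict(X_test) for tree in bagged])
bag_pred = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
print("bagged committee accuracy: %.3f" % np.mean(bag_pred == y_test))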

Publication Info

Year: 1996
Type: Book chapter
Pages: 305-317
Citations: 15
Access: Closed


Citation Metrics

15 citations (source: OpenAlex)

Cite This

David Heath, Simon Kasif, Steven L. Salzberg (1996). Chapter 18: Committees of decision trees. Advances in Psychology, 305-317. https://doi.org/10.1016/s0166-4115(96)80038-0

Identifiers

DOI
10.1016/s0166-4115(96)80038-0