Model Selection and Akaike's Information Criterion (AIC): The General Theory and its Analytical Extensions

1987 · Psychometrika · 4,411 citations

Abstract

During the last fifteen years, Akaike's entropy-based Information Criterion (AIC) has had a fundamental impact in statistical model evaluation problems. This paper studies the general theory of the AIC procedure and provides its analytical extensions in two ways without violating Akaike's main principles. These extensions make AIC asymptotically consistent and penalize overparameterization more stringently to pick only the simplest of the “true” models. These selection criteria are called CAIC and CAICF. Asymptotic properties of AIC and its extensions are investigated, and empirical performances of these criteria are studied in choosing the correct degree of a polynomial model in two different Monte Carlo experiments under different conditions.
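In the paper's notation, with maximized log-likelihood log L(θ̂), k estimated parameters, and sample size n, the criteria compared in the abstract are AIC = −2 log L(θ̂) + 2k and CAIC = −2 log L(θ̂) + k(log n + 1); CAICF additionally penalizes through the log-determinant of the estimated Fisher information matrix, CAICF = −2 log L(θ̂) + k(log n + 2) + log|F̂|. The sketch below is a minimal illustration, in the spirit of the abstract's Monte Carlo experiments, of selecting a polynomial degree by AIC versus CAIC; the quadratic data-generating process, sample size, and noise level here are hypothetical choices for illustration, not the paper's experimental design.

```python
import numpy as np

def rss_of_fit(x, y, degree):
    """Least-squares polynomial fit; returns the residual sum of squares."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    return float(resid @ resid)

def aic_caic(rss, n, degree):
    """AIC and CAIC for a Gaussian polynomial regression model.

    k counts the degree + 1 regression coefficients plus the error
    variance; -2 log L is the maximized Gaussian log-likelihood with
    the variance at its MLE, rss / n.
    """
    k = degree + 2
    neg2loglik = n * (np.log(2 * np.pi * rss / n) + 1)
    aic = neg2loglik + 2 * k                 # Akaike's criterion
    caic = neg2loglik + k * (np.log(n) + 1)  # Bozdogan's consistent AIC
    return aic, caic

# Hypothetical data-generating process: a quadratic plus Gaussian noise.
rng = np.random.default_rng(0)
n = 100
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.5, n)

# Score candidate degrees 0..5 and pick the minimizer of each criterion.
scores = {d: aic_caic(rss_of_fit(x, y, d), n, d) for d in range(6)}
best_aic = min(scores, key=lambda d: scores[d][0])
best_caic = min(scores, key=lambda d: scores[d][1])
print(f"AIC selects degree {best_aic}; CAIC selects degree {best_caic}")
```

Because CAIC's penalty per parameter grows like log n rather than staying fixed at 2, an overfitted degree is rejected with probability tending to one as n grows; this is the stricter penalization of overparameterization, and the resulting asymptotic consistency, that the abstract describes.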

Keywords

Akaike information criterion, Mathematics, Model selection, Entropy (arrow of time), Applied mathematics, Selection (genetic algorithm), Bayesian information criterion, Information Criteria, Statistics, Monte Carlo method, Information theory, Econometrics, Computer science, Artificial intelligence

Related Publications

Factor Analysis and AIC

The information criterion AIC was introduced to extend the method of maximum likelihood to the multimodel situation. It was obtained by relating the successful experience of the...

1987 · Psychometrika · 4,988 citations

Publication Info

Year: 1987
Type: Article
Volume: 52
Issue: 3
Pages: 345-370
Citations: 4,411
Access: Closed

Citation Metrics

4,411 citations (OpenAlex)

Cite This

Hamparsum Bozdogan (1987). Model Selection and Akaike's Information Criterion (AIC): The General Theory and its Analytical Extensions. Psychometrika, 52(3), 345-370. https://doi.org/10.1007/bf02294361

Identifiers

DOI: 10.1007/bf02294361