Abstract
An asymptotically optimal selection of regression variables is proposed. The key assumption is that the number of control variables is either infinite or increases with the sample size. It is also shown that Mallows's C_p, Akaike's FPE and AIC methods are all asymptotically equivalent to this method.
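The three criteria the abstract names all trade residual fit against model size. Below is a minimal pure-Python sketch of how they are computed and compared for nested polynomial regressions on synthetic data; the data-generating setup, the degree range, and the deterministic sine "noise" are all illustrative assumptions, not taken from the paper.

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def rss_poly(xs, ys, degree):
    """Residual sum of squares of the least-squares polynomial fit."""
    k = degree + 1
    X = [[x ** j for j in range(k)] for x in xs]
    XtX = [[sum(r[a] * r[b] for r in X) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * ys[i] for i in range(len(xs))) for a in range(k)]
    beta = solve(XtX, Xty)
    return sum((ys[i] - sum(beta[j] * X[i][j] for j in range(k))) ** 2
               for i in range(len(xs)))

# The three selection criteria (k = number of fitted parameters).
def aic(r, n, k):           # Akaike's information criterion (up to a constant)
    return n * math.log(r / n) + 2 * k

def fpe(r, n, k):           # Akaike's final prediction error
    return (r / n) * (n + k) / (n - k)

def cp(r, n, k, sigma2):    # Mallows's C_p, sigma2 from the largest model
    return r / sigma2 + 2 * k - n

# Illustrative data: linear truth plus a deterministic sine perturbation.
n = 50
xs = [i / 10 for i in range(n)]
ys = [1.0 + 2.0 * x + 0.1 * math.sin(7 * i) for i, x in enumerate(xs)]

rss = {d: rss_poly(xs, ys, d) for d in range(4)}
sigma2 = rss[3] / (n - 4)          # error variance estimate from the full model
for d in range(4):
    k = d + 1
    print(d, aic(rss[d], n, k), fpe(rss[d], n, k), cp(rss[d], n, k, sigma2))
```

All three penalties charge roughly 2 per extra parameter on the log-likelihood scale, which is the finite-sample face of the asymptotic equivalence the abstract states: here each criterion strongly prefers the degree-1 model over the constant model.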
Publication Info
- Year: 1981
- Type: article
- Volume: 68
- Issue: 1
- Pages: 45-54
- Citations: 537
- Access: Closed
Identifiers
- DOI: 10.1093/biomet/68.1.45