On Model Selection Consistency of Lasso

Abstract
Sparsity, or parsimony, of statistical models is crucial for their proper interpretation, as in the sciences and social sciences. Model selection is a commonly used method for finding such models, but it usually involves a computationally heavy combinatorial search. The Lasso (Tibshirani, 1996) is now used as a computationally feasible alternative to model selection.
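With p candidate predictors, an exhaustive best-subset search must compare 2^p models (already over a million at p = 20), whereas the Lasso replaces that search with a single convex optimization whose l1 penalty drives many coefficients exactly to zero. As an illustration of this point (a minimal sketch, not anything from the paper; the data, the penalty weight alpha, and the use of scikit-learn are all assumptions made here), the following Python snippet contrasts the dense ordinary-least-squares fit with the sparse Lasso fit:

```python
# Minimal sketch (illustrative only): the Lasso as a computationally
# feasible route to a sparse model. Data and alpha are made up.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]            # only 3 of 20 true coefficients are nonzero
y = X @ beta + 0.5 * rng.normal(size=n)

ols = LinearRegression().fit(X, y)      # dense: all 20 estimates are nonzero
lasso = Lasso(alpha=0.1).fit(X, y)      # the l1 penalty zeroes most of them out

print("OLS nonzeros:  ", int(np.sum(np.abs(ols.coef_) > 1e-8)))
print("Lasso nonzeros:", int(np.sum(np.abs(lasso.coef_) > 1e-8)))
```

Whether the nonzero set the Lasso selects matches the true model is exactly the model selection question the abstract alludes to; the single tuning parameter alpha controls how aggressively coefficients are shrunk to zero.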
Related Publications
Regularization and Variable Selection Via the Elastic Net
We propose the elastic net, a new regularization and variable selection method. Real-world data and a simulation study show that the elastic net often outperforms the la...
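As a rough sketch of what that method does (illustrative only; the data, the penalty settings, and the use of scikit-learn's ElasticNet are assumptions made here, not the authors' code): the elastic net mixes the l1 and l2 penalties, controlled below by l1_ratio, and with two nearly collinear predictors it tends to keep both where the pure Lasso would arbitrarily pick one.

```python
# Illustrative sketch of the elastic net penalty via scikit-learn;
# data and hyperparameters are invented for the demonstration.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 10))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=80)   # two nearly identical predictors
y = X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=80)

enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)   # mixed l1/l2 penalty
lasso = Lasso(alpha=0.1).fit(X, y)                     # pure l1 penalty

print("elastic net:", np.round(enet.coef_[:2], 3))     # both coefficients survive
print("lasso:      ", np.round(lasso.coef_[:2], 3))    # typically one is zeroed
```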
Statistical Learning with Sparsity
A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to ...
Regularization Paths for Generalized Linear Models via Coordinate Descent
We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomia...
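For squared-error loss, the coordinate-descent idea described here reduces to cycling through the coefficients and applying a soft-thresholding update to each one's partial residual. The following bare-bones numpy sketch shows that update for the plain Lasso; it is a toy under simplifying assumptions (no standardization or warm starts, a fixed iteration count), not the glmnet implementation.

```python
# Toy cyclic coordinate descent for the Lasso with squared-error loss,
# minimizing (1/(2n)) * ||y - X @ beta||^2 + lam * ||beta||_1.
import numpy as np

def soft_threshold(z, gamma):
    # S(z, gamma) = sign(z) * max(|z| - gamma, 0)
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)                     # x_j' x_j per coordinate
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]    # partial residual
            beta[j] = soft_threshold(X[:, j] @ r_j, n * lam) / col_sq[j]
    return beta

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 8))
y = X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=50)
print(np.round(lasso_cd(X, y, lam=0.1), 3))           # coordinates 0 and 3 dominate
```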
Least angle regression
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to whi...
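For a concrete (again, purely illustrative) look at this family of algorithms, scikit-learn exposes the least-angle path through lars_path, which reports the order in which variables enter the model and the coefficient values at each step along the path; the toy data below are assumptions made for the demo, not anything from the paper.

```python
# Illustrative use of scikit-learn's lars_path on synthetic data.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 6))
y = 2 * X[:, 0] - X[:, 2] + 0.2 * rng.normal(size=60)

alphas, active, coefs = lars_path(X, y, method="lar")
print("entry order of variables:", active)        # indices in order of entry
print("coefficients at the end: ", np.round(coefs[:, -1], 3))
```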
Publication Info
- Year: 2006
- Type: article
- Volume: 7
- Issue: 90
- Pages: 2541-2563
- Citations: 1986
- Access: Closed