Regularization Paths for Generalized Linear Models via Coordinate Descent

Abstract
We develop fast algorithms for estimation of generalized linear models with convex penalties. The models include linear regression, two-class logistic regression, and multinomial regression problems, while the penalties include ℓ₁ (the lasso), ℓ₂ (ridge regression), and mixtures of the two (the elastic net). The algorithms use cyclical coordinate descent, computed along a regularization path. The methods can handle large problems and can also deal efficiently with sparse features. In comparative timings we find that the new algorithms are considerably faster than competing methods.
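The coordinate update at the heart of this approach is simple enough to sketch. Below is a minimal Python illustration under assumed conventions (columns of X standardized to mean zero and unit variance, y centered, and a fixed number of sweeps in place of a convergence test); the function names are ours, and this is not the authors' glmnet implementation, which adds covariance updates, active-set iterations, and compiled code.

```python
# Minimal sketch of cyclical coordinate descent for the Gaussian elastic net,
# computed along a decreasing lambda sequence with warm starts.
# Illustrative only; not the paper's glmnet implementation.
import numpy as np

def soft_threshold(z, gamma):
    # S(z, gamma) = sign(z) * max(|z| - gamma, 0)
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def enet_coordinate_descent(X, y, lam, alpha, beta, n_sweeps=100):
    """One elastic-net fit by cyclical coordinate descent.
    Minimizes (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2),
    assuming standardized columns of X and centered y.
    `beta` is the warm start and is updated in place."""
    n = X.shape[0]
    r = y - X @ beta                        # current residual
    for _ in range(n_sweeps):
        for j in range(len(beta)):
            r += X[:, j] * beta[j]          # remove coordinate j from the fit
            z = X[:, j] @ r / n             # univariate least-squares coefficient
            beta[j] = soft_threshold(z, lam * alpha) / (1.0 + lam * (1.0 - alpha))
            r -= X[:, j] * beta[j]          # restore coordinate j
    return beta

def enet_path(X, y, lams, alpha=1.0):
    """Fits over a decreasing lambda sequence, warm-starting each solution
    from the previous one: the pathwise strategy the abstract describes."""
    beta = np.zeros(X.shape[1])
    return np.array([enet_coordinate_descent(X, y, lam, alpha, beta).copy()
                     for lam in sorted(lams, reverse=True)])
```

A call such as `enet_path(X, y, np.logspace(0, -3, 50), alpha=1.0)` would trace a lasso path; warm starts make each successive fit cheap because the solution changes slowly along the path.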
Related Publications
- Statistical Learning with Sparsity: Discover new methods for dealing with high-dimensional data. A sparse statistical model has only a small number of nonzero parameters or weights; therefore, it is much easier to estimate and interpret than a dense model.
- Pathwise coordinate optimization: We consider "one-at-a-time" coordinate-wise descent algorithms for a class of convex optimization problems. An algorithm of this kind has been proposed for the L₁-penalized regression (lasso) …
- Sparse Principal Component Analysis: Principal component analysis (PCA) is widely used in data processing and dimensionality reduction. However, PCA suffers from the fact that each principal component is a linear combination of all the original variables, thus it is often difficult to interpret the results.
- Regression Shrinkage and Selection Via the Lasso: We propose a new method for estimation in linear models. The "lasso" minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant; the criterion is stated in symbols below.
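In symbols, the constrained criterion that last summary describes is the standard lasso form (our notation, not quoted from the paper):

```latex
\hat{\beta} \;=\; \underset{\beta_0,\,\beta}{\arg\min}\;
\sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^{2}
\quad\text{subject to}\quad \sum_{j=1}^{p}\lvert\beta_j\rvert \le t,
```

where t ≥ 0 is the tuning constant; for small enough t, some coefficients are shrunk exactly to zero.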
Publication Info
- Year: 2010
- Type: article
- Volume: 33
- Issue: 1
- Pages: 1-22
- Citations: 13975
- Access: Closed