Regression Shrinkage and Selection Via the Lasso

1996 · Journal of the Royal Statistical Society Series B (Statistical Methodology) · 49,419 citations

Abstract

We propose a new method for estimation in linear models. The ‘lasso’ minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant. Because of the nature of this constraint it tends to produce some coefficients that are exactly 0 and hence gives interpretable models. Our simulation studies suggest that the lasso enjoys some of the favourable properties of both subset selection and ridge regression. It produces interpretable models like subset selection and exhibits the stability of ridge regression. There is also an interesting relationship with recent work in adaptive function estimation by Donoho and Johnstone. The lasso idea is quite general and can be applied in a variety of statistical models: extensions to generalized regression models and tree-based models are briefly described.
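The constrained form described in the abstract is equivalent to penalizing the residual sum of squares with an L1 term, which can be solved by coordinate descent with soft-thresholding. The following is an illustrative sketch (not the algorithm from the paper, which predates the coordinate-descent literature); the data, penalty level, and helper names are all assumptions chosen to show how some coefficients become exactly zero:

```python
import numpy as np

def soft_threshold(r, a):
    """Soft-thresholding operator: the source of exact zeros in the lasso."""
    return np.sign(r) * np.maximum(np.abs(r) - a, 0.0)

def lasso_cd(X, y, alpha, n_iter=200):
    """Coordinate descent for min (1/2n)||y - Xb||^2 + alpha * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: leave predictor j out of the current fit.
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            z = X[:, j] @ X[:, j] / n
            b[j] = soft_threshold(rho, alpha) / z
    return b

# Synthetic example: only the first 3 of 10 predictors matter.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta_true + rng.normal(size=n)

b = lasso_cd(X, y, alpha=0.5)
print("nonzero coefficients:", np.flatnonzero(b))
```

With a moderate penalty the irrelevant coefficients are driven exactly to zero while the true ones survive (shrunken toward zero), which is the "interpretable models like subset selection, stability like ridge regression" behaviour the abstract describes.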

Keywords

Shrinkage · Lasso · Selection · Regression · Computer science · Statistics · Artificial intelligence · Mathematics


Publication Info

Year: 1996
Type: article
Volume: 58
Issue: 1
Pages: 267-288
Citations: 49,419
Access: Closed


Citation Metrics

OpenAlex: 49,419

Cite This

Robert Tibshirani (1996). Regression Shrinkage and Selection Via the Lasso. Journal of the Royal Statistical Society Series B (Statistical Methodology), 58(1), 267-288. https://doi.org/10.1111/j.2517-6161.1996.tb02080.x

Identifiers

DOI: 10.1111/j.2517-6161.1996.tb02080.x