Abstract
This article describes an appropriate way of implementing the generalized cross-validation method and some other least-squares-based smoothing parameter selection methods in penalized likelihood regression problems, and explains the rationales behind it. Simulations of limited scale are conducted to back up the semitheoretical analysis.
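For orientation, the sketch below shows the generalized cross-validation (GCV) score for a linear smoother, evaluated here for a ridge-type penalized least-squares fit on simulated data. This is only the Gaussian, least-squares special case that GCV is built on, not the article's implementation for penalized likelihood regression; the function name `gcv_score`, the toy data, and the search grid are all illustrative assumptions.

```python
# Minimal GCV sketch: GCV(lam) = n * ||(I - A)y||^2 / (n - tr A)^2,
# where A = X (X'X + n*lam*I)^{-1} X' is the hat (influence) matrix of a
# ridge-type penalized least-squares smoother. Illustrative only; the
# article concerns penalized likelihood regression more generally.
import numpy as np

def gcv_score(y, X, lam):
    """Return the GCV score of the ridge-type fit with smoothing parameter lam."""
    n, p = X.shape
    # Hat matrix of the penalized least-squares fit (identity penalty assumed).
    A = X @ np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T)
    resid = y - A @ y
    return n * np.sum(resid ** 2) / (n - np.trace(A)) ** 2

# Select the smoothing parameter by minimizing GCV over a grid (toy data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, -0.5]) + rng.normal(scale=0.3, size=100)
grid = 10.0 ** np.arange(-6, 1)
best_lam = min(grid, key=lambda lam: gcv_score(y, X, lam))
```

In practice the grid search would be replaced by a proper one-dimensional minimization of the GCV score, and the identity penalty by the roughness penalty of the smoother at hand.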
Related Publications
Penalized Regressions: The Bridge versus the Lasso
Abstract Bridge regression, a special family of penalized regressions with penalty function Σ|βj|^γ with γ ≤ 1, is considered. A general approach to solve for the bridge estimator i...
Fast Stable Restricted Maximum Likelihood and Marginal Likelihood Estimation of Semiparametric Generalized Linear Models
Summary Recent work by Reiss and Ogden provides a theoretical basis for sometimes preferring restricted maximum likelihood (REML) to generalized cross-validation (GCV) for smoot...
Empirical Functionals and Efficient Smoothing Parameter Selection
SUMMARY A striking feature of curve estimation is that the smoothing parameter ĥ₀, which minimizes the squared error of a kernel or smoothing spline estimator, is very difficul...
The Performance of Cross-Validation and Maximum Likelihood Estimators of Spline Smoothing Parameters
Abstract An important aspect of nonparametric regression by spline smoothing is the estimation of the smoothing parameter. In this article we report on an extensive simulation s...
Model selection and estimation in the Gaussian graphical model
We propose penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model. The methods lead to a sparse and shrinkage estimator of the conc...
Publication Info
- Year: 1992
- Type: article
- Volume: 1
- Issue: 2
- Pages: 169-179
- Citations: 111
- Access: Closed
Identifiers
- DOI: 10.1080/10618600.1992.10477012