Abstract
We propose a new method for model selection and model fitting in multivariate nonparametric regression models, in the framework of smoothing spline ANOVA. The “COSSO” is a method of regularization with the penalty functional being the sum of component norms, instead of the squared norm employed in the traditional smoothing spline method. The COSSO provides a unified framework for several recent proposals for model selection in linear models and smoothing spline ANOVA models. Theoretical properties, such as the existence and the rate of convergence of the COSSO estimator, are studied. In the special case of a tensor product design with periodic functions, a detailed analysis reveals that the COSSO does model selection by applying a novel soft thresholding type operation to the function components. We give an equivalent formulation of the COSSO estimator which leads naturally to an iterative algorithm. We compare the COSSO with MARS, a popular method that builds functional ANOVA models, in simulations and real examples. The COSSO method can be extended to classification problems and we compare its performance with those of a number of machine learning algorithms on real datasets. The COSSO gives very competitive performance in these studies.
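The soft-thresholding behavior mentioned in the abstract can be illustrated with a minimal sketch. This shows the generic soft-thresholding operator applied to component norms, not the exact shrinkage rule derived in the paper for the tensor product periodic case; the example values are hypothetical.

```python
import numpy as np

def soft_threshold(x, lam):
    """Generic soft-thresholding operator: shrink toward zero by lam,
    setting values whose magnitude is below lam exactly to zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Illustrative component norms for five functional ANOVA components.
norms = np.array([2.5, 0.3, 1.1, 0.05, 0.8])
shrunk = soft_threshold(norms, 0.5)
# Components whose norm falls below the threshold are zeroed out,
# which is how a thresholding rule performs model selection.
```

Zeroing a component's norm removes that component from the fitted ANOVA decomposition, which is the sense in which a soft-thresholding-type operation selects the model.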
Related Publications
Linear Smoothers and Additive Models
We study linear smoothers and their use in building nonparametric regression models. In the first part of this paper we examine certain aspects of linear smoothers for scatterpl...
Spline Smoothing in Regression Models and Asymptotic Efficiency in $L_2$
For nonparametric regression estimation on a bounded interval, optimal rates of decrease for integrated mean square error are known but not the best possible constants. A sharp ...
Flexible regression models with cubic splines
Abstract We describe the use of cubic splines in regression models to represent the relationship between the response variable and a vector of covariates. This simple method can...
Theory for penalised spline regression
Penalised spline regression is a popular new approach to smoothing, but its theoretical properties are not yet well understood. In this paper, mean squared error expressions and...
Empirical Functionals and Efficient Smoothing Parameter Selection
SUMMARY A striking feature of curve estimation is that the smoothing parameter ĥ₀, which minimizes the squared error of a kernel or smoothing spline estimator, is very difficul...
Publication Info
- Year: 2006
- Type: article
- Volume: 34
- Issue: 5
- Citations: 563
- Access: Closed
Identifiers
- DOI: 10.1214/009053606000000722