Abstract

The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean, whose computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite-dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
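
The abstract makes two computational claims that a short sketch can make concrete: the exact posterior mean costs O(n³) because of an n × n matrix factorization, and the optimal m-dimensional model is spanned by the leading eigenfunctions of a covariance operator. Below is a minimal illustration in Python, not the authors' code: the squared-exponential kernel, the noise level, and the toy data are all assumptions made for this example, and the eigendecomposition of the kernel matrix serves as a standard finite-sample (Nyström-style) stand-in for the operator's eigenfunctions.

    import numpy as np

    def rbf_kernel(X, Z, length_scale=1.0):
        # Squared-exponential covariance k(x, z) = exp(-(x - z)^2 / (2 l^2));
        # the kernel choice is an illustrative assumption, not from the paper.
        d2 = (X[:, None] - Z[None, :]) ** 2
        return np.exp(-0.5 * d2 / length_scale**2)

    def posterior_mean(X, y, X_star, noise_var=0.1):
        # Exact posterior mean of Gaussian regression; the Cholesky
        # factorization of the n x n covariance matrix is the O(n^3) step.
        K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
        L = np.linalg.cholesky(K)                      # O(n^3)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return rbf_kernel(X_star, X) @ alpha

    def top_m_eigenpairs(X, m, length_scale=1.0):
        # Eigenvectors of the kernel matrix give a finite-sample (Nystrom-style)
        # approximation to the first m eigenfunctions of the covariance
        # operator, whose span defines the optimal m-dimensional linear model.
        K = rbf_kernel(X, X, length_scale)
        eigvals, eigvecs = np.linalg.eigh(K)           # ascending order
        order = np.argsort(eigvals)[::-1][:m]          # largest m eigenvalues
        return eigvals[order] / len(X), eigvecs[:, order]

    # Toy usage: n = 50 noisy samples of sin(x).
    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(-3.0, 3.0, size=50))
    y = np.sin(X) + 0.1 * rng.standard_normal(50)
    mu = posterior_mean(X, y, np.linspace(-3.0, 3.0, 200))
    lams, phis = top_m_eigenpairs(X, m=5)

Projecting onto the m leading eigenvectors plays the same role here that retaining the top principal components does in ordinary PCA, which is the finite-sample counterpart of the paper's eigenfunction result.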

Keywords

Mathematics, Bayesian multivariate linear regression, Applied mathematics, Principal component regression, Smoothing, Proper linear model, Covariance operator, Linear regression, Gaussian, Hilbert space, Linear map, Regularization (mathematics), Covariance, Linear model, Statistics, Mathematical analysis, Computer science, Artificial intelligence, Pure mathematics

Publication Info

Year: 1997
Type: article
Pages: 167-184
Citations: 84 (OpenAlex)
Access: Closed

Cite This

Huaiyu Zhu, Christopher K. I. Williams, Richard Rohwer et al. (1997). Gaussian regression and optimal finite dimensional linear models. Aston Publications Explorer (Aston University), 167-184.