Abstract
Gaussian processes are attractive models for probabilistic classification, but exact inference in them is analytically intractable. We compare two approximations, Laplace's method and Expectation Propagation (EP), focusing on marginal likelihood estimates and predictive performance. We explain theoretically, and corroborate empirically, why EP is superior to Laplace's method. We also compare against a sophisticated MCMC scheme and show that EP is surprisingly accurate.
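For concreteness, the Laplace approximation the abstract refers to can be sketched in a few lines of numpy. The snippet below is a minimal illustration, not the paper's implementation: it assumes a squared-exponential kernel with hypothetical length-scale and signal parameters (`ell`, `sf`), a logistic likelihood, and follows the standard Newton iteration for binary GP classification, returning the posterior mode and the Laplace estimate of the log marginal likelihood.

```python
import numpy as np

def rbf_kernel(X1, X2, ell=1.0, sf=1.0):
    """Squared-exponential covariance (illustrative hyperparameters)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return sf ** 2 * np.exp(-0.5 * d2 / ell ** 2)

def laplace_gp_classify(X, y, ell=1.0, sf=1.0, n_iter=20):
    """Laplace approximation for binary GP classification, y in {-1, +1}.

    Finds the mode f_hat of the posterior p(f | X, y) by Newton's method
    and returns (f_hat, approximate log marginal likelihood).
    """
    n = len(y)
    K = rbf_kernel(X, X, ell, sf) + 1e-8 * np.eye(n)  # jitter for stability
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = 1.0 / (1.0 + np.exp(-f))    # sigmoid of current latents
        t = (y + 1) / 2.0                # labels mapped to {0, 1}
        W = pi * (1 - pi)                # diagonal of the negative Hessian
        sW = np.sqrt(W)
        B = np.eye(n) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + (t - pi)
        # Newton step f <- (K^-1 + W)^-1 b via the matrix inversion lemma
        c = np.linalg.solve(L, sW * (K @ b))
        a = b - sW * np.linalg.solve(L.T, c)
        f = K @ a
    # log q(y | X) = -1/2 a^T f + log p(y | f) - 1/2 log |B|
    loglik = np.sum(np.log(1.0 / (1.0 + np.exp(-y * f))))
    log_Z = -0.5 * a @ f + loglik - np.sum(np.log(np.diag(L)))
    return f, log_Z
```

EP replaces the single Gaussian centred at the mode with iteratively refined site approximations, which is the source of the accuracy gap the paper analyses; the marginal likelihood estimate `log_Z` above is what both schemes approximate.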
Publication Info
- Year: 2005
- Type: article
- Volume: 18
- Pages: 699-706
- Citations: 25
- Access: Closed