Contents
List of Figures
Declaration
Introduction
Bayesian learning
Gaussian Processes
Feature spaces
Sparsity
Structure of the thesis
Notations
Gaussian Process Representation and Online Learning
Generalised linear models
Bayesian Learning for Gaussian Processes
Exact results for regression
Approximations for general models
Parametrisation of the posterior moments
Parametrisation in the feature space
Online learning for Gaussian processes
The online learning algorithm
Discussion
Sparsity in Gaussian Processes
Redundancy in the representation
Computing the KL-distances
KL-optimal projection
Measuring the error
Sparse online updates
A sparse GP algorithm
Using a predefined set
Comparison with other sparse kernel techniques
Dimensionality reduction using eigen-decompositions
Subspace methods
Pursuit algorithms
Discussion
Further research directions
Sparsity and the Expectation-Propagation Algorithm
Expectation-Propagation
EP for Gaussian Processes
Relation between GP parametrisations
Sparsity and Expectation Propagation
Comparisons for regression
The proposed algorithm
Discussion and Further Research
Applications
Regression
Classification
Density Estimation
Estimating wind-fields from scatterometer data
Processing Scatterometer Data
Learning vector Gaussian processes
Measuring the Relative Weight of the Approximation
Sparsity for vectorial GPs
Summary
Conclusions and Further Research
Further Research Directions
Matrix inversion formulae
Properties of zero-mean Gaussians
Iterative computation of the inverse Gram matrix
Computing determinants
Updates for the Cholesky factorisation
KL-optimal parameter reduction
Computing the KL-distance
Updates for S_{t+1} = (C_{t+1}^{-1} + K_{t+1})^{-1}
Diagonalisation of matrix C
Updates for the wind fields
The Sparse EP algorithm
Bibliography
L. Csató 2003-04-23