Important: some text is copy-pasted from the thesis (see References). For a more compact presentation, see the Neural Computation article.

Examples of Online Gaussian Process Inference

Online inference is applicable whenever one can compute the local posterior mean and (co)variance. The term local refers to the current example together with the previous approximation to the posterior process.
An equivalent formulation is to obtain analytically the averaged likelihood for the current input; the update coefficients are then obtained by differentiating its logarithm.
Importantly, the average required to implement the posterior update is one-dimensional (or two-dimensional, but it definitely does not scale with the number of examples, as it would in the standard use of GP inference).
In all situations below we compute the scalar coefficients based on the following relation:
q_{t+1} = ∂ ln ⟨ P(y_{t+1} | f_{t+1}) ⟩ / ∂⟨f_{t+1}⟩,   r_{t+1} = ∂² ln ⟨ P(y_{t+1} | f_{t+1}) ⟩ / ∂⟨f_{t+1}⟩²    (53)
The brackets within the logarithm denote averaging with respect to the Gaussian marginal of f_{t+1} -- an average for which good approximations often exist (lookup tables, or efficient numerical solutions).
In the examples that follow, different likelihoods are presented. We emphasise that the only difference between them is the likelihood; the description of the estimation procedure is not repeated.
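The relation above can be sketched numerically: the Gaussian average is one-dimensional, so Gauss-Hermite quadrature gives the averaged likelihood, and finite differences of its logarithm give q_{t+1} and r_{t+1}. The function name, the quadrature order, and the finite-difference step below are illustrative assumptions, not part of the original text; the Gaussian-likelihood sanity check uses the known closed form of the average.

```python
import numpy as np

def update_coefficients(likelihood, y, mu, sigma2, n_quad=40, eps=1e-4):
    """Illustrative sketch: compute the scalar update coefficients
    q = d/dmu  ln < P(y | f) >   and   r = d^2/dmu^2 ln < P(y | f) >,
    where <.> denotes a one-dimensional Gaussian average over
    f ~ N(mu, sigma2), the marginal of the current GP at the new input.
    """
    # probabilists' Gauss-Hermite nodes/weights (weight function exp(-x^2/2))
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)

    def log_avg(m):
        # log of < P(y|f) > with f ~ N(m, sigma2), via quadrature
        f = m + np.sqrt(sigma2) * x
        return np.log(np.sum(w * likelihood(y, f)) / np.sqrt(2.0 * np.pi))

    # central finite differences of the log averaged likelihood in the mean
    lp, l0, lm = log_avg(mu + eps), log_avg(mu), log_avg(mu - eps)
    q = (lp - lm) / (2.0 * eps)
    r = (lp - 2.0 * l0 + lm) / eps**2
    return q, r

# Sanity check with a Gaussian (regression) likelihood of noise variance s2:
# < N(y; f, s2) > = N(y; mu, sigma2 + s2), hence analytically
# q = (y - mu) / (sigma2 + s2)  and  r = -1 / (sigma2 + s2).
s2 = 0.5
gauss = lambda y, f: np.exp(-(y - f) ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
q, r = update_coefficients(gauss, y=1.2, mu=0.3, sigma2=0.8)
```

For a non-Gaussian likelihood (e.g. probit classification), only the `likelihood` argument changes, which mirrors the point made below: the estimation procedure itself stays the same.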

Questions, comments, suggestions: contact Lehel Csató.