Andreas Geiger has written a simple Gaussian process regression Java applet, illustrating the behaviour of covariance functions and hyperparameters.
package | title | author | implementation
bcm | The Bayesian Committee Machine | Anton Schwaighofer | matlab and NETLAB
An extension of the Netlab implementation for GP regression. It allows large-scale regression based on the BCM approximation; see also the accompanying paper.
fbm | Software for Flexible Bayesian Modeling | Radford M. Neal | C for linux/unix
An extensive and well documented package implementing Markov chain Monte Carlo methods for Bayesian inference in neural networks, Gaussian processes (regression, binary and multi-class classification), mixture models and Dirichlet Diffusion trees.
gp-lvm and fgp-lvm | A (fast) implementation of Gaussian Process Latent Variable Models | Neil D. Lawrence | matlab and C
gpml | Code from the Rasmussen and Williams book, Gaussian Processes for Machine Learning | Carl Edward Rasmussen and Hannes Nickisch | matlab and octave
The GPML toolbox implements approximate inference algorithms for Gaussian processes, such as Expectation Propagation, the Laplace approximation and Variational Bayes, for a wide class of likelihood functions, for both regression and classification. It comes with a large algebra of covariance and mean functions allowing for flexible modeling. The code is fully compatible with Octave 3.2.x. A JMLR paper describes the toolbox.
c++-ivm | Sparse approximations based on the Informative Vector Machine | Neil D. Lawrence | C++
IVM software in C++; also includes the null category noise model for semi-supervised learning.
BFD | Bayesian Fisher’s Discriminant software | Tonatiuh Peña Centeno | matlab
Implements a Gaussian process interpretation of Kernel Fisher’s discriminant.
gpor | Gaussian Processes for Ordinal Regression | Wei Chu | C for linux/unix
Software implementation of Gaussian Processes for Ordinal Regression. Provides Laplace Approximation, Expectation Propagation and Variational Lower Bound.
MCMCstuff | MCMC Methods for MLP and GP and Stuff | Aki Vehtari | matlab and C
A collection of matlab functions for Bayesian inference with Markov chain Monte Carlo (MCMC) methods. The purpose of this toolbox was to port some of the features in fbm to matlab for easier development for matlab users.
ogp | Sparse Online Gaussian Processes | Lehel Csató | matlab and NETLAB
Approximate online learning in sparse Gaussian process models for regression (including several non-Gaussian likelihood functions) and classification.
sogp | Sparse Online Gaussian Process C++ Library | Dan Grollman | C++
Sparse online Gaussian process C++ library based on the PhD thesis of Lehel Csató.
spgp (.tgz or .zip) | Sparse Pseudo-input Gaussian Processes | Ed Snelson | matlab
Implements sparse GP regression as described in Sparse Gaussian Processes using Pseudo-inputs and Flexible and efficient Gaussian process models for machine learning. The SPGP uses gradient-based marginal likelihood optimization to find suitable basis points and kernel hyperparameters in a single joint optimization.
tgp | Treed Gaussian Processes | Robert B. Gramacy | C/C++ for R
Bayesian nonparametric and nonstationary regression by treed Gaussian processes with jumps to the limiting linear model (LLM). Special cases also implemented include Bayesian linear models, linear CART, and stationary separable and isotropic Gaussian process regression. Includes 1-d and 2-d plotting functions (with higher-dimension projection and slice capabilities) and tree drawing, designed for visualization of tgp-class output. See also Gramacy 2007.
Tpros | Gaussian Process Regression | David MacKay and Mark Gibbs | C
Tpros is the Gaussian Process program written by Mark Gibbs and David MacKay.
GP Demo | Octave demonstration of Gaussian process interpolation | David MacKay | octave
This demo works with octave-2.0 but did not work with octave-2.1.33.
GPClass | Matlab code for Gaussian Process Classification | David Barber and C. K. I. Williams | matlab
Implements Laplace’s approximation as described in Bayesian Classification with Gaussian Processes for binary and multiclass classification.
VBGP | Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors | Mark Girolami and Simon Rogers | matlab
Implements a variational approximation for Gaussian Process based multiclass classification as described in the paper Variational Bayesian Multinomial Probit Regression.
pyGPs | Gaussian Processes for Regression and Classification | Marion Neumann | Python
pyGPs is a library containing an object-oriented python implementation for Gaussian Process (GP) regression and classification. github
gaussian-process | Gaussian process regression | Anand Patil | Python
under development
gptk | Gaussian Process Tool-Kit | Alfredo Kalaitzis | R
The gptk package implements a general-purpose toolkit for Gaussian process regression with an RBF covariance function. Based on a MATLAB implementation written by Neil D. Lawrence.
Other software that may be useful for implementing Gaussian process models:
This web site aims to provide an overview of resources concerned with probabilistic modeling, inference and learning based on Gaussian processes. Although Gaussian processes have a long history in the field of statistics, they have been employed extensively only in niche areas. With the advent of kernel machines in the machine learning community, models based on Gaussian processes have become commonplace for problems of regression (kriging) and classification, as well as a number of more specialized applications.
Tutorials
Several papers provide tutorial material suitable for a first introduction to learning in Gaussian process models. These range from the very short [Williams 2002], over intermediate [MacKay 1998], [Williams 1999], to the more elaborate [Rasmussen and Williams 2006]. All of these require only a minimum of prerequisites in the form of elementary probability theory and linear algebra.
Regression
The simplest uses of Gaussian process models are for (the conjugate case of) regression with Gaussian noise. See the approximation section for papers which deal specifically with sparse or fast approximation techniques. O’Hagan 1978 represents an early reference from the statistics community for the use of a Gaussian process as a prior over functions, an idea which was only introduced to the machine learning community by Williams and Rasmussen 1996.
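To make the conjugate case concrete, here is a minimal numpy sketch of exact GP regression in the standard Cholesky formulation; the rbf covariance and the lengthscale, variance and noise_var values are illustrative assumptions, not code from any package listed above.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance between two sets of inputs (n x d arrays).
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_regression(X, y, Xstar, noise_var=0.1):
    # Exact GP posterior for regression with Gaussian noise (the conjugate case).
    K = rbf(X, X) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)                       # O(n^3) factorization
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xstar)
    mean = Ks.T @ alpha                             # posterior mean at Xstar
    v = np.linalg.solve(L, Ks)
    cov = rbf(Xstar, Xstar) - v.T @ v               # posterior covariance
    return mean, cov

# Toy usage: noisy observations of a sine function.
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel() + 0.1 * np.random.randn(20)
mu, cov = gp_regression(X, y, np.linspace(0, 5, 100)[:, None])
```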
Classification
Exact inference in Gaussian process models for classification is not tractable. Several approximation schemes have been suggested, including Laplace’s method, variational approximations, mean field methods, Markov chain Monte Carlo and Expectation Propagation; cf. the approximation section. Multi-class classification may be treated explicitly, or decomposed into multiple, binary (one against the rest) problems. For introductions, see for instance Williams and Barber 1998 or Kuss and Rasmussen 2005. Bounds from the PAC-Bayesian perspective are applied in Seeger 2002.
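As a sketch of one such scheme, the following numpy code performs the Newton iteration at the heart of Laplace’s method for binary GP classification with a logistic likelihood; the fixed iteration count and function names are illustrative assumptions, and K is any covariance matrix over the training inputs (for instance from the rbf sketch above).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, y, n_iter=20):
    # Newton iterations for the posterior mode of the latent function in
    # binary GP classification with a logistic likelihood; y is in {-1, +1}.
    n = len(y)
    f = np.zeros(n)
    for _ in range(n_iter):
        pi = sigmoid(f)
        grad = (y + 1) / 2 - pi               # gradient of log p(y|f)
        W = pi * (1 - pi)                     # negative Hessian (diagonal)
        sW = np.sqrt(W)
        B = np.eye(n) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + grad
        c = np.linalg.solve(L, sW * (K @ b))
        f = K @ (b - sW * np.linalg.solve(L.T, c))
    return f, sigmoid(f)                      # mode and fitted probabilities
```

A Gaussian centered at this mode, with the matching Hessian, then serves as the approximate posterior from which predictive class probabilities follow.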
Covariance Functions and Properties of Gaussian Processes
The properties of Gaussian processes are controlled by the (mean function and) covariance function. Some references here describe different covariance functions, while others give mathematical characterizations; see e.g. Abrahamsen 1997 for a review. Some references describe non-standard covariance functions leading to non-stationarity etc.
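For illustration, this numpy sketch draws prior samples under two common covariance functions, showing how the covariance choice controls properties of the samples such as smoothness; the lengthscales and the jitter added for numerical stability are arbitrary assumptions.

```python
import numpy as np

def se_cov(r, ell=1.0):
    # Squared-exponential: very smooth (infinitely differentiable) samples.
    return np.exp(-0.5 * (r / ell)**2)

def matern32_cov(r, ell=1.0):
    # Matern 3/2: rougher, only once-differentiable samples.
    a = np.sqrt(3.0) * np.abs(r) / ell
    return (1.0 + a) * np.exp(-a)

x = np.linspace(0, 10, 200)
r = x[:, None] - x[None, :]                    # matrix of input differences
for cov in (se_cov, matern32_cov):
    K = cov(r) + 1e-6 * np.eye(len(x))         # jitter for the factorization
    samples = np.linalg.cholesky(K) @ np.random.randn(len(x), 3)
```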
Model Selection
Approximations
There are two main reasons for doing approximations in Gaussian process models: either because of analytic intractability, as arises in classification and in regression with non-Gaussian noise, or in order to gain a computational advantage when using large datasets, by means of sparse approximations. Some methods address both issues simultaneously. The approximation methods and approximate inference algorithms are quite diverse; see Quiñonero-Candela and Rasmussen 2005 for a unifying framework for sparse approximations in the Gaussian regression model.
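To convey the flavour of such sparse schemes, here is a sketch of the Subset-of-Regressors predictive mean from that framework, which costs O(nm^2) for m inducing inputs instead of O(n^3); sor_predict is a hypothetical helper, and kern is assumed to be a covariance function such as the rbf sketch in the Regression section.

```python
import numpy as np

def sor_predict(X, y, Xu, Xstar, kern, noise_var=0.1):
    # Subset-of-Regressors predictive mean with m inducing inputs Xu:
    # mean = K(*,u) (sigma^2 K(u,u) + K(u,f) K(f,u))^{-1} K(u,f) y.
    Kuu = kern(Xu, Xu) + 1e-8 * np.eye(len(Xu))  # jitter for stability
    Kuf = kern(Xu, X)
    A = noise_var * Kuu + Kuf @ Kuf.T            # only an m x m system
    return kern(Xstar, Xu) @ np.linalg.solve(A, Kuf @ y)

# Toy usage, e.g. taking every tenth training input as an inducing input:
# mu = sor_predict(X, y, X[::10], Xstar, kern=rbf)
```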
References from the Statistics Community
Gaussian processes have a long history in the statistics community. They have been particularly well developed in geostatistics under the name of kriging. The papers are grouped separately because they are written using a common terminology and have a slightly different focus from typical machine learning papers.
Consistency, Learning Curves and Bounds
The papers in this section give theoretical results on learning curves, which describe the expected generalization performance as a function of the number of training cases. Consistency addresses the question of whether the solution approaches the true data generating process in the limit of infinitely many training examples.
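As a toy illustration (not taken from any of the referenced papers), an empirical learning curve can be estimated by averaging test error over random training sets of increasing size; this reuses the gp_regression sketch from the Regression section, and the sine-plus-noise data generating process is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
Xtest = np.linspace(0, 5, 200)[:, None]
ftest = np.sin(Xtest).ravel()                  # the true underlying function
for n in (5, 10, 20, 40, 80):                  # number of training cases
    errs = []
    for _ in range(20):                        # average over random datasets
        X = rng.uniform(0, 5, (n, 1))
        y = np.sin(X).ravel() + 0.1 * rng.standard_normal(n)
        mu, _ = gp_regression(X, y, Xtest)     # exact GP fit from the sketch above
        errs.append(np.mean((mu - ftest) ** 2))
    print(n, np.mean(errs))                    # error should shrink as n grows
```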