Infinite Mixture of Global Gaussian Processes
18 June 2015
In this paper, we propose a simple and powerful approach to solving nonlinear regression problems using an infinite mixture of global Gaussian processes (IMoGGP). Our method handles arbitrary output distributions, nonstationary signals, heteroscedastic noise, and multimodal predictive distributions in a straightforward manner, without requiring the modeler to know these properties a priori. The IMoGGP can be interpreted as a mixture of experts in which the experts are not local but instead cooperate across the whole input space to provide accurate regression estimates. It can also be framed as a Dependent Dirichlet Process applied to discriminative tasks. Simulations show that our method gives results comparable to state-of-the-art approaches, and its simplicity makes it attractive to practitioners who are not machine learning experts and do not want to try many different models to determine which one fits their data best.
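To make the mixture-of-global-experts view concrete, the following is a minimal, assumed sketch rather than the paper's Dirichlet-process inference: two global GP experts, each fit to the points assigned to it anywhere in the input domain, whose predictions are combined as a mixture that can be multimodal and heteroscedastic. The latent assignments z are taken as known here for simplicity, whereas IMoGGP infers them; scikit-learn is used purely for illustration.

```python
# Illustrative sketch only (assumptions: 2 experts, known assignments z,
# RBF + white-noise kernels); IMoGGP itself infers the number of experts
# and the assignments with a Dirichlet process.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 10, size=(n, 1))
z = rng.integers(0, 2, size=n)                          # latent expert assignment
y = np.sin(X[:, 0]) + 1.5 * z + 0.1 * rng.standard_normal(n)  # two output modes

# Fit one GP per assignment; each expert spans the whole input range ("global").
experts, weights = [], []
for k in range(2):
    gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.01), normalize_y=True)
    gp.fit(X[z == k], y[z == k])
    experts.append(gp)
    weights.append(np.mean(z == k))                      # mixing proportions

# The predictive distribution at a new input is a mixture over the experts,
# so it can be multimodal with input-dependent uncertainty.
x_star = np.array([[5.0]])
for w, gp in zip(weights, experts):
    m, s = gp.predict(x_star, return_std=True)
    print(f"weight={w:.2f}  mean={m[0]:.3f}  std={s[0]:.3f}")
```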