Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp0147429c76s
Title: Large Linear Multi-output Gaussian Process Learning
Authors: Feinberg, Vladimir
Advisors: Li, Kai
Department: Computer Science
Certificate Program: Center for Statistics and Machine Learning
Class Year: 2017
Abstract: Gaussian process (GP) models, which put a distribution over arbitrary functions in a continuous domain, can be generalized to the multi-output case; a common way of doing this is to use a linear model of coregionalization. Such models can learn correlations across the multiple outputs, which can then be exploited to share knowledge among them. For instance, temperature data from disparate regions over time can contribute to a predictive weather model that is more accurate than the same model applied to a single region. While model learning can be performed efficiently for single-output GPs, the multi-output case still requires approximations for large numbers of observations across all outputs. In this work, we propose a new method, Large Linear GP (LLGP), which estimates covariance hyperparameters for multi-dimensional outputs and one-dimensional inputs. Our approach learns GP kernel hyperparameters at an asymptotically faster rate than the current state of the art. When applied to real time series data, we find this theoretical improvement is realized, with LLGP generally an order of magnitude faster while improving or maintaining predictive accuracy. Finally, we discuss extensions of our approach to multidimensional inputs.
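The linear model of coregionalization mentioned in the abstract builds a multi-output covariance as a sum of Kronecker products between per-latent-process coregionalization matrices B_q and single-output kernels k_q. The following is a minimal illustrative sketch of that construction (not the thesis's LLGP implementation); the function names and the choice of rank-one B_q and squared-exponential kernels are assumptions for the example.

```python
import numpy as np

def rbf(x, xp, length_scale=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    diff = x[:, None] - xp[None, :]
    return np.exp(-0.5 * (diff / length_scale) ** 2)

def lmc_covariance(x, coreg_vectors, length_scales):
    """Full (N*D) x (N*D) LMC covariance over shared 1-D inputs.

    K((x, d), (x', d')) = sum_q B_q[d, d'] * k_q(x, x'),
    with B_q = a_q a_q^T a rank-one PSD coregionalization matrix
    (a_q is a hypothetical per-latent-process mixing vector).
    """
    K = 0.0
    for a_q, ell in zip(coreg_vectors, length_scales):
        B_q = np.outer(a_q, a_q)  # D x D, PSD by construction
        K = K + np.kron(B_q, rbf(x, x, ell))
    return K

# Two outputs, two latent processes, five shared input locations.
x = np.linspace(0.0, 1.0, 5)
a = [np.array([1.0, 0.5]), np.array([0.2, -1.0])]
K = lmc_covariance(x, a, [0.3, 1.0])
# K is 10 x 10, symmetric, and PSD, so a jittered Cholesky succeeds.
L = np.linalg.cholesky(K + 1e-8 * np.eye(10))
```

Because each B_q and each k_q Gram matrix is positive semi-definite, their Kronecker products and sum are as well, which is what licenses the Cholesky factorization used in exact GP inference; LLGP's contribution is avoiding the cubic cost of operating on this full matrix directly.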
URI: http://arks.princeton.edu/ark:/88435/dsp0147429c76s
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1987-2023

Files in This Item:
File: feinberg_vladimir.pdf | Size: 875.83 kB | Format: Adobe PDF


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.