
Large-Scale Gaussian Process Classification with Flexible Adaptive Histogram Kernels

Erik Rodner, Alexander Freytag, Paul Bodesheim, and Joachim Denzler

 

[Rodner12:LGP]

Erik Rodner, Alexander Freytag, Paul Bodesheim, and Joachim Denzler. Large-Scale Gaussian Process Classification with Flexible Adaptive Histogram Kernels. European Conference on Computer Vision (ECCV), 2012, pp. 85--98. [pdf] [bib] [supplementary material] [poster]
Corresponding project: Large-scale Gaussian Processes with Flexible Adaptive Histogram Kernels


Problem Statement

In their general formulation, Gaussian processes suffer from several drawbacks:

  • training as well as hyperparameter optimization scales cubically in the number of training examples
  • evaluation scales linearly in the number of training examples
  • computing the classification uncertainty scales quadratically in the number of training examples
  • the memory demand is quadratic due to the kernel matrix
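
These costs can be made concrete with a small sketch of exact GP inference with the histogram intersection kernel. This is an illustration only, not the authors' implementation; the function names and the noise setting are assumptions made for the example:

```python
import numpy as np

def hik(X, Z):
    """Histogram intersection kernel: k(x, z) = sum_d min(x_d, z_d)."""
    # Forming the full kernel matrix costs O(n * m * D) time and memory.
    return np.minimum(X[:, None, :], Z[None, :, :]).sum(axis=2)

def gp_train(X, y, noise=0.1):
    K = hik(X, X)                                        # O(n^2) memory
    L = np.linalg.cholesky(K + noise * np.eye(len(X)))   # O(n^3) time
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # O(n^2) time
    return alpha, L

def gp_predict(Xtr, alpha, L, x_new, noise=0.1):
    k_star = hik(x_new[None, :], Xtr).ravel()  # O(n * D)
    mean = k_star @ alpha                      # O(n): linear in training size
    v = np.linalg.solve(L, k_star)             # O(n^2): uncertainty estimate
    var = hik(x_new[None, :], x_new[None, :])[0, 0] - v @ v + noise
    return mean, var
```

The Cholesky factorization is the cubic bottleneck, and it has to be repeated for every hyperparameter setting tried during optimization.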


Summary

We present how to perform exact large-scale multi-class Gaussian process classification with parameterized histogram intersection kernels. In contrast to previous approaches, we use a full Bayesian model without any sparse approximation techniques, which allows for learning in sub-quadratic and classification in constant time. To handle the additional model flexibility induced by parameterized kernels, our approach is able to optimize the parameters with large-scale training data. A key ingredient of this optimization is a new efficient upper bound of the negative Gaussian process log-likelihood. Experiments on image categorization tasks show large performance gains with flexible kernels, with learning within a few minutes and classification in microseconds on databases where exact Gaussian process inference was not possible before.
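
A key structural property behind sub-quadratic learning is that, for the histogram intersection kernel, a kernel matrix-vector product can be computed without ever forming the kernel matrix: sort the feature values per dimension once and use cumulative sums. The following is a minimal sketch of this idea under our own naming, not the paper's generalized implementation for parameterized kernels:

```python
import numpy as np

def hik_matvec(X, alpha):
    """Compute K @ alpha for K[i, j] = sum_d min(X[i, d], X[j, d])
    in O(n * D * log n) instead of the O(n^2 * D) of an explicit
    kernel matrix, exploiting sorted features and cumulative sums."""
    n, D = X.shape
    result = np.zeros(n)
    for d in range(D):
        v = X[:, d]
        order = np.argsort(v)
        vs = v[order]                      # sorted feature values
        a = alpha[order]                   # weights in the same order
        A = np.cumsum(vs * a)              # A[k] = sum_{i<=k} v_i * alpha_i
        B = np.cumsum(a[::-1])[::-1]       # B[k] = sum_{i>=k} alpha_i
        # For query value q: contributions split into v_i <= q (min = v_i)
        # and v_i > q (min = q); both are prefix/suffix sums.
        pos = np.searchsorted(vs, v, side='right')
        lower = np.where(pos > 0, A[pos - 1], 0.0)
        upper = np.where(pos < n, B[np.minimum(pos, n - 1)], 0.0) * v
        result += lower + upper
    return result
```

Such products are what make iterative linear solvers and likelihood bounds affordable at large scale, since no quadratic-size kernel matrix ever has to be stored.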


General Approach for Classification and Hyperparameter Optimization: Overview

[Figure: overview of the general approach for classification and hyperparameter optimization (overviewHIKGP.png)]



Resulting Benefits (runtimes, memory demand)

[Figure: runtimes and memory demand of Gaussian process classification with histogram intersection kernels]



Downloads


Back to main project page.