Divergence-Based One-Class Classification Using Gaussian Processes

Paul Bodesheim, Erik Rodner, Alexander Freytag, and Joachim Denzler


Summary

We present an information-theoretic framework for one-class classification that allows for deriving several new novelty scores. With these scores, we are able to rank samples according to their novelty and to detect outliers that do not belong to a learnt data distribution. The key idea of our approach is to measure the impact of a test sample on the previously learnt model. This is carried out in a probabilistic manner using the Jensen-Shannon divergence and reclassification results derived from Gaussian process classification. Our method is evaluated on well-known machine learning datasets as well as in large-scale image categorisation experiments, demonstrating its ability to achieve state-of-the-art performance.
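The idea of measuring a test sample's impact on the learnt model can be sketched in a few lines of Python. This is a minimal illustration, not the paper's exact formulation: it uses GP label regression with constant targets y = 1 as the one-class model, an RBF kernel with a hand-picked bandwidth, and normalises the predictive means into discrete distributions before comparing them with the Jensen-Shannon divergence; all of these specific choices are simplifying assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def gp_predict(X_train, X_query, noise=1e-2, gamma=1.0):
    """GP label regression with constant targets y = 1 (one-class setting):
    returns predictive means at the query points."""
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.solve(K + noise * np.eye(len(X_train)),
                            np.ones(len(X_train)))
    return rbf_kernel(X_query, X_train, gamma) @ alpha

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.clip(p, eps, None); p = p / p.sum()
    q = np.clip(q, eps, None); q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def novelty_score(X_train, x_test, noise=1e-2, gamma=1.0):
    """Impact of x_test on the model: compare reclassification results
    for all samples before and after adding the test sample."""
    X_aug = np.vstack([X_train, x_test[None, :]])
    p_before = gp_predict(X_train, X_aug, noise, gamma)  # model without x_test
    p_after = gp_predict(X_aug, X_aug, noise, gamma)     # model including x_test
    return js_divergence(p_before, p_after)

# Toy example: a sample close to the target class should change the
# model less than a far-away outlier, yielding a lower novelty score.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(40, 2))   # target class samples
inlier = np.array([0.1, -0.2])
outlier = np.array([6.0, 6.0])
print(novelty_score(X, inlier), novelty_score(X, outlier))
```

In this sketch the inlier barely shifts the reclassification results, so its divergence stays near zero, while the outlier produces a noticeably larger score; ranking test samples by this score gives the novelty ranking described above.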


General Approach: Overview

[Figure: overview of divergence-based one-class classification]

[Figure: decision functions of divergence-based one-class classification]



Publications

 

[Bodesheim12:DOC]

Paul Bodesheim, Erik Rodner, Alexander Lütz, and Joachim Denzler. Divergence-Based One-Class Classification Using Gaussian Processes. British Machine Vision Conference (BMVC), 2012, pp. 50.1--50.11. [bib] [pdf]