Conceptron: a probabilistic deep one-class classification method

29 Sep 2021  ·  Erika Gardini, Andrea Cavalli, Sergio Decherchi

One-class learning through deep architectures is a particularly challenging task; in this scenario, the fusion of kernel methods and deep networks can be a viable strategy to strengthen already effective methods. In this contribution we present Conceptron, a probabilistic and deep one-class classification method. The proposed algorithm hybridizes the Nyström version of the Import Vector Domain Description (IVDD) with deep learning layers, making the approach highly scalable (via batch stochastic gradient optimization) and allowing it to automatically learn the underlying feature space. Furthermore, we modify the cost function so that the sample probabilities follow a Laplace distribution. Experiments on MNIST, CIFAR-10, and other benchmark datasets show that Conceptron (and its variations) performs comparably to or better than competing state-of-the-art methods, with the additional capabilities of providing probabilities (through a logistic model) and avoiding any degeneracy during training.
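As a rough illustration of the architecture described in the abstract, the sketch below (PyTorch) combines a deep feature extractor, a Nyström-style kernel approximation with learned landmarks, and a logistic head that outputs a per-sample membership probability, trained with batch stochastic gradients. All class names, layer sizes, the RBF landmark map, and the surrogate loss are assumptions made for illustration, not the authors' exact formulation.

```python
# Hypothetical sketch of a deep one-class model in the spirit of the abstract.
# The architecture and loss are illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn

class DeepOneClassSketch(nn.Module):
    def __init__(self, in_dim=784, feat_dim=64, n_landmarks=32, gamma=0.1):
        super().__init__()
        # Deep feature extractor, learned end-to-end.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim), nn.ReLU(),
        )
        # Landmark points for a Nystrom-style RBF feature map (learned here).
        self.landmarks = nn.Parameter(torch.randn(n_landmarks, feat_dim))
        self.gamma = gamma
        # Logistic model mapping kernel features to a membership probability.
        self.logit = nn.Linear(n_landmarks, 1)

    def forward(self, x):
        z = self.encoder(x)                              # deep features
        d2 = torch.cdist(z, self.landmarks).pow(2)       # squared distances to landmarks
        k = torch.exp(-self.gamma * d2)                  # RBF similarities (Nystrom-like map)
        return torch.sigmoid(self.logit(k)).squeeze(-1)  # probability of class membership

# Toy usage: one-class training pushes the probabilities of the (assumed normal)
# training samples toward 1 via batch stochastic gradient optimization.
model = DeepOneClassSketch()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(128, 784)                 # stand-in for normal-class data
for _ in range(10):
    p = model(x)
    loss = -torch.log(p + 1e-8).mean()    # simple surrogate, not the paper's modified cost
    opt.zero_grad()
    loss.backward()
    opt.step()
```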
