RBUE: A ReLU-Based Uncertainty Estimation Method of Deep Neural Networks

Deep neural networks (DNNs) have successfully learned useful data representations in various tasks. However, assessing the reliability of these representations remains a challenge. Deep Ensemble is widely regarded as the state-of-the-art method, producing high-quality uncertainty estimates, but it is expensive to train and test. MC-Dropout is a popular, less expensive alternative, but its predictions lack diversity. To obtain high-quality uncertainty estimates at lower cost, we introduce a ReLU-Based Uncertainty Estimation (RBUE) method. Instead of randomly dropping neurons as in MC-Dropout or relying on the randomness of the networks' initial weights as in Deep Ensemble, RBUE adds randomness to the activation function module, diversifying the outputs. Under this method, we propose two strategies, MC-DropReLU and MC-RReLU, to estimate uncertainty. We analyze and compare the output diversity of MC-Dropout and our method from a variance perspective and derive the relationship between each method's hyperparameters and its predictive diversity. Moreover, our method is simple to implement and does not require modifying the existing model. We experimentally validate RBUE on three widely used datasets: CIFAR10, CIFAR100, and TinyImageNet. The experiments demonstrate that our method achieves competitive performance while being more favorable in training time and memory requirements.
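The abstract does not give implementation details, but the MC-RReLU strategy maps naturally onto PyTorch's built-in nn.RReLU, which samples a random negative slope on each forward pass while in training mode. The sketch below is a minimal illustration of that idea, assuming a toy network, T stochastic passes, and PyTorch's default RReLU bounds; it is not the authors' code.

```python
# Hedged sketch of the MC-RReLU idea: keep the randomized ReLU activation
# stochastic at test time and average several forward passes, analogous to
# MC-Dropout. The architecture, T, and the RReLU bounds are illustrative
# assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class SmallNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 256),
            # nn.RReLU samples a random negative slope in [lower, upper]
            # on every forward pass while the module is in training mode.
            nn.RReLU(lower=1 / 8, upper=1 / 3),
            nn.Linear(256, num_classes),
        )

    def forward(self, x):
        return self.net(x)


@torch.no_grad()
def mc_rrelu_predict(model: nn.Module, x: torch.Tensor, T: int = 20):
    """Average T stochastic forward passes; the spread across passes
    serves as an uncertainty signal."""
    # train() keeps RReLU stochastic at inference time. In a model that
    # also contains dropout or batch norm, one would instead switch only
    # the RReLU modules to training mode.
    model.train()
    probs = torch.stack(
        [torch.softmax(model(x), dim=-1) for _ in range(T)]
    )
    mean = probs.mean(dim=0)  # predictive distribution
    var = probs.var(dim=0)    # per-class predictive variance
    return mean, var


model = SmallNet()
x = torch.randn(4, 3, 32, 32)  # e.g. a CIFAR10-sized batch
mean, var = mc_rrelu_predict(model, x)
print(mean.shape, var.shape)   # torch.Size([4, 10]) twice
```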
