22/11/2021

Training Better Deep Neural Networks with Uncertainty Mining Net

Yang Sun, Abhishek Kolagunda, Steven Eliuk, Xiaolong Wang

Keywords: label noise, label uncertainty, learning with noisy labels

Abstract: In this work, we consider the problem of training deep neural networks on partially labeled data with label noise, i.e., semi-supervised training of deep neural networks with noisily labeled data. To the best of our knowledge, this setting has received little attention. We present a novel end-to-end deep generative framework, Uncertainty Mining Net (UMN), for improving classifier performance under such data challenges. We utilize all the available data (labeled and unlabeled) to train the classifier via a semi-supervised generative framework. During training, UMN estimates the uncertainty of the labels so that learning focuses on clean data. More precisely, UMN applies a novel sample-wise label uncertainty estimation scheme. Extensive experiments and comparisons against state-of-the-art methods on several popular benchmark datasets demonstrate that UMN can reduce the impact of label noise and significantly improve classifier performance.
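
The paper's exact uncertainty estimation scheme is part of the UMN framework and is described in the paper itself; the sketch below only illustrates the general idea of sample-wise uncertainty weighting in PyTorch. The helper name `uncertainty_weighted_ce` and the assumption that a per-sample uncertainty score comes from an auxiliary head are hypothetical, not the authors' implementation: samples whose labels look noisy (high uncertainty) are down-weighted in the supervised loss.

```python
# Minimal sketch (NOT the authors' UMN method): per-sample uncertainty-weighted
# cross-entropy, assuming the model also outputs a label-uncertainty score
# u in [0, 1] for each example (e.g. from a hypothetical auxiliary head).
import torch
import torch.nn.functional as F


def uncertainty_weighted_ce(logits, targets, uncertainty):
    """logits: (N, C), targets: (N,), uncertainty: (N,) in [0, 1]."""
    per_sample_ce = F.cross_entropy(logits, targets, reduction="none")  # (N,)
    weights = 1.0 - uncertainty                  # trust = 1 - uncertainty
    weights = weights / (weights.sum() + 1e-8)   # normalise over the batch
    return (weights * per_sample_ce).sum()


if __name__ == "__main__":
    # Toy usage with random tensors (hypothetical shapes).
    logits = torch.randn(8, 10)
    targets = torch.randint(0, 10, (8,))
    uncertainty = torch.rand(8)                  # stand-in for an estimated score
    loss = uncertainty_weighted_ce(logits, targets, uncertainty)
    print(loss.item())
```

Normalising the weights over the batch keeps the loss scale roughly constant regardless of how many samples are judged noisy; in a semi-supervised setup like the one the abstract describes, unlabeled data would be handled by the generative branch rather than by this supervised term.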
