02/02/2021

Tackling Instance-Dependent Label Noise via a Universal Probabilistic Model

Qizhou Wang, Bo Han, Tongliang Liu, Gang Niu, Jian Yang, Chen Gong

Keywords:

Abstract: The drastic increase in data quantity often brings a severe decrease in data quality, such as incorrect label annotations. This poses a great challenge for robustly training Deep Neural Networks (DNNs). Existing methods for learning with label noise either employ ad-hoc heuristics or are restricted to specific noise assumptions. More general situations, such as instance-dependent label noise, have not been fully explored, as few studies have focused on their label-corruption process. By categorizing instances into confusing and unconfusing ones, this paper proposes a simple yet universal probabilistic model, which explicitly relates noisy labels to their instances. The resulting model can be realized by DNNs, and training is accomplished with a novel alternating optimization algorithm. Experiments on datasets with both synthetic and real-world label noise verify that the proposed method yields significant robustness improvements over state-of-the-art counterparts.
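The abstract mentions alternating optimization between a noise model and a classifier. As a rough illustration of that general idea (not the paper's actual model), the following is a minimal sketch: a logistic classifier trained on labels corrupted at an assumed flip rate `eps`, alternating between estimating the posterior probability that each observed label is clean and refitting the classifier against the resulting soft targets. All variable names, the symmetric noise model, and the flip rate are illustrative assumptions.

```python
# Hedged sketch of EM-style alternating optimization for noisy labels.
# NOT the paper's method; the noise model (single flip rate `eps`) is assumed.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data with 20% of labels flipped.
n = 400
X = rng.normal(size=(n, 2)) + np.where(rng.random(n) < 0.5, 1.5, -1.5)[:, None]
y_clean = (X[:, 0] + X[:, 1] > 0).astype(float)
flip = rng.random(n) < 0.2
y_noisy = np.where(flip, 1 - y_clean, y_clean)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
eps = 0.2  # assumed known flip rate, for illustration only
for _ in range(50):
    # Step 1: posterior probability that each observed label is clean,
    # under the current classifier and the flip-rate noise model.
    p1 = sigmoid(X @ w)                       # P(clean label = 1 | x)
    p_obs = np.where(y_noisy == 1, p1, 1 - p1)
    post_clean = (1 - eps) * p_obs / ((1 - eps) * p_obs + eps * (1 - p_obs))
    # Soft target for the clean label of each instance.
    target = np.where(y_noisy == 1, post_clean, 1 - post_clean)
    # Step 2: a few gradient steps on cross-entropy against the soft targets.
    for _ in range(20):
        grad = X.T @ (sigmoid(X @ w) - target) / n
        w -= 0.5 * grad

acc = np.mean((sigmoid(X @ w) > 0.5) == y_clean)
print(f"accuracy on clean labels: {acc:.2f}")
```

The two alternating steps mirror the E- and M-steps of EM; the paper's instance-dependent formulation would replace the single flip rate with a per-instance corruption model.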

The video of this talk cannot be embedded. You can watch it here:
https://slideslive.com/38948129
The talk and the respective paper were published at the AAAI 2021 virtual conference.

