Abstract:
Annotating a large-scale facial expression dataset with high quality is extremely difficult due to the uncertainties caused by ambiguous facial expressions, low-quality facial images, and the subjectiveness of annotators. These uncertainties hinder the progress of large-scale Facial Expression Recognition (FER) in the data-driven deep learning era. To address this problem, this paper proposes to suppress the uncertainties with a simple yet efficient Self-Cure Network (SCN). Specifically, SCN suppresses the uncertainty from two different aspects: 1) a self-attention mechanism over the FER dataset that weights each training sample with a ranking regularization, and 2) a careful relabeling mechanism that modifies the labels of the samples in the lowest-ranked group. Experiments on synthetic FER datasets and our collected WebEmotion dataset validate the effectiveness of our method. Results on public benchmarks demonstrate that our SCN outperforms current state-of-the-art methods with \textbf{88.14}\% on RAF-DB, \textbf{60.23}\% on AffectNet, and \textbf{89.35}\% on FERPlus.
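For concreteness, below is a minimal PyTorch-style sketch of the two mechanisms summarized above (importance weighting with a ranking regularization, and relabeling of low-ranked samples). The module name \texttt{SelfCureHead} and the hyperparameters \texttt{beta}, \texttt{margin\_rr}, and \texttt{margin\_relabel} are illustrative assumptions rather than the released implementation; the sketch only indicates the overall structure.

\begin{verbatim}
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfCureHead(nn.Module):
    """Hypothetical sketch of SCN-style uncertainty suppression.

    Assumes a backbone has already produced per-sample features and
    that the batch size is larger than one.
    """

    def __init__(self, feat_dim, num_classes,
                 beta=0.7, margin_rr=0.15, margin_relabel=0.2):
        super().__init__()
        # Importance branch: one scalar weight in (0, 1) per sample.
        self.importance = nn.Sequential(nn.Linear(feat_dim, 1), nn.Sigmoid())
        self.classifier = nn.Linear(feat_dim, num_classes)
        self.beta = beta                      # fraction of samples in the high group
        self.margin_rr = margin_rr            # ranking-regularization margin (assumed value)
        self.margin_relabel = margin_relabel  # relabeling margin (assumed value)

    def forward(self, feats, labels):
        alpha = self.importance(feats).squeeze(1)   # (B,) importance weights
        logits = self.classifier(feats)             # (B, C)

        # Weighted cross-entropy: scale logits by importance before softmax.
        wce = F.cross_entropy(alpha.unsqueeze(1) * logits, labels)

        # Ranking regularization: mean importance of the high group should
        # exceed that of the low group by at least a margin.
        b = feats.size(0)
        k = max(1, int(self.beta * b))
        sorted_alpha, idx = torch.sort(alpha, descending=True)
        rr_loss = F.relu(self.margin_rr
                         - (sorted_alpha[:k].mean() - sorted_alpha[k:].mean()))

        # Relabeling of low-ranked samples: if the most confident prediction
        # beats the given label's probability by a margin, switch the label.
        with torch.no_grad():
            probs = F.softmax(logits, dim=1)
            max_prob, pred = probs.max(dim=1)
            given_prob = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
            low_idx = idx[k:]
            swap = (max_prob[low_idx] - given_prob[low_idx]) > self.margin_relabel
            new_labels = labels.clone()
            new_labels[low_idx[swap]] = pred[low_idx[swap]]

        return wce + rr_loss, new_labels
\end{verbatim}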