19/10/2020

Robust normalized squares maximization for unsupervised domain adaptation

Wenju Zhang, Xiang Zhang, Qing Liao, Wenjing Yang, Long Lan, Zhigang Luo

Keywords: transfer learning, image classification, domain adaptation

Abstract: Unsupervised domain adaptation (UDA) attempts to transfer specific knowledge from one domain with labeled data to another domain without labels. Recently, the maximum squares loss has been proposed to tackle the UDA problem, but it does not consider prediction diversity, which has proven beneficial to UDA. In this paper, we propose a novel normalized squares maximization (NSM) loss in which the maximum squares term is normalized by the sum of squares of the class sizes. The normalization term enforces the class sizes of the predictions to be balanced, explicitly increasing diversity. Theoretical analysis shows that the optimal solution to NSM consists of one-hot vectors with balanced class sizes, i.e., NSM encourages both discriminative and diverse predictions. We further propose a robust variant of NSM, RNSM, which replaces the square loss with the L2,1-norm to reduce the influence of outliers and noise. Experiments on cross-domain image classification on two benchmark datasets illustrate the effectiveness of both NSM and RNSM. RNSM achieves promising performance compared to state-of-the-art methods. The code is available at https://github.com/wj-zhang/NSM.
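
To make the abstract's idea concrete, here is a minimal PyTorch sketch of the two losses as described above. This is reconstructed from the abstract alone: the function names `nsm_loss` and `rnsm_loss`, the exact placement of the L2,1-norm in the robust variant, and the epsilon stabilizer are all assumptions on my part; the authors' implementation in the linked repository is authoritative.

```python
import torch

def nsm_loss(probs, eps=1e-8):
    """Normalized squares maximization (sketch, per the abstract).

    probs: (n, C) softmax predictions on target-domain samples.
    Numerator: sum of squared probabilities (the maximum squares term,
    which pushes predictions toward confident one-hot vectors).
    Denominator: sum of squared class sizes, where the size of class c
    is the column sum of probs; it is smallest when class sizes are
    balanced, so dividing by it rewards diverse predictions.
    Negated so that minimizing the loss maximizes the ratio.
    """
    squares = probs.pow(2).sum()            # sum_i sum_c p_ic^2
    class_sizes = probs.sum(dim=0)          # s_c = sum_i p_ic
    norm = class_sizes.pow(2).sum() + eps   # sum_c s_c^2
    return -squares / norm

def rnsm_loss(probs, eps=1e-8):
    """Robust variant (sketch): replace the squared numerator with an
    L2,1-norm over samples, i.e. the sum of per-row L2 norms, which
    grows only linearly in each sample's contribution and thus damps
    outliers and noisy predictions. Whether the paper also modifies
    the denominator is not stated in the abstract; here it is kept."""
    row_l2 = probs.pow(2).sum(dim=1).clamp_min(eps).sqrt()  # ||p_i||_2 per sample
    class_sizes = probs.sum(dim=0)
    norm = class_sizes.pow(2).sum() + eps
    return -row_l2.sum() / norm

# Hypothetical usage on a fake target-domain batch:
probs = torch.softmax(torch.randn(32, 10), dim=1)
print(nsm_loss(probs).item(), rnsm_loss(probs).item())
```

In a typical UDA setup, a loss like this would be added to the supervised cross-entropy loss on the labeled source domain and minimized jointly over both domains.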

The video of this talk cannot be embedded. You can watch it here:
https://dl.acm.org/doi/10.1145/3340531.3412083#sec-supp