12/07/2020

Distance Metric Learning with Joint Representation Diversification

Xu Chu, Yang Lin, Xiting Wang, Xin Gao, Qi Tong, Hailong Yu, Yasha Wang

Keywords: Applications - Computer Vision

Abstract: Distance metric learning (DML) aims to learn a representation space equipped with a metric such that examples from the same class are closer than examples from different classes with respect to that metric. The recent success of deep neural networks has motivated many DML losses that encourage intra-class compactness and inter-class separability. However, overemphasizing intra-class compactness may cause the neural network to filter out information that helps discriminate examples from unseen classes, resulting in a less generalizable representation. In contrast, we propose not to penalize intra-class distances explicitly, and instead use a Joint Representation Similarity (JRS) regularizer that penalizes inter-class distributional similarities within a DML framework. The proposed JRS regularizer diversifies the joint distributions of representations from different classes across multiple neural layers, based on cross-covariance operators in a Reproducing Kernel Hilbert Space (RKHS). Experiments on three well-known benchmark datasets (CUB-200-2011, Cars-196, and Stanford Online Products) demonstrate the effectiveness of the proposed approach.
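The abstract's core device is measuring distributional similarity between sets of representations via cross-covariance operators in an RKHS. A standard empirical statistic of this kind is HSIC, the squared Hilbert-Schmidt norm of the cross-covariance operator. The sketch below computes an HSIC-style statistic between two sets of paired representations (e.g., the same examples at two network layers); this is a minimal illustration of the underlying RKHS machinery, not the authors' JRS regularizer, and the function names and Gaussian-kernel choice are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # RBF kernel matrix from pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    # Empirical HSIC: squared Hilbert-Schmidt norm of the empirical
    # cross-covariance operator between two paired sets of
    # representations X and Y (same number of rows n).
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative use: representations of a batch at two layers.
rng = np.random.default_rng(0)
layer1 = rng.normal(size=(16, 8))
layer2 = rng.normal(size=(16, 8))
score = hsic(layer1, layer2)
```

A diversification regularizer in this spirit would penalize similarity between such joint statistics computed for different classes, rather than penalizing intra-class distances directly.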

Talk and the respective paper are published at the ICML 2020 virtual conference.
