26/08/2020

A Linear-time Independence Criterion Based on a Finite Basis Approximation

Longfei Yan, W. Bastiaan Kleijn, Thushara Abhayapala

Keywords:

Abstract: Detection of statistical dependence between random variables is an essential component in many machine learning algorithms. We propose a novel independence criterion for two random variables with linear-time complexity. We establish that our independence criterion is an upper bound on the Hirschfeld-Gebelein-Rényi maximum correlation coefficient between the tested variables. A finite set of basis functions is employed to approximate the mapping functions that can achieve the maximal correlation. Using classic benchmark experiments based on independent component analysis, we demonstrate that our independence criterion performs comparably to state-of-the-art quadratic-time kernel dependence measures such as the Hilbert-Schmidt Independence Criterion, while being computationally more efficient. The experimental results also show that our independence criterion outperforms another contemporary linear-time kernel dependence measure, the Finite Set Independence Criterion. The potential application of our criterion in deep neural networks is validated experimentally.
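The abstract describes approximating the Hirschfeld-Gebelein-Rényi maximum correlation over a finite set of basis functions, which yields a statistic whose cost grows linearly with the sample size. The sketch below is only an illustration of that general idea, not the authors' construction: it assumes random Fourier features as the finite basis and scores dependence by the largest canonical correlation between the two feature maps; the function names, the choice of basis, bandwidth, and regularisation are all illustrative.

import numpy as np


def random_fourier_features(z, n_basis=50, bandwidth=1.0, seed=0):
    # Map samples z of shape (n, d) onto a finite basis of n_basis cosine features.
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=1.0 / bandwidth, size=(z.shape[1], n_basis))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_basis)
    return np.sqrt(2.0 / n_basis) * np.cos(z @ w + b)


def finite_basis_dependence(x, y, n_basis=50, eps=1e-6):
    # Largest canonical correlation between the two finite-basis feature maps.
    # Cost is O(n * n_basis^2), i.e. linear in the number of samples n.
    phi = random_fourier_features(x, n_basis, seed=0)
    psi = random_fourier_features(y, n_basis, seed=1)
    phi -= phi.mean(axis=0)
    psi -= psi.mean(axis=0)
    n = x.shape[0]
    c_xx = phi.T @ phi / n + eps * np.eye(n_basis)  # regularised covariances
    c_yy = psi.T @ psi / n + eps * np.eye(n_basis)
    c_xy = phi.T @ psi / n
    l_x = np.linalg.cholesky(c_xx)
    l_y = np.linalg.cholesky(c_yy)
    m = np.linalg.solve(l_x, c_xy) @ np.linalg.inv(l_y).T
    return np.linalg.svd(m, compute_uv=False)[0]


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    x = rng.normal(size=(2000, 1))
    y_dep = np.sin(3.0 * x) + 0.1 * rng.normal(size=(2000, 1))  # dependent pair
    y_ind = rng.normal(size=(2000, 1))                          # independent pair
    print(finite_basis_dependence(x, y_dep))  # high score (strong dependence)
    print(finite_basis_dependence(x, y_ind))  # noticeably lower score

In this sketch the whitened cross-correlation matrix has fixed size n_basis by n_basis, so only the feature computation scales with the number of samples, which is what makes a finite-basis approximation attractive compared with quadratic-time kernel measures.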

The talk and the paper were presented at the AISTATS 2020 virtual conference.

