22/11/2021

Adaptive Tensor Networks Decomposition

Chang Nie, Huan Wang, Le Tian

Keywords: Tensor networks, Adaptive decomposition, Tensor completion, Neural network compression.

Abstract: Tensor Decomposition (TD) is a powerful technique for solving high-dimensional optimization problems and has been widely used in machine learning and data science. Many TD models aim to establish a trade-off between computational complexity and representation ability. However, they face the problems of tensor rank selection and latent factor arrangement, and they neglect the internal correlations between different modes. In this paper, we propose a data-adaptive TD model built on a generalized tensor rank, named adaptive tensor network (ATN) decomposition, which constructs an optimal topological structure for TD according to the intrinsic properties of the data. We exploit the generalized tensor rank to measure the correlation between two modes of the data and establish a multi-linear connection between the corresponding latent factors with an adaptive rank. ATN possesses the merits of permutation invariance and strong robustness, and it represents high-order data with fewer parameters. We verify ATN's effectiveness and superiority on three typical tasks: tensor completion, image denoising, and neural network compression. Experimental results on synthetic data and real datasets demonstrate that the overall performance of ATN surpasses state-of-the-art TD methods.
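The paper's exact generalized tensor rank and the ATN construction are not reproduced here. As a hedged illustration of the underlying idea, the coupling between two modes of a tensor can be probed by the numerical rank of the matricization that groups those two modes against all remaining modes; the function name `mode_pair_rank` and the tolerance scheme below are assumptions for this sketch, not the authors' method.

```python
# Illustrative sketch only: approximates the idea of measuring the
# correlation between two modes of a tensor via the numerical rank of
# the unfolding that groups them against the remaining modes.
import numpy as np

def mode_pair_rank(tensor, i, j, tol=1e-10):
    """Estimate the coupling between modes i and j of `tensor` as the
    numerical rank of the (i, j)-vs-rest matricization. A higher rank
    suggests a stronger multi-linear connection, which an adaptive
    tensor network would model with a larger bond between the
    corresponding latent factors."""
    rest = [k for k in range(tensor.ndim) if k not in (i, j)]
    # Move modes i and j to the front, then flatten into a matrix.
    mat = np.transpose(tensor, [i, j] + rest).reshape(
        tensor.shape[i] * tensor.shape[j], -1)
    s = np.linalg.svd(mat, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# Example: a 4-way tensor whose modes (0, 1) and (2, 3) are coupled
# through a single shared bond of size 3.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 5, 3))   # factor for modes 0 and 1
b = rng.standard_normal((6, 7, 3))   # factor for modes 2 and 3
t = np.einsum('abr,cdr->abcd', a, b)
print(mode_pair_rank(t, 0, 1))       # → 3, the shared bond dimension
```

Grouping modes that share a latent bond yields a small rank, while splitting a bond across the two groups inflates it; an adaptive scheme could use such estimates to choose which latent factors to connect and with what rank.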

The talk and the respective paper are published at the BMVC 2021 virtual conference.
