12/07/2020

Associative Memory in Iterated Overparameterized Sigmoid Autoencoders

Yibo Jiang, Cengiz Pehlevan

Keywords: Deep Learning - Generative Models and Autoencoders

Abstract: Recent work suggests that overparameterized autoencoders can be trained to implement associative memory via iterative maps. This phenomenon occurs when the converged network's input-output Jacobian has all eigenvalue norms strictly below one. In this work, we theoretically analyze this behavior for sigmoid networks by leveraging recent developments in deep learning theory, especially the Neural Tangent Kernel (NTK) theory. We find that, under certain conditions, overparameterized sigmoid autoencoders can have attractors in the NTK limit, both when trained on a single example and on multiple examples. In particular, for multiple training examples, we find that the norm of the largest Jacobian eigenvalue drops below one with increasing input norm, leading to associative memory.
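A minimal sketch of the mechanism the abstract describes: a trained autoencoder is iterated as a map, and a stored example is an attractor when the spectral radius of the input-output Jacobian at that example is strictly below one. The architecture, dimensions, and random (untrained) parameters below are illustrative assumptions, not the authors' exact setup.

import jax
import jax.numpy as jnp

def init_params(key, dim=16, hidden=512):
    # Overparameterized single-hidden-layer sigmoid autoencoder (hidden >> dim).
    k1, k2 = jax.random.split(key)
    W1 = jax.random.normal(k1, (hidden, dim)) / jnp.sqrt(dim)
    W2 = jax.random.normal(k2, (dim, hidden)) / jnp.sqrt(hidden)
    return W1, W2

def autoencoder(params, x):
    W1, W2 = params
    return W2 @ jax.nn.sigmoid(W1 @ x)

def spectral_radius_at(params, x):
    # Input-output Jacobian at an example x; x is an attractor of the
    # iterated map when this value is strictly below one.
    J = jax.jacobian(lambda z: autoencoder(params, z))(x)
    return jnp.max(jnp.abs(jnp.linalg.eigvals(J)))

def retrieve(params, x, steps=50):
    # Associative-memory retrieval: iterate the trained network so a
    # corrupted input flows toward the nearby stored example.
    for _ in range(steps):
        x = autoencoder(params, x)
    return x

In the paper's setting the parameters would come from training to convergence (e.g. in the NTK regime) rather than random initialization; the functions above only illustrate the attractor criterion and the iterative retrieval procedure.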
