12/07/2020

Disentangling Trainability and Generalization in Deep Neural Networks

Lechao Xiao, Jeffrey Pennington, Samuel Schoenholz

Keywords: Deep Learning - Theory

Abstract: A longstanding goal in the theory of deep learning is to characterize the conditions under which a given neural network architecture will be trainable, and if so, how well it might generalize to unseen data. In this work, we provide such a characterization in the limit of very wide and very deep networks, for which the analysis simplifies considerably. For wide networks, the trajectory under gradient descent is governed by the Neural Tangent Kernel (NTK), and for deep networks, the NTK itself maintains only weak data dependence. By analyzing the spectrum of the NTK, we formulate necessary conditions for trainability and generalization across a range of architectures, including Fully Connected Networks (FCNs) and Convolutional Neural Networks (CNNs). We identify large regions of hyperparameter space for which networks can memorize the training set but completely fail to generalize. We find that CNNs without global average pooling behave almost identically to FCNs, but that CNNs with pooling have markedly different and often better generalization performance. A thorough empirical investigation of these theoretical results shows excellent agreement on real datasets.
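The abstract conditions trainability and generalization on the spectrum of the NTK. As a rough illustration only (not the paper's code), the sketch below computes the empirical NTK of a small ReLU fully connected network in JAX and inspects its eigenvalues; the network, the helper names (`forward`, `empirical_ntk`), and the 1/sqrt(fan-in) scaling are illustrative assumptions.

```python
# Minimal sketch: empirical NTK of a small FCN and its eigenvalue spectrum.
# Assumptions (not from the paper): scalar-output ReLU MLP, no biases,
# weights rescaled by 1/sqrt(fan_in) at forward time.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, widths):
    """Standard-normal weight matrices for each layer."""
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, wkey = jax.random.split(key)
        params.append(jax.random.normal(wkey, (d_in, d_out)))
    return params

def forward(params, x):
    """Scalar-output MLP with 1/sqrt(fan_in) weight scaling."""
    h = x
    for i, w in enumerate(params):
        h = h @ w / jnp.sqrt(w.shape[0])
        if i < len(params) - 1:
            h = jax.nn.relu(h)
    return h.squeeze(-1)

def empirical_ntk(params, x):
    """NTK_ij = <df(x_i)/dtheta, df(x_j)/dtheta>, via the parameter Jacobian."""
    flat, unravel = ravel_pytree(params)
    jac = jax.jacobian(lambda p: forward(unravel(p), x))(flat)  # (n_points, n_params)
    return jac @ jac.T

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (16, 32))           # 16 inputs of dimension 32
params = init_params(key, [32, 512, 512, 1])   # a wide two-hidden-layer FCN
ntk = empirical_ntk(params, x)
eigs = jnp.linalg.eigvalsh(ntk)                # ascending eigenvalues
print("condition number:", eigs[-1] / eigs[0])
```

A poorly conditioned NTK (large ratio of largest to smallest eigenvalue) is the kind of spectral signature the paper relates to slow or failed training, while the structure of the remaining eigenvalues bears on generalization.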

This talk and the accompanying paper were published at the ICML 2020 virtual conference.
