22/11/2021

Feature Space Saturation during Training

Mats L. Richter, Justin C. Shenk, Wolf Byttner, Anna Wiedenroth, Mikael Huss

Keywords: deep learning, convolutional neural networks, PCA, XAI, explainable AI, neural architecture, classification

Abstract: We propose layer saturation, a simple, online-computable method for analyzing information processing in neural networks. First, we show that a layer's output can be restricted to an eigenspace of its covariance matrix without loss of performance. We propose a computationally lightweight method that approximates the covariance matrix during training. From the dimension of its relevant eigenspace we derive layer saturation: the ratio between the eigenspace dimension and layer width. We show evidence that saturation indicates which layers contribute to network performance. We demonstrate how to alter layer saturation in a neural network by changing network depth, filter sizes, and input resolution. Finally, we show that pathological patterns of saturation are indicative of parameter inefficiencies caused by a mismatch between input resolution and neural architecture.
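As a rough illustration of the idea described in the abstract, the sketch below shows how saturation could be computed from running statistics of a layer's activations. This is not the authors' reference implementation; the NumPy-based online estimator and the 99% variance threshold for the "relevant" eigenspace are assumptions made for this sketch.

import numpy as np

def update_running_covariance(state, batch_outputs):
    # Accumulate sufficient statistics (sample count, feature sums,
    # sum of outer products) so the covariance matrix can be estimated
    # online, batch by batch, without storing all activations.
    x = np.asarray(batch_outputs, dtype=np.float64)  # (n_samples, layer_width)
    if state is None:
        d = x.shape[1]
        state = {"n": 0, "sum": np.zeros(d), "outer": np.zeros((d, d))}
    state["n"] += x.shape[0]
    state["sum"] += x.sum(axis=0)
    state["outer"] += x.T @ x
    return state

def layer_saturation(state, var_threshold=0.99):
    # Saturation = dimension of the eigenspace explaining `var_threshold`
    # of the variance, divided by the layer width.
    mean = state["sum"] / state["n"]
    cov = state["outer"] / state["n"] - np.outer(mean, mean)  # E[xx^T] - E[x]E[x]^T
    eigvals = np.clip(np.linalg.eigvalsh(cov)[::-1], 0.0, None)  # descending
    explained = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(explained, var_threshold) + 1)
    return k / len(eigvals)

For a convolutional layer, the feature maps would first be reduced to a (samples, channels) matrix, e.g. by global average pooling; the state can then be updated after every forward pass and the saturation read out once per epoch.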

The talk and the respective paper are published at the BMVC 2021 virtual conference.
