14/09/2020

Squeezing Correlated Neurons for Resource-Efficient Deep Neural Networks

Elbruz Ozen, Alex Orailoglu

Keywords: deep learning, information redundancy, pruning

Abstract: DNNs are abundantly represented in real-life applications because of their accuracy on challenging problems, yet their demanding memory and computational costs limit their applicability to resource-constrained environments. Efforts to tame computational costs have hitherto focused on first-order techniques, such as eliminating numerically insignificant neurons/filters by prioritizing numerical contribution metrics, yielding passable improvements. Yet redundancy in DNNs extends well beyond the limits of numerical insignificance. Modern DNN layers exhibit significant correlation among output activations; hence, the number of orthogonal features extracted at each layer rarely exceeds a small fraction of the layer size. Exploiting this observation necessitates quantifying the information content at layer outputs. To this end, we employ practical data analysis techniques coupled with a novel feature elimination algorithm to identify a minimal set of computation units that capture the information content of the layer and squash the rest. Linear transformations on the subsequent layer ensure accuracy retention despite the removal of a significant portion of the computation units. A one-shot application of the outlined technique can shrink the VGG-16 model size \(4.9\times\) and speed up its execution by \(3.4\times\) with negligible accuracy loss while requiring no additional fine-tuning. The proposed approach, in addition to delivering results overwhelmingly superior to previously promulgated heuristics, promises to spearhead the design of more compact deep learning models through an improved understanding of DNN redundancy.
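The abstract outlines the pipeline (quantify the information content of a layer's activations, keep a minimal set of neurons that spans it, and absorb the rest into the subsequent layer through a linear transformation) without spelling out the algorithm. The following is a minimal sketch of that mechanism under stated assumptions: column-pivoted QR serves as the selection heuristic and a least-squares reconstruction is folded into the next layer's weights. The function `squeeze_layer` and the NumPy/SciPy formulation are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import qr

def squeeze_layer(acts, W_next, k):
    """Prune a layer down to its k most informative neurons (illustrative).

    acts:   (n_samples, n_neurons) activations recorded on sample data.
    W_next: (n_out, n_neurons) weight matrix of the subsequent layer.
    k:      number of neurons to keep.
    Returns the kept neuron indices and an adjusted next-layer weight
    matrix that compensates for the removed, correlated neurons.
    """
    # Rank-revealing QR orders the columns (neurons) by how much new
    # information each adds beyond the columns already selected.
    _, _, piv = qr(acts, mode='economic', pivoting=True)
    keep = np.sort(piv[:k])
    drop = np.sort(piv[k:])
    # Express each squashed neuron's activations as a linear combination
    # of the kept neurons' activations (least-squares fit).
    C, *_ = np.linalg.lstsq(acts[:, keep], acts[:, drop], rcond=None)
    # Fold that reconstruction into the next layer's weights so the
    # network's function is approximately preserved without fine-tuning:
    # W_next[:, drop] @ a_drop is approximated by
    # W_next[:, drop] @ (C.T @ a_keep).
    W_adj = W_next[:, keep] + W_next[:, drop] @ C.T
    return keep, W_adj
```

Folding the reconstruction into the subsequent layer is what permits one-shot removal without fine-tuning: only neurons whose activations are (nearly) linear combinations of the kept ones are squashed, so the information content of the layer survives.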

The talk and the respective paper are published at the ECML PKDD 2020 virtual conference.
