22/11/2021

TinyStarGAN v2: Distilling StarGAN v2 for Efficient Diverse Image Synthesis for Multiple Domains

Paras Kapoor, Tien D. Bui

Keywords: knowledge distillation, GAN compression, multi-domain image synthesis, deep neural network compression, efficient GANs

Abstract: Image-to-image translation models such as StarGAN v2 can translate diverse images across multiple domains within a single framework. However, StarGAN v2 is computationally expensive, which makes it difficult to deploy in resource-constrained environments. To reduce the computational cost of StarGAN v2 while maintaining accuracy, we propose a novel cross distillation method designed specifically for knowledge distillation (KD) of multiple networks within a single framework. With this new KD method, the knowledge of the large, multi-network teacher StarGAN v2 can be transferred effectively to a small student TinyStarGAN v2 framework. Without sacrificing the quality or diversity of the generated images, we reduce the size of the original framework by more than 20× and its computational cost by more than 5×. Experiments on the CelebA-HQ and AFHQ datasets show the effectiveness of the proposed method.
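For intuition, the sketch below shows a generic output-level distillation step for a GAN generator in PyTorch: the small student's translation is pulled toward the frozen teacher's output with an L1 loss. This is only an illustration of teacher-to-student knowledge transfer under assumed interfaces; the names teacher_G, student_G, style_code, and the weight lambda_kd are hypothetical placeholders, and the paper's cross distillation method for coupling multiple networks is not reproduced here.

# Minimal sketch of output-level knowledge distillation for a GAN generator.
# Generic illustration only, not the paper's cross distillation method;
# teacher_G, student_G, style_code, and lambda_kd are hypothetical placeholders.
import torch
import torch.nn.functional as F

def distillation_step(teacher_G, student_G, optimizer, x, style_code, lambda_kd=1.0):
    # Frozen, pre-trained teacher produces the target translation.
    with torch.no_grad():
        y_teacher = teacher_G(x, style_code)

    # Small student being trained to mimic the teacher.
    y_student = student_G(x, style_code)

    # Pixel-wise distillation loss between teacher and student outputs;
    # adversarial, style, and cycle terms from the original objective are omitted.
    loss = lambda_kd * F.l1_loss(y_student, y_teacher)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()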

The talk and the accompanying paper were published at the BMVC 2021 virtual conference.
