22/11/2021

AttNAS: Searching Attentions for Lightweight Semantic Segmentation

Yingying Jiang, Zhuoxin Gan, Ke Lin, Yong A

Keywords: Neural architecture search, semantic segmentation, AttNAS, lightweight, attention

Abstract: To use multiple attentions in semantic segmentation tasks, attentions are generally arranged in parallel and then combined by concatenation or summation. In this work, we use Neural Architecture Search (NAS) to explore attentions and their combination patterns (e.g., parallel, hybrid parallel+sequential) in semantic segmentation. We propose AttNAS, which searches both the backbone and the attentions for a lightweight semantic segmentation model. In particular, we define a new attention search space with a two-layer structure and propose a unified differentiable formulation that supports both attention type search and attention combination search. Our experimental results show that the model searched with AttNAS achieves state-of-the-art performance compared to existing lightweight methods on Cityscapes, CamVid and Pascal VOC 2012. Moreover, the attention patterns found by AttNAS generalize robustly and can be combined with existing backbones, such as MobileNetV2 and ResNet18, to improve segmentation performance.
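The abstract describes a differentiable formulation in which candidate attentions and their combinations are searched jointly. The paper's exact operators are not given here, so the sketch below is only a minimal illustration of the general idea using a DARTS-style continuous relaxation: each candidate attention op is weighted by a softmax over learnable architecture parameters, and the op with the largest weight is kept after search. The candidate ops (`channel_attention`, `spatial_attention`) are simplified stand-ins, not the paper's actual search space.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def identity(feat):
    return feat

def channel_attention(feat):
    # feat: (C, H, W); gate each channel by a sigmoid of its global average
    w = 1.0 / (1.0 + np.exp(-feat.mean(axis=(1, 2))))
    return feat * w[:, None, None]

def spatial_attention(feat):
    # gate each spatial position by a sigmoid of the channel-mean activation
    w = 1.0 / (1.0 + np.exp(-feat.mean(axis=0)))
    return feat * w[None, :, :]

# hypothetical candidate set for one searchable attention slot
CANDIDATES = [identity, channel_attention, spatial_attention]

def mixed_attention(feat, alpha):
    """Continuous relaxation used during search: the slot's output is a
    softmax-weighted sum of all candidate attention ops, so the
    architecture parameters alpha can be optimized by gradient descent."""
    weights = softmax(alpha)
    return sum(w * op(feat) for w, op in zip(weights, CANDIDATES))

def discretize(alpha):
    """After search, keep only the candidate with the largest weight."""
    return CANDIDATES[int(np.argmax(alpha))]
```

Under this relaxation, a parallel combination of several searched attentions corresponds to summing the discretized ops' outputs, while a sequential pattern corresponds to composing them; a unified formulation such as the one the abstract mentions would let the search choose between these patterns as well.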

The talk and the respective paper are published at the BMVC 2021 virtual conference.
