Abstract

Both accuracy and efficiency are of significant importance to the task of semantic segmentation. Existing deep FCNs suffer from heavy computation due to the series of high-resolution feature maps needed to preserve detailed knowledge for dense estimation. Although reducing the feature-map resolution (i.e., applying a large overall stride) via subsampling operations (e.g., pooling and convolution striding) can instantly improve efficiency, it dramatically decreases estimation accuracy. To tackle this dilemma, we propose a knowledge distillation method tailored for semantic segmentation that improves the performance of compact FCNs with a large overall stride. To handle the inconsistency between the features of the student and teacher networks, we optimize feature similarity in a transferred latent domain formulated with a pre-trained autoencoder. Moreover, an affinity distillation module is proposed to capture long-range dependencies by computing non-local interactions across the whole image. To validate the effectiveness of the proposed method, extensive experiments have been conducted on three popular benchmarks: Pascal VOC, Cityscapes, and Pascal Context. Built upon a highly competitive baseline, our method improves the performance of a student network by 2.5% (mIoU increases from 70.2 to 72.7 on the Cityscapes test set) and can train a better compact model with only 8% of the floating-point operations (FLOPs) of a model that achieves comparable performance.
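The affinity distillation idea described above can be illustrated with a minimal NumPy sketch: build a pairwise affinity matrix over all spatial positions of a feature map and penalize the difference between the student's and teacher's affinities. Function names, shapes, and the exact loss are illustrative assumptions, not the paper's implementation; note the affinity matrix has the same (HW x HW) shape for both networks even when their channel counts differ.

```python
import numpy as np

def affinity(feat):
    # feat: (C, H, W) feature map; flatten spatial dims to (C, HW)
    C, H, W = feat.shape
    f = feat.reshape(C, H * W)
    # L2-normalize each spatial position's feature vector so the
    # affinity reflects direction (similarity), not magnitude
    f = f / (np.linalg.norm(f, axis=0, keepdims=True) + 1e-8)
    # pairwise affinity between every pair of positions: (HW, HW)
    return f.T @ f

def affinity_distillation_loss(student_feat, teacher_feat):
    # channel counts may differ (e.g. 16 vs 64), but both
    # affinity matrices are (HW, HW), so they are comparable
    a_s = affinity(student_feat)
    a_t = affinity(teacher_feat)
    return np.mean((a_s - a_t) ** 2)

rng = np.random.default_rng(0)
teacher = rng.standard_normal((64, 8, 8))  # hypothetical teacher features
student = rng.standard_normal((16, 8, 8))  # hypothetical student features
loss = affinity_distillation_loss(student, teacher)
```

In a real training loop this loss would be added, with some weight, to the usual per-pixel segmentation loss; because the affinity couples every position with every other, it carries the long-range, non-local structure of the teacher's features to the student.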

Keywords

Pascal (unit), Computer science, Segmentation, Artificial intelligence, FLOPS, Feature (linguistics), Context (archaeology), Domain knowledge, Machine learning, Pattern recognition (psychology)

Related Publications

Relational Knowledge Distillation

Knowledge distillation aims at transferring knowledge acquired in one model (a teacher) to another model (a student) that is typically smaller. Previous approaches can be expres...

2019 IEEE/CVF Conference on Computer ..., 1437 citations

Publication Info

Year: 2019
Type: preprint
Pages: 578-587
Citations: 219
Access: Closed

Citation Metrics

219 (OpenAlex)

Cite This

Tong He, Chunhua Shen, Zhi Tian et al. (2019). Knowledge Adaptation for Efficient Semantic Segmentation. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 578-587. https://doi.org/10.1109/cvpr.2019.00067

Identifiers

DOI
10.1109/cvpr.2019.00067