Abstract

We propose the Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial; the attention maps are then multiplied with the input feature map for adaptive feature refinement. Because CBAM is a lightweight and general module, it can be integrated into any CNN architecture seamlessly with negligible overhead and is end-to-end trainable along with the base CNN. We validate CBAM through extensive experiments on the ImageNet-1K, MS COCO detection, and VOC 2007 detection datasets. Our experiments show consistent improvements in classification and detection performance with various models, demonstrating the wide applicability of CBAM. The code and models will be publicly available.
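
To make the mechanism the abstract describes concrete, below is a minimal PyTorch sketch of the two-step refinement (channel attention followed by spatial attention). The reduction ratio, kernel size, and class names are illustrative assumptions, not details stated on this page.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention: a shared MLP over average- and max-pooled descriptors."""
    def __init__(self, channels, reduction=16):      # reduction ratio is an assumed default
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))            # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))             # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                              # channel-refined features

class SpatialAttention(nn.Module):
    """Spatial attention: a convolution over channel-wise average- and max-pooled maps."""
    def __init__(self, kernel_size=7):                # kernel size is an assumed default
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)             # pool along the channel axis
        mx, _ = x.max(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                              # spatially refined features

class CBAM(nn.Module):
    """Sequential channel-then-spatial refinement of an intermediate feature map."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.spatial(self.channel(x))

# Usage: refine a feature map produced by an arbitrary convolutional block.
feat = torch.randn(2, 64, 32, 32)
out = CBAM(64)(feat)                                  # same shape as the input

Because the module only rescales the feature map it receives, it can be inserted after any convolutional block without changing tensor shapes, which is what keeps the integration overhead negligible.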

Keywords

Computer science, Feature, Block, Convolutional neural network, Pattern recognition, Code, Artificial intelligence, Algorithm, Mathematics, Programming language

Publication Info

Year: 2018
Type: book-chapter
Pages: 3–19
Citations: 20,102
Access: Closed

Citation Metrics

OpenAlex: 20,102
Influential: 1,870

Cite This

Sanghyun Woo, Jongchan Park, Joon‐Young Lee et al. (2018). CBAM: Convolutional Block Attention Module. Lecture Notes in Computer Science, 3–19. https://doi.org/10.1007/978-3-030-01234-2_1

Identifiers

DOI: 10.1007/978-3-030-01234-2_1
PMID: 41292690
PMCID: PMC12640887
arXiv: 1807.06521

Data Quality

Data completeness: 79%