A Caltech Library Service

BAM: a balanced attention mechanism to optimize single image super-resolution

Wang, Fanyi and Hu, Haotian and Shen, Cheng and Feng, Tianpeng and Guo, Yandong (2022) BAM: a balanced attention mechanism to optimize single image super-resolution. Journal of Real-Time Image Processing. ISSN 1861-8200. doi:10.1007/s11554-022-01235-x. (In Press)

PDF (Submitted Version) - Creative Commons Attribution.


Recovering texture information from aliasing regions has always been a major challenge for the single image super-resolution (SISR) task. These regions are often submerged in noise, so texture details must be restored while noise is suppressed. To address this issue, we propose an efficient Balanced Attention Mechanism (BAM), which consists of an Avgpool Channel Attention Module (ACAM) and a Maxpool Spatial Attention Module (MSAM) arranged in parallel. ACAM is designed to suppress extreme noise in large-scale feature maps, while MSAM preserves high-frequency texture details. Thanks to the parallel structure, the two modules not only optimize themselves but also optimize each other during back-propagation, striking a balance between noise reduction and high-frequency texture restoration; the parallel structure also makes inference faster. To verify the effectiveness and robustness of BAM, we applied it to 10 state-of-the-art SISR networks. The results demonstrate that BAM efficiently improves these networks' performance, and for networks that originally contain an attention mechanism, substituting BAM further reduces the parameter count and increases inference speed. For the information multi-distillation network (IMDN), a representative lightweight SISR network with attention, the proposed IMDN-BAM exceeds IMDN in FPS by {8.1%, 8.7%, 8.8%} at the three SR magnifications ×2, ×3, ×4, respectively, with a 200 × 200 input. For the densely residual Laplacian network (DRLN), a representative heavyweight SISR network with attention, the proposed DRLN-BAM is {11.0%, 8.8%, 10.1%} faster than DRLN at ×2, ×3, ×4 with a 60 × 60 input. Moreover, we present realSR7, a dataset of real scenes rich in texture aliasing regions. Experiments show that BAM achieves better super-resolution results on the aliasing areas.
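The parallel ACAM/MSAM structure described in the abstract can be illustrated with a minimal NumPy sketch. The module names follow the abstract, but the internal details here (pooling axes, sigmoid gating, and the elementwise-mean fusion of the two branches) are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def acam(feat):
    """Avgpool Channel Attention Module (sketch): average-pool each
    channel to one descriptor, then gate channels with a sigmoid."""
    desc = feat.mean(axis=(1, 2))          # (C,) per-channel average
    weights = sigmoid(desc)                # channel attention weights
    return feat * weights[:, None, None]   # rescale each channel

def msam(feat):
    """Maxpool Spatial Attention Module (sketch): max-pool across
    channels to a spatial map, then gate positions with a sigmoid."""
    desc = feat.max(axis=0)                # (H, W) per-position max
    weights = sigmoid(desc)                # spatial attention weights
    return feat * weights[None, :, :]      # rescale each position

def bam(feat):
    """Balanced Attention Mechanism (sketch): run ACAM and MSAM in
    parallel on the same input and fuse their outputs. The fusion
    rule (elementwise mean) is an assumption."""
    return 0.5 * (acam(feat) + msam(feat))

# Toy usage: a random 4-channel 8x8 feature map keeps its shape.
x = np.random.rand(4, 8, 8).astype(np.float32)
y = bam(x)
print(y.shape)  # (4, 8, 8)
```

Because both branches read the same input and are fused only at the output, they can run concurrently at inference time, which is consistent with the speed-ups the abstract reports over serial attention designs.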

Item Type:Article
Related URLs:
URL    URL Type    Description
       Paper
       Item        Source code
ORCID:
Wang, Fanyi: 0000-0002-8685-4300
Alternate Title:BAM: A Lightweight and Efficient Balanced Attention Mechanism for Single Image Super Resolution
Additional Information:© 2022 Springer Nature. Received 25 April 2022. Accepted 22 June 2022. Published 26 July 2022. The authors would like to thank the Associate Editor and the Reviewers for their constructive comments. This work is supported by OPPO Research Institute. A preprint of this work is available at . Availability of code and data: the source code is released at
Funding Agency: OPPO Research Institute (Grant Number: UNSPECIFIED)
Subject Keywords:Single image super-resolution; Texture aliasing; Inference acceleration; Lightweight attention mechanism
Record Number:CaltechAUTHORS:20220726-997394000
Persistent URL:
Official Citation:Wang, F., Hu, H., Shen, C. et al. BAM: a balanced attention mechanism to optimize single image super-resolution. J Real-Time Image Proc (2022).
Usage Policy:No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code:115851
Deposited By: George Porter
Deposited On:27 Jul 2022 21:46
Last Modified:27 Jul 2022 21:46
