The Convolutional Block Attention Module (CBAM) has emerged as a widely adopted attention mechanism because it integrates seamlessly into Convolutional Neural Network (CNN) architectures with minimal computational overhead. However, its reliance on global average and max pooling in the channel and spatial attention modules leads to information loss, particularly in scenarios demanding fine-grained feature analysis, such as medical imaging. In this paper, we propose the Modified CBAM (MCBAM) to address this critical limitation. This framework eliminates the dependence on global pooling by introducing a sub-block pooling strategy that captures nuanced feature relationships, preserving critical spatial and channel-wise information. MCBAM iteratively computes attention maps along the channel and spatial dimensions, adaptively refining features for superior representational power. Comprehensive evaluations on diverse datasets, including C-NMC (acute lymphoblastic leukemia), PCB (peripheral blood cells), and COVID-19 (chest X-ray), demonstrate the efficacy of MCBAM. We also compare MCBAM against comparable alternatives, including the Bottleneck Attention Module (BAM), Normalisation-Based Attention Module (NAM), and Triplet Attention Module (TAM), and find that MCBAM consistently outperforms these advanced attention mechanisms across all datasets and metrics. Furthermore, the results show that MCBAM surpasses the standard CBAM, establishing it as a robust and effective enhancement for attention mechanisms, with notable improvements in medical imaging tasks and critical advantages in complex scenarios.
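To illustrate the idea behind sub-block pooling, the following is a minimal NumPy sketch, not the paper's actual MCBAM implementation: it assumes a simple grid partition of each channel's H×W map into `grid × grid` sub-blocks, pooling (average and max) within each sub-block instead of over the whole map. The function name `sub_block_pool` and the `grid` parameter are hypothetical; the key contrast with global pooling is that each channel yields `2 * grid * grid` descriptor values rather than just two scalars, retaining coarse spatial structure.

```python
import numpy as np

def sub_block_pool(feat, grid=2):
    """Hypothetical sub-block pooling sketch.

    feat: array of shape (C, H, W). Each channel map is split into a
    grid x grid set of sub-blocks; average and max pooling are applied
    within each sub-block, preserving coarse spatial information that
    global pooling would collapse into a single scalar per channel.
    Returns an array of shape (C, 2 * grid * grid).
    """
    c, h, w = feat.shape
    bh, bw = h // grid, w // grid
    # Reshape into (C, grid, bh, grid, bw) so each sub-block is a contiguous tile
    blocks = feat[:, :grid * bh, :grid * bw].reshape(c, grid, bh, grid, bw)
    avg = blocks.mean(axis=(2, 4))  # per-sub-block averages: (C, grid, grid)
    mx = blocks.max(axis=(2, 4))    # per-sub-block maxima:   (C, grid, grid)
    return np.concatenate([avg.reshape(c, -1), mx.reshape(c, -1)], axis=1)

# Toy 2-channel 4x4 feature map
x = np.arange(2 * 4 * 4, dtype=float).reshape(2, 4, 4)
desc = sub_block_pool(x, grid=2)
print(desc.shape)  # (2, 8): 4 avg + 4 max values per channel vs. 2 with global pooling
```

In a channel-attention module, these richer per-channel descriptors would then feed a shared MLP in place of the two global-pooling scalars used by standard CBAM.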