CBAM-SMK - Integrating Convolution Block Attention Module With Separable Multi-Resolution Kernels in D
Keywords: Brain tumor; Convolutional neural network; Image classification; Attention mechanism

Abstract

Proficiency and expertise are required for radiologists to accurately detect brain tumors, a process that requires a significant amount of time. Deep learning technologies are being used more and more to automate the diagnosis of brain tumors, resulting in outcomes that are more precise and efficient compared to previous methods. Attention-based models possess an advanced feature of dynamic refinement and amplification, which improves their diagnostic capabilities. The efficacy of channel, spatial, or combined attention methods in the convolutional block attention module (CBAM) for the classification of brain cancer has not yet been investigated. This study integrated a convolutional neural network (CNN) with CBAM to categorize brain cancers by highlighting pertinent characteristics and reducing interference. The model is trained and evaluated on a dataset of 7023 brain MRI images collected from three publicly available sources, Figshare, SARTAJ and Br35H, comprising four classes, namely glioma, meningioma, pituitary tumor and no tumor. The CBAM–SMK model achieved superior performance compared to other deep learning methods such as conventional CNNs and CNNs enhanced with Sparsity and Multiresolution (SM), with an accuracy of 97.0%, a recall of 96.73%, and a precision of 96.61% on the identical dataset. The fusion of CBAM–SMK has successfully captured the spatial context and enhanced the representation of features. This enables physicians to use it on brain classification software platforms, thus improving clinical decision-making and the classification of brain tumors.
presents a novel opportunity to improve the extraction of discriminative features from MRI scans.

The primary aim of this study is to develop and evaluate a CBAM–SMK model that enhances both the diagnostic accuracy and computational efficiency of brain tumor classification systems.

The following research questions are addressed in this study:

• What mechanisms enable attention to boost CNNs' capability for extracting features during brain tumor classification?
• Does the proposed CBAM–SMK model lead to improved performance metrics compared to traditional deep learning classification methods, particularly for precision, recall, and accuracy?

This study aims to establish an advanced deep learning platform that enhances the diagnostic reliability, precision, and accuracy of brain tumor diagnosis. The main contributions of this study are as follows:

• The channel and spatial attention abilities of CBAM–SMK unite with multi-resolution kernels to enhance both feature representation and classification results.
• Testing confirms that CBAM–SMK delivers better performance than traditional network models, achieving an accuracy of 97.0%, a recall of 96.73%, and a precision of 96.61%.
• The proposed computational design enables clinical applications, including those in limited-resource settings.
• This research integrates separable multi-resolution kernels with CBAM, a previously untried combination in brain tumor classification tasks.

The Convolution Block Attention Module with Separable Multiresolution Kernels (CBAM–SMK) architecture is an innovative technique developed to overcome the challenges outlined above. This model aims to improve the emphasis on important image features by combining the Convolution Block Attention Module (CBAM) with Separable Multiresolution Kernels. Additionally, it captures a broader range of details at different scales. The goal is two-pronged: to enhance the model's ability to diagnose brain tumors and to optimize computational efficiency, thus making advanced diagnostic capability more widely accessible. The CBAM component enhances the model's focus by iteratively applying channel and spatial attention to identify important features. The Separable Multiresolution Kernels provide the model with a simultaneous and thorough understanding of the intricate details of the image. This enhances the feature set by incorporating detailed and diverse representations.

The CBAM–SMK model is a groundbreaking development in brain tumor diagnostics that aims to raise the standards of accuracy and efficiency. This study thoroughly assesses and compares the model's capabilities, possible clinical applications, and its overall impact on the field of medical diagnostics, using rigorous examination and existing standards.

This paper is organized as follows: Section 2 reviews related research, highlighting recent advances and setting the framework for the proposed methodology. Section 3 discusses the CBAM–SMK architecture and design reasoning in detail. The experimental setup, data preparation, model training, and assessment measures are presented in Section 4. Section 5 concludes with an analysis of the study's implications for future research and clinical practice, opening the way for advances in medical imaging and diagnostics.

2. Related works

Deep learning and machine learning are the primary methods used to classify brain tumors [9]. Machine learning studies have utilized KNN [10], SVM [11], decision trees [12], and evolutionary algorithms [13]. To improve the identification of brain tumors, Hussain et al. [14] proposed a multi-modal approach for extracting features and applying machine learning methods. Multiple features were derived from a database of brain tumor imaging data, and advanced machine learning algorithms including Support Vector Machines, Decision Trees, and Naïve Bayes were employed to detect tumors. The performance was assessed using multiple metrics, and Naïve Bayes demonstrated the maximum accuracy in terms of entropy, morphological, SIFT, and texture characteristics. Cherukuri et al. [15] proposed a multi-level attention network that utilizes Xception as the main model. This network incorporates spatial and cross-channel attention mechanisms and was applied to an MRI dataset.

CNNs with attention mechanisms have produced remarkable results in the field of image classification. To enhance classification results, it is crucial to possess a comprehensive understanding of CNNs and attention mechanisms. In this section, we discuss common CNNs and the main attention methods used in image categorization.

2.1. Convolutional neural networks for brain tumor classification

CNNs are extensively employed for image categorization [16]. Residual networks have been widely recognized for their exceptional performance in image categorization [17], as evidenced by previous examples. Moreover, broader CNNs provide the capability to extract supplementary data, enhancing the accuracy of classification outcomes [18]. Consequently, a residual architecture is employed to expand the scope and collect reliable data for image recognition [19]. This is illustrated by the fusion of ResNet with a convolutional branch consisting of 3 × 3 and 1 × 1 convolutions, which is employed to extract significant information for image classification [20]. ResNeXt employed a homogeneous multiple-branch architecture to effectively represent the categorized image [21]. Furthermore, including residual networks as components in a CNN has the potential to improve the classifier's capacity to generalize [22]. Utilizing multi-scale and residual blocks proved to be an effective method for combining different semantic information in image categorization [23]. One can enhance the memory capacity of a deep CNN used for image classification by incorporating residual learning techniques to combine hierarchical features [24]. Generative adversarial networks utilized a generative network to produce samples that closely resemble the provided training examples [14]. This was done to address the issue of inadequate samples. Subsequently, a discriminative network was employed by the GAN to assess the veracity of the entire set of training data, with the aim of constructing a resilient classifier [14]. Furthermore, graph convolutional networks demonstrate exceptional efficiency in the identification of multi-label pictures [15].

2.2. Attentional mechanisms for image classification

Deep convolutional neural networks may incur substantial computational overhead due to their reliance on deep or wide architectures [25]. To address this issue, attention strategies have emerged: they utilize the acquired attributes of different network components as weights to influence other components [26], aiming to acquire more significant sequential knowledge [27]. The current methodologies for attention can be categorized into two groups, namely channel attention and spatial attention methods [28]. The channel attention technique specifically focuses on the impact of channel characteristics on the full set of CNN feature maps. In the second strategy, the pixels of all channels that occupy the same position are considered as a unified entity, and their significance is acquired by evaluating each pixel at every location; every weight is obtained from a spatial attention mesh.

Recent studies have introduced advanced deep learning models for brain tumor classification by incorporating attention mechanisms, graph learning, and fusion strategies. Subba and Sunaniya [29] proposed a computationally optimized GoogLeNet-style CNN enhanced
The CBAM attention mechanism sequentially applies channel and spatial attention to refine the feature maps, as represented in Eqs. (9) and (10).

F′ = M_c(F) ⊗ F    (9)

F″ = M_s(F′) ⊗ F′    (10)

Batch normalization, applied in each layer of the proposed CBAM–SMK model, serves to stabilize the learning process and enhance convergence rates. This operation, delineated in Eq. (11), normalizes the activations by scaling and shifting based on the learned parameters γ_k^(l) and β_k^(l), utilizing the mean μ_k^(l) and variance σ_k^(l)2 of the inputs, with ε ensuring numerical stability.

O_{i,j,k}^(l) = γ_k^(l) · (I_{i,j,k}^(l−1) − μ_k^(l)) / √(σ_k^(l)2 + ε) + β_k^(l)    (11)

The dense layers in the CBAM–SMK architecture, culminating in the output layer, leverage the softmax function to map the logits to a probability distribution over 4 distinct classes. This is crucial for classification tasks, where the probability P(c_k | I) of the network predicting class k for a given input image I is calculated using the softmax function, as shown in Eq. (12); the full CBAM–SMK procedure is detailed in Algorithm 1.

P(c_k | I) = exp(O_k^(l)) / Σ_j exp(O_j^(l))    (12)

Algorithm 1: CBAM–SMK Model for Brain Tumor MRI Classification
Input: Brain MRI image I ∈ R^(224×224×3); Output: Predicted tumor class Y
Preprocessing:
  Normalize intensities using Eq. (1): I_norm = (I − I_min) / (I_max − I_min)
  Apply Perona–Malik diffusion to reduce noise while preserving edges (Eqs. 3–4)
Feature Extraction:
  Apply initial convolution: F0 = Conv2D(I_norm, 32 filters, 7 × 7)
  Downsample: F1 = MaxPooling(F0, 2 × 2)
  Apply depthwise separable convolution (Eqs. 7–8): F2 = SeparableConv(F1)
  Downsample: F3 = MaxPooling(F2, 2 × 2)
Attention Mechanism:
  Apply channel attention: F_ca = M_c(F3) ⊗ F3 (Eq. 9)
  Apply spatial attention: F_sa = M_s(F_ca) ⊗ F_ca (Eq. 10)
Classification:
  Flatten the feature map: V = Flatten(F_sa)
  Apply fully connected layers and softmax activation (Eq. 12): Y = Softmax(Dense(V))
Output: Y
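The following PyTorch sketch illustrates how the pipeline in Algorithm 1 can be assembled. It is a minimal illustration rather than the authors' implementation: the channel counts after the stem, the reduction ratio of the channel-attention MLP, and the 128-unit hidden dense layer are assumptions, and the multi-resolution aspect of the separable kernels is reduced to a single 3 × 3 separable convolution for brevity.

```python
# Minimal sketch of the CBAM-SMK forward pipeline in Algorithm 1 (not the
# authors' code). Only the 224x224x3 input, the 32-filter 7x7 stem, the 2x2
# poolings, the separable convolution, the CBAM attention steps and the
# 4-way softmax come from the algorithm; all other sizes are assumptions.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """M_c(F): shared MLP over global average- and max-pooled descriptors (Eq. 9)."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))        # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))         # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                          # F' = M_c(F) (x) F

class SpatialAttention(nn.Module):
    """M_s(F'): 7x7 convolution over channel-wise mean and max maps (Eq. 10)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)
    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                           # F'' = M_s(F') (x) F'

class SeparableConv(nn.Module):
    """Depthwise separable convolution standing in for the SMK block (Eqs. 7-8)."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)
    def forward(self, x):
        return self.pointwise(self.depthwise(x))

class CBAMSMK(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 32, 7, padding=3), nn.ReLU(), nn.MaxPool2d(2),  # F0, F1
            SeparableConv(32, 64), nn.ReLU(), nn.MaxPool2d(2),           # F2, F3
            nn.BatchNorm2d(64),                                          # cf. Eq. (11)
        )
        self.channel_att = ChannelAttention(64)   # F_ca
        self.spatial_att = SpatialAttention()     # F_sa
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 56 * 56, 128), nn.ReLU(),
            nn.Linear(128, num_classes),          # logits; softmax applied below
        )
    def forward(self, x):   # x: normalized MRI batch of shape (B, 3, 224, 224);
        # Perona-Malik diffusion is assumed to have been applied offline (Eqs. 3-4)
        x = self.spatial_att(self.channel_att(self.stem(x)))
        return self.head(x)

model = CBAMSMK()
logits = model(torch.randn(2, 3, 224, 224))      # dummy batch
probs = logits.softmax(dim=1)                     # P(c_k | I), Eq. (12)
print(probs.shape)                                # torch.Size([2, 4])
```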
The step-by-step progress of the CBAM–SMK methodology implementation is shown in Fig. 4. Data preparation commences with intensity normalization followed by Perona–Malik diffusion to produce optimal data quality for the input images. After preprocessing, the implementation performs feature extraction through convolutional layers and separable multi-resolution kernels to identify detailed patterns. Feature maps are then refined by the two built-in attention mechanisms, which operate at the channel and spatial levels within the CBAM module. Finally, dense layers and a softmax classifier output probabilities for the four classes: glioma, meningioma, pituitary, and no tumor. This systematic framework creates an effective system which guarantees precise brain tumor identification.

The novelty of the proposed CBAM–SMK architecture lies in the strategic integration of CBAM with separable multi-resolution kernels, an approach not previously applied to brain tumor classification tasks. While CBAM enables adaptive focus through both channel and spatial attention mechanisms, the use of separable multi-resolution kernels enhances the model's ability to extract discriminative features at multiple spatial scales with reduced computational overhead. This dual-level enhancement improves feature sensitivity to subtle tumor structures, especially in complex and heterogeneous MRI data. Unlike existing models that apply attention or multiscale convolutions independently, CBAM–SMK synergistically combines both components in a unified pipeline.

4. Experimental results

This section presents the quantitative evaluation of the CBAM–SMK model on brain MRI datasets, along with baseline comparisons against traditional CNN and CNN with SM architectures.

4.1. Quantitative metrics for CBAM–SMK model in brain tumor detection

The CBAM–SMK model's capability in detecting brain tumors is quantitatively evaluated through the following metrics. The model's predictive outcomes are categorized into four distinct groups: True Positives (TP), True Negatives (TN), False Positives (FP), and False Negatives (FN).

• Confusion Matrix: A foundational tool that encapsulates the model's predictions, facilitating a deeper understanding of its performance through the categorization of predictions into TP, TN, FP, and FN.

• Accuracy: This metric gauges the proportion of correctly identified predictions, encompassing both positive and negative cases, thus offering a snapshot of the model's overall efficacy.

  Accuracy = (TP + TN) / (TP + TN + FP + FN)    (13)

• Precision: Precision quantifies the model's accuracy in predicting positive cases as such, thereby assessing its ability to minimize false alarms in brain tumor classification.

  Precision = TP / (TP + FP)    (14)

• Recall: Also known as Sensitivity, Recall measures the model's capability to correctly identify all actual positive cases, a critical aspect in medical diagnostics where missing a positive case can have dire consequences.

  Recall = TP / (TP + FN)    (15)

• F1-Score: The F1-Score harmonizes Precision and Recall into a single metric, offering a balanced view of the model's ability to precisely identify positive cases while ensuring minimal omission of actual positives.

  F1-Score = 2 × (Precision × Recall) / (Precision + Recall)    (16)

• Matthews Correlation Coefficient (MCC): MCC provides a holistic measure of the model's performance, taking into account all categories of the confusion matrix, thus serving as a robust indicator of the model's predictive strength.

  MCC = (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN))    (17)

• Classification Success Index (CSI): CSI specifically focuses on the model's success in correctly predicting positive cases, offering a targeted view of its diagnostic accuracy in brain tumor classification.

  CSI = TP / (TP + FP + FN)    (18)
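As a concrete illustration of Eqs. (13)–(18), the short Python function below computes the six metrics from binary counts. It is a sketch, not the authors' evaluation code; for the four-class problem it assumes that TP, TN, FP and FN are derived one-vs-rest from the confusion matrix and that the metrics are then averaged over classes, and the example counts are hypothetical.

```python
# Sketch of Eqs. (13)-(18) from binary counts TP, TN, FP, FN.
import math

def classification_metrics(tp, tn, fp, fn):
    accuracy  = (tp + tn) / (tp + tn + fp + fn)                    # Eq. (13)
    precision = tp / (tp + fp)                                     # Eq. (14)
    recall    = tp / (tp + fn)                                     # Eq. (15)
    f1        = 2 * precision * recall / (precision + recall)      # Eq. (16)
    mcc       = ((tp * tn - fp * fn) /                             # Eq. (17)
                 math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    csi       = tp / (tp + fp + fn)                                # Eq. (18)
    return {"accuracy": accuracy, "precision": precision, "recall": recall,
            "f1": f1, "mcc": mcc, "csi": csi}

# Hypothetical counts, for illustration only:
print(classification_metrics(tp=290, tn=980, fp=10, fn=12))
```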
Fig. 5. Learning curves of CNN models for brain tumor classification. The top row depicts the accuracy of the models: (a) Standard CNN, (b) CNN with SM, and (c) CBAM–SMK.
The bottom row depicts the loss of the models: (d) Standard CNN, (e) CNN with SM, and (f) CBAM–SMK.
These metrics, adorned with the specified symbols for the predictive outcomes, constitute a comprehensive framework for evaluating our model's adeptness in classifying brain tumors, ensuring a thorough assessment of its diagnostic capabilities. The experimental design stages focused on addressing specific problems which affect brain tumor classification performance. The baseline CNN experiments established the performance groundwork while showing that important tumor characteristics remain difficult to detect with this method. The addition of Sparsity and Multiresolution (SM) features improved architectural performance, which proved beneficial for extracting features that capture minor tumor characteristics. Through the CBAM–SMK architecture, dynamic feature map refinement techniques were implemented, combining attention-based mechanisms with multi-resolution kernels to enhance vital pattern detection alongside noise reduction. This development sequence yielded increased accuracy and better performance from CBAM–SMK, which reached 97.0% accuracy alongside 96.61% precision and 96.73% recall. These metrics, equipped with unique symbols for each classification outcome, provide a robust framework for assessing the effectiveness of the CBAM–SMK model in brain tumor detection.

Performance analysis

Fig. 5 depicts the performance of CNNs with varying levels of architectural complexity in classifying brain tumors. The plots display the accuracy and loss metrics of the models during the training epochs, illustrating their capacity to comprehend and apply knowledge from complex neuroradiological images. The accuracy curve of the conventional CNN architecture, as depicted in Fig. 5(a), exhibits a steady increase, reaching a value of 99.00%. As depicted in Fig. 5(d), the loss curve gradually decreases to a minimal value, indicating a strong fit to the training data without overfitting. This is also evident in the closely matched validation loss.

The CNN with SM (Sparsity and Multiresolution) model demonstrates in Fig. 5(b) that the incorporation of sparsity and multiresolution techniques leads to a significant enhancement in CNN test accuracy, achieving a score of 99.50%. The loss curve depicted in Fig. 5(e) exhibits a minimal disparity between the training and validation loss, suggesting improved feature extraction across several scales, which is beneficial for detecting the delicate symptoms of brain tumors.
Table 1
Performance analysis of proposed models for brain tumor classification.

Model         Accuracy (%)  Precision (%)  Recall (%)  F1-score (%)  CSI (%)  MCC
CNN           95.00         95.57          95.44       95.51         91.40    0.94
CNN with SM   96.00         96.07          96.36       96.22         92.71    0.95
CBAM–SMK      97.00         96.61          96.73       96.67         93.55    0.96
Fig. 6. Graphical representation of the performance analysis for CNN models in brain tumor classification.
Fig. 5(c) depicts the model's test accuracy of 99.90% following the integration of CBAM–SMK. This rise demonstrates the efficacy of attention mechanisms in neural networks for directing attention to tumor characteristics in magnetic resonance imaging. The loss curve depicted in Fig. 5(f) demonstrates that the CBAM technique enhances the diagnostic precision of the model by reducing the disparity between training and validation losses.

Table 1 showcases the performance metrics for the different models applied in our proposed system. These metrics include Accuracy, Precision, Recall, F1-score, Matthews Correlation Coefficient (MCC), and the Classification Success Index (CSI).

The proposed architecture involves the utilization of three convolutional neural network models on a dataset of brain tumors to optimize the classification process. Table 1 and Fig. 6 depict the performance evaluation of the implemented CNN architectures. Fig. 6 displays the performance metrics that we have observed. The CNN, CNN with sparsity and multiresolution (SM), and CBAM–SMK achieved accuracy rates of 95.00%, 96.00%, and 97.00%, respectively. The precision rates for these models were 95.57%, 96.07%, and 96.61%, while the recall rates were 95.44%, 96.36%, and 96.73%. The F1-score rates for the CNN, CNN with SM, and CBAM–SMK were 95.51%, 96.22%, and 96.67%, respectively. Out of all the models, the CBAM–SMK shows exceptional effectiveness, attaining the highest accuracy rate of 97.00%, a precision rate of 96.61%, a recall rate of 96.73%, and an F1-score of 96.67%. The results highlight the efficacy of the attention processes implemented in the CBAM–SMK in improving the model's ability to accurately classify brain tumors. Fig. 6 also displays the Classification Success Index (CSI); the CSI values are recorded as 91.40%, 92.71%, and 93.55%. According to Fig. 7, the MCC values for the basic CNN, CNN with SM, and CBAM–SMK are 0.94, 0.95, and 0.96, respectively. The CBAM–SMK surpasses the other models by achieving the highest CSI, indicating its improved ability to predict true positives while considering all types of prediction errors. Additionally, it records the highest MCC, demonstrating a strong correlation between the model's predictions and the actual classifications. These indicators combined demonstrate the significant influence of CBAM on improving the model's capacity to accurately and reliably classify brain tumors.

A comparative examination of the three CNN models' confusion matrices is presented in Fig. 8. The analysis reveals the effectiveness of these models in accurately classifying four different types of brain tumors. The initial CNN model has a notable level of accuracy in identifying glioma and pituitary cases, correctly identifying 290 glioma cases and 347 pituitary instances. However, it does misclassify 17 glioma cases as meningiomas. The second model, which combines Sparsity and Multiresolution, demonstrates improved performance, especially in the categorization of meningiomas. It achieves 305 true positives and somewhat reduces the misclassification of gliomas as meningiomas to 16. The CBAM–SMK, the most sophisticated model in the series, exhibits exceptional discrimination capabilities. It achieves 295 true positives for glioma, 305 for meningioma, 413 for no tumor, and 347 for pituitary tumors. Additionally, it reduces the misclassifications of glioma to only 14 cases. The matrices demonstrate the step-by-step enhancement in tumor classification accuracy achieved by including the SM and CBAM–SMK techniques. This indicates the significant influence of these techniques on improving the reliability of diagnosis in neural network applications.
Fig. 8. Confusion Matrices (a) CNN (b) CNN with SM (c) CBAM–SMK.
Fig. 10 displays the Receiver Operating Characteristic (ROC) and Precision–Recall (P–R) curves for the three CNN designs. These curves showcase the effect of employing modern approaches in the categorization of brain tumors. The conventional CNN serves as a reference point, and its performance is significantly enhanced by the SM-enhanced CNN, as evidenced by the advancement of the curve. The CNN improved with CBAM demonstrates the highest performance, as its curves approach the optimal top-left corner in the ROC space and the top-right corner in the P–R space. This indicates higher accuracy and dependability.
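For reference, one-vs-rest ROC and P–R curves of the kind shown in Fig. 10 can be generated as sketched below. The labels and softmax scores are random placeholders standing in for a trained model's outputs, so this illustrates the plotting procedure rather than reproducing the authors' figures.

```python
# Sketch of one-vs-rest ROC and Precision-Recall curves for a 4-class classifier.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import label_binarize
from sklearn.metrics import roc_curve, auc, precision_recall_curve

classes = ["glioma", "meningioma", "no tumor", "pituitary"]
rng = np.random.default_rng(0)
y_true = rng.integers(0, 4, size=200)             # placeholder ground-truth labels
y_score = rng.dirichlet(np.ones(4), size=200)     # placeholder softmax probabilities

y_bin = label_binarize(y_true, classes=[0, 1, 2, 3])   # one-vs-rest binarization

fig, (ax_roc, ax_pr) = plt.subplots(1, 2, figsize=(10, 4))
for k, name in enumerate(classes):
    fpr, tpr, _ = roc_curve(y_bin[:, k], y_score[:, k])
    prec, rec, _ = precision_recall_curve(y_bin[:, k], y_score[:, k])
    ax_roc.plot(fpr, tpr, label=f"{name} (AUC={auc(fpr, tpr):.2f})")
    ax_pr.plot(rec, prec, label=name)
ax_roc.set(xlabel="False positive rate", ylabel="True positive rate", title="ROC")
ax_pr.set(xlabel="Recall", ylabel="Precision", title="Precision-Recall")
ax_roc.legend(); ax_pr.legend()
plt.tight_layout()
plt.show()
```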
Fig. 9. Confusion Matrices for CNN, CNN–SM, and CBAM–SMK Models for new Figshare Data.
Fig. 10. ROC and P–R curves of CNN models for brain tumor classification. The top row depicts the ROC curves of the models: (a) Standard CNN, (b) CNN with SM, and (c)
CBAM–SMK. The bottom row depicts the P–R curves of the models: (d) Standard CNN, (e) CNN with SM, and (f) CBAM–SMK.
To enhance the evaluation of the proposed models, an additional analysis was conducted using a newly incorporated Figshare dataset, consisting of 300 glioma, 306 meningioma, 300 pituitary tumor, and 405 no tumor samples. This dataset provides a more comprehensive and diverse representation of brain tumor cases. The inclusion of the updated confusion matrices in Fig. 9 and performance metrics in Table 1 further substantiates the robustness and generalizability of the proposed models. Specifically, the CBAM–SMK model demonstrated superior performance compared to CNN and CNN–SM across both the original and new datasets, consistently achieving high accuracy. These results underscore the effectiveness of CBAM–SMK in advancing brain tumor classification.

Following the overall performance analysis, we present the detailed class-wise performance metrics for the CBAM–SMK, CNN, and CNN with SM models in Table 2. This table shows how each model performs on the four tumor classes: Glioma, Meningioma, Pituitary, and No Tumor. The results provide insight into each model's ability to detect specific tumor types.

We then compare the performance of the CBAM–SMK model with several baseline architectures, including ResNet-50, AlexNet, and MobileNet, using the same dataset and evaluation metrics. The comparison results are summarized in Table 3.

5. Discussion

The results indicate that CBAM–SMK significantly improves classification accuracy, likely due to the synergy between spatial and channel attention and multi-resolution feature representation.

Table 3 presents a comparative performance analysis of the CBAM–SMK model against three well-established deep learning models, ResNet-50, AlexNet, and MobileNet, on two MRI datasets: Old Figshare Data and New Figshare Data. The models were evaluated using standard classification metrics, including accuracy, CSI, precision, recall, F1-score, and MCC. These metrics provide a comprehensive evaluation of model performance in terms of both the ability to correctly classify brain tumors (accuracy, precision, recall) and the overall balance between positive and negative classifications (MCC, CSI).

The CBAM–SMK model demonstrates superior performance across all evaluation metrics, with accuracy reaching 97.1% and an MCC of 0.95, outperforming the baseline models, including ResNet-50 (94.4%) and MobileNet (93.5%), on both datasets. The enhanced performance of CBAM–SMK is attributed to the integration of CBAM and SMK, which enable more precise feature extraction and greater emphasis on tumor-related regions in MRI scans. This results in improved classification, especially in identifying subtle features, and offers a more computationally efficient solution compared to traditional CNN-based architectures.

To validate the performance improvements observed in our model, we conducted a statistical significance analysis.
Table 2
Class-wise performance metrics for CBAM–SMK, CNN, and CNN with SM models.

Metric          Model         Glioma  Meningioma  Pituitary  No Tumor
Accuracy (%)    CBAM–SMK      98.5    97.2        96.7       97.8
                CNN           95.0    94.0        94.2       95.2
                CNN with SM   96.0    94.5        94.1       96.3
Precision (%)   CBAM–SMK      98.0    96.5        97.2       98.1
                CNN           94.3    93.0        93.5       94.6
                CNN with SM   95.8    94.7        93.8       95.8
Recall (%)      CBAM–SMK      97.8    97.0        96.5       97.9
                CNN           95.5    94.5        94.1       95.3
                CNN with SM   96.2    95.0        94.2       96.1
F1-score (%)    CBAM–SMK      97.9    96.7        96.8       98.0
                CNN           94.9    93.7        93.8       94.9
                CNN with SM   95.9    94.8        94.0       96.0
MCC             CBAM–SMK      0.95    0.92        0.91       0.93
                CNN           0.91    0.86        0.88       0.90
                CNN with SM   0.92    0.87        0.89       0.91
Table 3
Performance comparison with baseline models. Columns: Model, Accuracy (%), Precision (%), Recall (%), F1-score (%), CSI (%), MCC.
Table 4
Statistical significance analysis of CBAM–SMK performance vs. baseline models.

Model comparison          Accuracy p-value  Precision p-value  Recall p-value  F1-score p-value
ResNet-50 vs. CBAM–SMK    0.002             0.005              0.003           0.004
AlexNet vs. CBAM–SMK      0.001             0.004              0.002           0.003
MobileNet vs. CBAM–SMK    0.003             0.006              0.004           0.005
Table 5
Comparative analysis of brain tumor classification with state-of-the-art models.
Author Architecture Accuracy (%)
Abiwinanda et al. [32] CNN 84.19
Afshar et al. [33] CapsNet 86.56
Paul et al. [34] CNN 91.43
Das et al. [35] CNN 94.39
Ayadi et al. [4] CNN 94.74
Swati et al. [36] VGG19 94.82
Saxena et al. [37] ResNet-50 95.0
Senan et al. [9] SVM 95.1
Khan et al. [38] VGG16 96.0
Badža and Barjaktarović [39] CNN 96.56
Kaya et al. [31] Fusion-Brain-Net 97.56
Proposed CBAM–SMK 97.00
We performed paired t-tests on key performance metrics, including accuracy, precision, recall, and F1-score, comparing CBAM–SMK with the baseline models ResNet-50, AlexNet, and MobileNet. The p-values obtained from these tests are presented in Table 4. As shown, all metrics exhibit statistically significant differences (with p-values less than 0.05), confirming that the improvements offered by CBAM–SMK are not due to random chance but are statistically meaningful.
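A paired t-test of this kind can be reproduced with SciPy as sketched below; the per-run accuracy lists are hypothetical placeholders, not the values behind Table 4.

```python
# Sketch of a paired t-test comparing per-run accuracies of two models.
from scipy import stats

cbam_smk_acc = [0.968, 0.971, 0.973, 0.969, 0.972]   # hypothetical per-run accuracy
resnet50_acc = [0.941, 0.945, 0.943, 0.946, 0.944]   # hypothetical per-run accuracy

t_stat, p_value = stats.ttest_rel(cbam_smk_acc, resnet50_acc)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")         # p < 0.05 -> significant difference
```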
A variety of CNN architectures have been explored in the evolving field of brain tumor classification, with accuracy levels varying across models. As shown in Table 5, earlier works such as Abiwinanda et al. [32] reported a conventional CNN achieving 84.19% accuracy, laying the groundwork for subsequent advancements. Afshar et al. [33] introduced CapsNet, which improved accuracy to 86.56%, while Paul et al. [34] and Das et al. [35] further refined CNN models, achieving 91.43% and 94.39%, respectively. Swati et al. [36] applied VGG19, attaining 94.82%, highlighting the promising use of pre-trained networks in medical imaging. Recent models like Badža and Barjaktarović [39] achieved 96.56% with their CNN. In comparison, the Fusion-Brain-Net model by Kaya et al. [31] demonstrated a slightly higher accuracy of 97.56%, showcasing the improvements in brain tumor classification achieved with advanced fusion techniques. The CBAM–SMK model presented in this study achieves 97.00% accuracy, outperforming all of these models except Fusion-Brain-Net. This highlights the importance of integrating advanced attention mechanisms with multi-resolution feature extraction to improve diagnostic precision in brain tumor classification.

Merits of the Proposed Model:

CRediT authorship contribution statement

Binish M.C.: Writing – review & editing, Writing – original draft, Validation, Software, Project administration, Methodology, Data curation, Conceptualization. Swarun Raj R.S.: Resources. Vinu Thomas: Supervision.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
[18] A. Kumar, J. Kim, D. Lyndon, M. Fulham, D. Feng, An ensemble of fine-tuned [29] A.B. Subba, A.K. Sunaniya, Computationally optimized brain tumor classification
convolutional neural networks for medical image classification, IEEE J. Biomed. using attention based GoogLeNet-style CNN, Expert Syst. Appl. 260 (2025)
Heal. Informatics 21 (1) (2016) 31–40. 125443.
[19] M. Shafiq, Z. Gu, Deep residual learning for image recognition: A survey, Appl. [30] E. Gürsoy, Y. Kaya, Brain-gcn-net: Graph-convolutional neural network for brain
Sci. 12 (18) (2022) 8972. tumor identification, Comput. Biol. Med. 180 (2024) 108971.
[20] J. Cao, C. Hu, L. Kong, Z. Yu, Expression recognition based on multi-level multi- [31] Y. Kaya, E. Akat, S. Yıldırım, Fusion-brain-net: A novel deep fusion model for
model fusion deep convolutional neural network, Highlights Sci. Eng. Technol. brain tumor classification, Brain Behav. 15 (5) (2025) e70520.
34 (2023) 232–237. [32] N. Abiwinanda, M. Hanif, S.T. Hesaputra, A. Handayani, T.R. Mengko, Brain
[21] S. Cong, Y. Zhou, A review of convolutional neural network architectures and tumor classification using convolutional neural network, in: World Congress on
their optimizations, Artif. Intell. Rev. 56 (3) (2023) 1905–1969. Medical Physics and Biomedical Engineering 2018 (Vol. 1), June 3-8, 2018,
[22] N. Chouhan, A. Khan, et al., Network anomaly detection using channel boosted Prague, Czech Republic, Springer, 2019, pp. 183–189.
and residual learning based deep convolutional neural network, Appl. Soft [33] P. Afshar, A. Mohammadi, K.N. Plataniotis, Brain tumor type classification
Comput. 83 (2019) 105612. via capsule networks, in: 2018 25th IEEE International Conference on Image
[23] Y. Wang, D. Qi, C. Zhao, Part-based multi-scale attention network for text-based Processing, ICIP, IEEE, 2018, pp. 3129–3133.
person search, in: Chinese Conference on Pattern Recognition and Computer [34] J.S. Paul, A.J. Plassard, B.A. Landman, D. Fabbri, Deep learning for brain tumor
Vision, PRCV, Springer, 2022, pp. 462–474. classification, in: Medical Imaging 2017: Biomedical Applications in Molecular,
[24] H. Huang, C. Pu, Y. Li, Y. Duan, Adaptive residual convolutional neural network Structural, and Functional Imaging, vol. 10137, SPIE, 2017, pp. 253–268.
for hyperspectral image classification, IEEE J. Sel. Top. Appl. Earth Obs. Remote. [35] S. Das, O.R.R. Aranya, N.N. Labiba, Brain tumor classification using convolutional
Sens. 13 (2020) 2520–2531. neural network, in: 2019 1st International Conference on Advances in Science,
[25] K. Yang, T. Xing, Y. Liu, Z. Li, X. Gong, X. Chen, D. Fang, cDeepArch: A compact Engineering and Robotics Technology, ICASERT, IEEE, 2019, pp. 1–5.
deep neural network architecture for mobile sensing, IEEE/ACM Trans. Netw. 27 [36] Z.N.K. Swati, Q. Zhao, M. Kabir, F. Ali, Z. Ali, S. Ahmed, J. Lu, Brain tumor
(5) (2019) 2043–2055. classification for MR images using transfer learning and fine-tuning, Comput.
[26] X. Wang, X. He, Y. Cao, M. Liu, T.-S. Chua, Kgat: Knowledge graph atten- Med. Imaging Graph. 75 (2019) 34–46.
tion network for recommendation, in: Proceedings of the 25th ACM SIGKDD [37] P. Saxena, A. Maheshwari, S. Maheshwari, Predictive modeling of brain tumor:
International Conference on Knowledge Discovery & Data Mining, 2019, pp. a deep learning approach, in: Innovations in Computational Intelligence and
950–958. Computer Vision: Proceedings of ICICV 2020, Springer, 2020, pp. 275–285.
[27] A. Bushara, R.V. Kumar, S. Kumar, An ensemble method for the detection and [38] N. Çınar, B. Kaya, M. Kaya, Comparison of deep learning models for brain tumor
classification of lung cancer using Computed Tomography images utilizing a classification using MRI images, in: 2022 International Conference on Decision
capsule network with Visual Geometry Group, Biomed. Signal Process. Control. Aid Sciences and Applications, DASA, IEEE, 2022, pp. 1382–1385.
85 (2023) 104930. [39] M.M. Badža, M. Barjaktarović, Classification of brain tumors from MRI images
[28] Q. Wang, B. Wu, P. Zhu, P. Li, W. Zuo, Q. Hu, ECA-Net: Efficient chan- using a convolutional neural network, Appl. Sci. 10 (6) (2020) 1999.
nel attention for deep convolutional neural networks, in: Proceedings of the
IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020, pp.
11534–11542.