Explainable Multi-Module Semantic Guided Attention Network for Accurate Medical Image Segmentation
DOI: https://doi.org/10.22399/ijcesen.2063
Keywords: Medical image segmentation, deep learning, explainability, attention mechanism, semantic guidance, accuracy
Abstract
Accurate medical image segmentation is critical to a wide range of clinical applications, playing a vital role in disease diagnosis and treatment planning. This work applies the Explainable Multi-Module Semantic Guided Attention Network (EM-SGAN), trained with the AMSGrad variant of Adaptive Moment Estimation, to breast cancer image segmentation. EM-SGAN is a deep learning model that integrates multiple modules to improve both the accuracy and the interpretability of the segmentation process; its key components are an encoder-decoder framework, an attention mechanism, a semantic guidance module, and an explainability module. The AMSGrad optimizer maintains a non-decreasing estimate of the gradient's second moment, remedying the convergence failures that an unbounded (freely decreasing) second-moment estimate can cause in Adam, and thereby yields stable convergence during training. Experimental evaluations on breast cancer image segmentation tasks demonstrate that EM-SGAN with AMSGrad accurately segments cancerous regions. The proposed approach advances medical image segmentation by offering a dependable and interpretable solution for breast cancer analysis.
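The AMSGrad update used above can be sketched as follows. This is a minimal NumPy illustration of the update rule only, not the authors' implementation; the quadratic test objective, learning rate, and other hyperparameters are assumed for demonstration.

```python
import numpy as np

def amsgrad_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad parameter update. `state` holds (m, v, v_hat)."""
    m, v, v_hat = state
    m = beta1 * m + (1 - beta1) * grad            # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad ** 2       # second-moment EMA
    v_hat = np.maximum(v_hat, v)                  # non-decreasing bound: the AMSGrad fix
    theta = theta - lr * m / (np.sqrt(v_hat) + eps)
    return theta, (m, v, v_hat)

# Toy usage: minimize f(x) = x^2 starting from x = 5.0.
theta = np.array([5.0])
state = (np.zeros(1), np.zeros(1), np.zeros(1))
for _ in range(5000):
    grad = 2 * theta                              # analytic gradient of x^2
    theta, state = amsgrad_step(theta, grad, state, lr=0.05)
```

The only difference from plain Adam is the `np.maximum` line: because `v_hat` never decreases, the effective per-coordinate step size never grows, which is what gives AMSGrad its convergence guarantee.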
License
Copyright (c) 2025 International Journal of Computational and Experimental Science and Engineering

This work is licensed under a Creative Commons Attribution 4.0 International License.