DG-CNN: Introducing Margin Information into Convolutional Neural Networks for Breast Cancer Diagnosis in Ultrasound Images
Abstract:
Background
Thanks to the rapid development of deep learning, computer-aided diagnosis based on deep learning, especially convolutional neural networks, has made tremendous progress in the past few years. However, small medical datasets remain the major bottleneck in this field. To address this problem, researchers have begun to look for auxiliary information beyond the medical datasets themselves. Previous work mainly leveraged information from natural images via transfer learning. More recent studies have tried to introduce the prior knowledge of medical practitioners, for example by letting networks resemble how practitioners are trained or how they read images, or by using extra annotations provided by doctors. Introducing such information has greatly improved the diagnostic performance of the networks.
Objective
We seek to identify and exploit another kind of prior knowledge and apply it to computer-aided breast cancer diagnosis in ultrasound images. Specifically, we study how this prior knowledge can be represented and how it can be incorporated into convolutional neural networks, and we validate the diagnostic performance after it is incorporated.
Methods
In this paper, we propose a scheme called Domain Guided-CNN (DG-CNN) to incorporate prior medical knowledge, here the margin information of lesions, into computer-aided breast cancer diagnosis based on ultrasound images. As a feature described in the consensus followed by radiologists when diagnosing cancer in breast ultrasound images, margin information plays a crucial role in the final diagnosis. In DG-CNN, we first generate attention maps that describe the margin area of the tumor and then merge them into the network through different approaches. Specifically, we design three kinds of margin attention maps with different generation methods, and four incorporation approaches: three direct fusion modes and one multi-task learning mode.
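The paper does not spell out how its margin attention maps are produced; a common way to obtain a band around a lesion contour, assuming a binary lesion mask is available, is the difference between a dilated and an eroded copy of the mask. The sketch below is a minimal NumPy illustration of that idea (a real pipeline would use `cv2.dilate`/`scipy.ndimage` instead of the naive shift-and-OR dilation here), not the authors' implementation:

```python
import numpy as np

def margin_attention_map(mask: np.ndarray, width: int = 2) -> np.ndarray:
    """Highlight a band around the lesion boundary.

    `mask` is a binary lesion mask (H x W). The margin map is the
    difference between a dilated and an eroded copy of the mask,
    i.e. a band roughly 2 * `width` pixels wide straddling the contour.
    """
    def dilate(m, iterations):
        # Naive binary dilation with a 4-neighbor cross: OR of shifts.
        out = m.astype(bool)
        for _ in range(iterations):
            p = np.pad(out, 1)
            out = (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
                   | p[1:-1, :-2] | p[1:-1, 2:])
        return out

    def erode(m, iterations):
        # Erosion is dilation of the complement, complemented.
        return ~dilate(~m.astype(bool), iterations)

    return (dilate(mask, width) & ~erode(mask, width)).astype(np.float32)

# One possible "direct fusion" mode (hypothetical, not necessarily the
# paper's): stack the map with the grayscale image as an extra channel.
# x = np.stack([image, margin_attention_map(mask)], axis=0)
```

The resulting map is zero inside the tumor core and in the background, and one in a narrow band around the contour, so the network's attention is steered to exactly the region radiologists inspect for margin features.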
Results
We tested the performance of DG-CNN on our own dataset (1485 ultrasound images) and on a public dataset. The results show that DG-CNN can be applied to different network structures, such as VGG and ResNet, and improves their diagnostic performance to varying degrees. On our dataset, with a certain incorporation mode, DG-CNN on the ResNet18 backbone improves breast cancer diagnosis accuracy by 2.17%, sensitivity by 1.69%, specificity by 2.64%, and AUC by 0.0257.
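For reference, the four reported metrics can be computed from predicted malignancy probabilities as below. This is a generic NumPy sketch (AUC via the Mann-Whitney rank-sum identity), not the paper's evaluation code; the 0.5 threshold is an assumption:

```python
import numpy as np

def diagnosis_metrics(y_true, y_score, threshold=0.5):
    """Accuracy, sensitivity (recall on malignant), specificity
    (recall on benign), and AUC. y_true: 0 = benign, 1 = malignant."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score, dtype=float)
    y_pred = (y_score >= threshold).astype(int)

    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))

    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)

    # AUC via the rank-sum (Mann-Whitney U) identity, with tied
    # scores assigned their average rank.
    order = np.argsort(y_score, kind="mergesort")
    ranks = np.empty(len(y_score))
    ranks[order] = np.arange(1, len(y_score) + 1)
    for s in np.unique(y_score):
        tie = y_score == s
        ranks[tie] = ranks[tie].mean()
    n_pos = np.sum(y_true == 1)
    n_neg = np.sum(y_true == 0)
    auc = (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
    return accuracy, sensitivity, specificity, auc
```

Note that the paper reports the AUC gain as an absolute difference (0.0257), while accuracy, sensitivity, and specificity gains are percentage points.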
Conclusions
The experiments show that prior knowledge extracted from medical consensus (margin information) helps improve the performance of breast cancer diagnosis in ultrasound images. To the best of our knowledge, this is the first time that margin information has been used to improve the performance of deep neural networks in diagnosing breast cancer in ultrasound images. We also believe that, in many more areas of computer-aided diagnosis, effectively exploiting prior knowledge will likewise substantially improve diagnostic performance.

Abstract: Although using convolutional neural networks (CNN) for computer-aided diagnosis (CAD) has made tremendous progress in the last few years, small medical datasets remain the major bottleneck in this area. To address this problem, researchers have started looking for information beyond the medical datasets. Previous efforts mainly leveraged information from natural images via transfer learning. More recent research work focuses on integrating knowledge from medical practitioners, either letting networks resemble how practitioners are trained, how they view images, or using extra annotations. In this paper, we propose a scheme named Domain Guided-CNN (DG-CNN) to incorporate the margin information, a feature described in the consensus for radiologists to diagnose cancer in breast ultrasound (BUS) images. In DG-CNN, attention maps that highlight margin areas of tumors are first generated, and then incorporated via different approaches into the networks. We have tested the performance of DG-CNN on our own dataset (including 1485 ultrasound images) and on a public dataset. The results show that DG-CNN can be applied to different network structures like VGG and ResNet to improve their performance. For example, experimental results on our dataset show that with a certain integrating mode, the improvement of using DG-CNN over a baseline network structure ResNet18 is 2.17% in accuracy, 1.69% in sensitivity, 2.64% in specificity and 2.57% in AUC (Area Under Curve). To the best of our knowledge, this is the first time that the margin information is utilized to improve the performance of deep neural networks in diagnosing breast cancer in BUS images.
Keywords:
- medical consensus
- domain knowledge
- breast cancer diagnosis
- margin map
- deep neural network