12 research outputs found

    LightBTSeg: A lightweight breast tumor segmentation model using ultrasound images via dual-path joint knowledge distillation

    Full text link
    Accurate segmentation of breast tumors is an important prerequisite for lesion detection and has significant clinical value for breast tumor research. Mainstream deep learning-based methods have achieved a breakthrough in accuracy. However, these high-performance segmentation methods are difficult to deploy in clinical scenarios because they typically involve high computational complexity, massive parameter counts, slow inference, and large memory consumption. To tackle this problem, we propose LightBTSeg, a dual-path joint knowledge distillation framework for lightweight breast tumor segmentation. Concretely, we design a double-teacher model that represents the fine-grained features of breast ultrasound according to the different semantic feature alignments of benign and malignant breast tumors. Specifically, we use a bottleneck architecture to reconstruct the original Attention U-Net into a lightweight student model named Simplified U-Net. Then, the prior knowledge of the benign and malignant categories is used to design the teacher networks, combined with dual-path joint knowledge distillation, which transfers knowledge from the cumbersome benign and malignant teachers to the lightweight student model. Extensive experiments on the breast ultrasound images dataset (Dataset BUSI) and Breast Ultrasound Dataset B (Dataset B) demonstrate that LightBTSeg outperforms various counterparts. Comment: 7 pages, 7 figures, conference
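    The abstract gives no implementation details, so the following is only a minimal sketch of dual-teacher knowledge distillation for binary segmentation, written in PyTorch; the function name, the per-sample teacher selection, the temperature, and the loss weighting are all assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn.functional as F

def dual_teacher_distillation_loss(student_logits, benign_t_logits, malignant_t_logits,
                                   target, is_benign, temperature=2.0, alpha=0.5):
    """Hypothetical dual-teacher distillation loss for binary segmentation.

    student_logits, *_t_logits: (B, 1, H, W) raw logits.
    target: (B, 1, H, W) ground-truth masks in {0, 1}.
    is_benign: (B,) bool tensor selecting which teacher supervises each image.
    """
    # Supervised term: binary cross-entropy against the ground-truth mask.
    sup = F.binary_cross_entropy_with_logits(student_logits, target)

    # Distillation term: pick the category-specific teacher per sample and
    # match softened probability maps between teacher and student.
    t_logits = torch.where(is_benign.view(-1, 1, 1, 1),
                           benign_t_logits, malignant_t_logits)
    s_soft = torch.sigmoid(student_logits / temperature)
    t_soft = torch.sigmoid(t_logits.detach() / temperature)
    distill = F.mse_loss(s_soft, t_soft)

    return (1 - alpha) * sup + alpha * distill
```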

    Gradually Applying Weakly Supervised and Active Learning for Mass Detection in Breast Ultrasound Images

    Full text link
    We propose a method for effectively utilizing weakly annotated image data in object detection tasks on breast ultrasound images. In a setting where a small, strongly annotated dataset and a large, weakly annotated dataset with no bounding-box information are available, training an object detection model becomes a non-trivial problem. We suggest a controlled weight for handling the effect of weakly annotated images in a two-stage object detection model. We also present a subsequent active learning scheme for safely assigning strong annotations to weakly annotated images using the trained model. Experimental results showed a 24 percentage point increase in the correct localization (CorLoc) measure, the ratio of correctly localized and classified images, when the properly controlled weight was used. Performing active learning after the model was trained yielded a further increase in CorLoc. We also tested the proposed method on the Stanford Dogs dataset to verify that it can be applied to general cases where strong annotations are insufficient, and obtained similar results. The presented method shows that higher performance is achievable with less annotation effort.
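    To make the idea concrete, here is a rough sketch (assumed, not the paper's formulation) of two pieces the abstract describes: a loss that down-weights weakly annotated images via a controlled weight, and an active-learning-style step that promotes confident detections on weakly annotated images to pseudo strong annotations. The detector's call signature and the threshold values are placeholders.

```python
import torch

def mixed_supervision_loss(strong_losses, weak_losses, weak_weight=0.1):
    """Illustrative total loss for a batch mixing strongly and weakly annotated
    images: box-supervised images get full weight, image-level-labeled images
    get a controlled, smaller weight so they cannot dominate training."""
    strong = torch.stack(strong_losses).mean() if strong_losses else 0.0
    weak = torch.stack(weak_losses).mean() if weak_losses else 0.0
    return strong + weak_weight * weak

def promote_confident_images(model, weak_loader, score_threshold=0.9):
    """Active-learning-style step: run the trained detector on weakly annotated
    images and keep predictions above a confidence threshold as pseudo strong
    annotations (threshold and model API are assumptions)."""
    promoted = []
    model.eval()
    with torch.no_grad():
        for images, image_ids in weak_loader:
            for img, img_id in zip(images, image_ids):
                boxes, scores, labels = model(img.unsqueeze(0))  # assumed detector API
                keep = scores > score_threshold
                if keep.any():
                    promoted.append((img_id, boxes[keep], labels[keep]))
    return promoted
```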

    AAU-net: An Adaptive Attention U-net for Breast Lesions Segmentation in Ultrasound Images

    Full text link
    Various deep learning methods have been proposed to segment breast lesions from ultrasound images. However, similar intensity distributions, variable tumor morphology, and blurred boundaries present challenges for breast lesion segmentation, especially for malignant tumors with irregular shapes. Considering the complexity of ultrasound images, we develop an adaptive attention U-net (AAU-net) to segment breast lesions automatically and stably from ultrasound images. Specifically, we introduce a hybrid adaptive attention module, which mainly consists of a channel self-attention block and a spatial self-attention block, to replace the traditional convolution operation. Compared with the conventional convolution operation, the hybrid adaptive attention module helps capture more features under different receptive fields. Unlike existing attention mechanisms, it guides the network to adaptively select more robust representations in the channel and spatial dimensions to cope with more complex breast lesion segmentation. Extensive experiments against several state-of-the-art deep learning segmentation methods on three public breast ultrasound datasets show that our method performs better on breast lesion segmentation. Furthermore, robustness analysis and external experiments demonstrate that the proposed AAU-net generalizes better on the segmentation of breast lesions. Moreover, the hybrid adaptive attention module can be flexibly applied to existing network frameworks. Comment: Breast cancer segmentation, Ultrasound images, Hybrid attention, Adaptive learning, Deep learning
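    For readers unfamiliar with combined channel and spatial attention, the block below is a generic SE/CBAM-style gating module in PyTorch, offered only as an assumed stand-in for the paper's hybrid adaptive attention module; the class name, reduction ratio, and kernel size are illustrative choices.

```python
import torch
import torch.nn as nn

class HybridAttentionBlock(nn.Module):
    """Illustrative channel + spatial attention gate (SE/CBAM-style); an assumed
    stand-in for AAU-net's hybrid adaptive attention module, not its code."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        # Channel attention: squeeze spatial dims, produce per-channel weights.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: a 7x7 conv over pooled channel statistics.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)
        return x * self.spatial_gate(pooled)
```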

    FSS-2019-nCov: A deep learning architecture for semi-supervised few-shot segmentation of COVID-19 infection

    Get PDF
    The newly discovered coronavirus (COVID-19) pneumonia poses major research challenges in terms of diagnosis and disease quantification. Deep-learning (DL) techniques allow extremely precise image segmentation, yet they require huge volumes of manually labeled data for supervised training. Few-Shot Learning (FSL) paradigms tackle this issue by learning a novel category from a small number of annotated instances. We present an innovative semi-supervised few-shot segmentation (FSS) approach for efficient segmentation of 2019-nCov infection (FSS-2019-nCov) from only a small number of annotated lung CT scans. The key challenge of this study is to provide accurate segmentation of COVID-19 infection from a limited number of annotated instances. For that purpose, we propose a novel dual-path deep-learning architecture for FSS. Each path contains an encoder–decoder (E-D) architecture to extract high-level information while maintaining the channel information of COVID-19 CT slices. The E-D architecture primarily consists of three main modules: a feature encoder module, a context enrichment (CE) module, and a feature decoder module. We use a pre-trained ResNet34 as the encoder backbone for feature extraction. The CE module is built from a newly proposed Smoothed Atrous Convolution (SAC) block and a Multi-scale Pyramid Pooling (MPP) block. The conditioner path takes pairs of CT images and their labels as input and produces a relevant knowledge representation, which is transferred to the segmentation path and used to segment new images. To enable effective collaboration between the two paths, we propose an adaptive recombination and recalibration (RR) module that permits intensive knowledge exchange between the paths with only a trivial increase in computational complexity. The model is extended to multi-class labeling of various types of lung infections. This contribution mitigates the lack of large numbers of annotated COVID-19 CT scans. It also provides a general framework for lung disease diagnosis in limited-data situations.
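    As a rough illustration of the kind of multi-scale pyramid pooling used for context enrichment, the sketch below is a generic PSPNet-style pooling module in PyTorch; it is an assumption standing in for the paper's MPP block, and the bin sizes and channel split are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScalePyramidPooling(nn.Module):
    """Generic pyramid-pooling context module (PSPNet-style), offered as an
    assumed illustration of an MPP-like block, not the paper's actual code."""

    def __init__(self, in_channels, bin_sizes=(1, 2, 3, 6)):
        super().__init__()
        branch_channels = in_channels // len(bin_sizes)
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(size),
                nn.Conv2d(in_channels, branch_channels, kernel_size=1),
                nn.ReLU(inplace=True),
            )
            for size in bin_sizes
        ])

    def forward(self, x):
        h, w = x.shape[-2:]
        # Pool at several scales, project, upsample back to the input
        # resolution, and concatenate with the original feature map.
        feats = [x] + [
            F.interpolate(branch(x), size=(h, w),
                          mode="bilinear", align_corners=False)
            for branch in self.branches
        ]
        return torch.cat(feats, dim=1)
```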

    Semi-supervised Learning for Real-time Segmentation of Ultrasound Video Objects: A Review

    Get PDF
    Real-time intelligent segmentation of ultrasound video objects is a demanding task in medical image processing and an essential, critical step in image-guided clinical procedures. However, obtaining reliable and accurate medical image annotations often requires expert guidance, making the acquisition of large-scale annotated datasets challenging and costly, which presents obstacles for traditional supervised learning methods. Consequently, semi-supervised learning (SSL) has emerged as a promising solution capable of using unlabeled data to enhance model performance, and it has been widely adopted in medical image segmentation tasks. However, striking a balance between segmentation accuracy and inference speed remains a challenge for real-time segmentation. This paper provides a comprehensive review of research progress in real-time intelligent semi-supervised ultrasound video object segmentation (SUVOS) and offers insights into future developments in this area.
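    As one common SSL recipe of the kind such a review covers, the sketch below shows a minimal pseudo-labeling training step for binary segmentation in PyTorch; it is not tied to any specific method in the paper, and the confidence threshold and loss weighting are illustrative defaults.

```python
import torch
import torch.nn.functional as F

def ssl_training_step(model, labeled_batch, unlabeled_batch, optimizer,
                      conf_threshold=0.8, unsup_weight=1.0):
    """Minimal pseudo-labeling step: supervised loss on labeled frames plus a
    loss on confident pseudo-labels for unlabeled frames (values illustrative)."""
    images, masks = labeled_batch          # masks: (B, 1, H, W) in {0, 1}
    u_images = unlabeled_batch

    # Supervised term on labeled frames.
    sup_loss = F.binary_cross_entropy_with_logits(model(images), masks)

    # Generate pseudo-labels and a per-pixel confidence mask without gradients.
    with torch.no_grad():
        probs = torch.sigmoid(model(u_images))
        pseudo = (probs > 0.5).float()
        confident = ((probs > conf_threshold) | (probs < 1 - conf_threshold)).float()

    # Unsupervised term, restricted to confident pixels via the weight mask.
    unsup_loss = F.binary_cross_entropy_with_logits(
        model(u_images), pseudo, weight=confident)

    loss = sup_loss + unsup_weight * unsup_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```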

    AAU-Net: an Adaptive Attention U-Net for breast lesions segmentation in ultrasound images

    Get PDF
    Various deep learning methods have been proposed to segment breast lesions from ultrasound images. However, similar intensity distributions, variable tumor morphologies, and blurred boundaries present challenges for breast lesion segmentation, especially for malignant tumors with irregular shapes. Considering the complexity of ultrasound images, we develop an adaptive attention U-net (AAU-net) to segment breast lesions automatically and stably from ultrasound images. Specifically, we introduce a hybrid adaptive attention module (HAAM), which mainly consists of a channel self-attention block and a spatial self-attention block, to replace the traditional convolution operation. Compared with the conventional convolution operation, the HAAM module helps capture more features under different receptive fields. Unlike existing attention mechanisms, the HAAM module guides the network to adaptively select more robust representations in the channel and spatial dimensions to cope with more complex breast lesion segmentation. Extensive experiments against several state-of-the-art deep learning segmentation methods on three public breast ultrasound datasets show that our method performs better on breast lesion segmentation. Furthermore, robustness analysis and external experiments demonstrate that the proposed AAU-net has better generalization performance in breast lesion segmentation. Moreover, the HAAM module can be flexibly applied to existing network frameworks. The source code is available at https://github.com/CGPxy/AAU-net.

    Deep-Learning-Based Computer-Aided Systems for Breast Cancer Imaging: A Critical Review

    Full text link
    This paper provides a critical review of the literature on deep learning applications in breast tumor diagnosis using ultrasound and mammography images. It also summarizes recent advances in computer-aided diagnosis/detection (CAD) systems, which use new deep learning methods to automatically recognize breast images and improve the accuracy of diagnoses made by radiologists. The review is based on literature published in the past decade (January 2010-January 2020): around 250 research articles were retrieved, and after an eligibility process, 59 articles were examined in more detail. The main findings in the classification process reveal that new DL-CAD methods are useful and effective screening tools for breast cancer, reducing the need for manual feature extraction. The breast tumor research community can use this survey as a basis for their current and future studies. This project was co-financed by the Spanish Government Grant PID2019-107790RB-C22, "Software development for a continuous PET crystal systems applied to breast cancer". Jiménez-Gaona, Y.; Rodríguez Álvarez, M. J.; Lakshminarayanan, V. (2020). Deep-Learning-Based Computer-Aided Systems for Breast Cancer Imaging: A Critical Review. Applied Sciences, 10(22), 1-29. https://doi.org/10.3390/app10228298