787 research outputs found

    Automated Breast Ultrasound Lesions Detection Using Convolutional Neural Networks

    Breast lesion detection using ultrasound imaging is considered an important step of Computer-Aided Diagnosis systems. Over the past decade, researchers have demonstrated the possibility of automating initial lesion detection. However, the lack of a common dataset impedes research when comparing the performance of such algorithms. This paper proposes the use of deep learning approaches for breast ultrasound lesion detection and investigates three different methods: a Patch-based LeNet, a U-Net, and a transfer learning approach with a pretrained FCN-AlexNet. Their performance is compared against four state-of-the-art lesion detection algorithms (i.e. Radial Gradient Index, Multifractal Filtering, Rule-based Region Ranking and Deformable Part Models). In addition, this paper compares and contrasts two conventional ultrasound image datasets acquired from two different ultrasound systems. Dataset A comprises 306 images (60 malignant and 246 benign) and Dataset B comprises 163 images (53 malignant and 110 benign). To overcome the lack of public datasets in this domain, Dataset B will be made available for research purposes. The results demonstrate an overall improvement by the deep learning approaches when assessed on both datasets in terms of True Positive Fraction, False Positives per image, and F-measure.
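
    As a rough, hedged illustration of the evaluation measures named above (True Positive Fraction, False Positives per image, and F-measure), the Python sketch below aggregates per-image detection counts; the matching rule that would produce those counts (e.g. a detection centre falling inside the annotated lesion) is an assumption for illustration, not necessarily the protocol used in the paper.

        # Minimal sketch (assumed matching rule, not the paper's exact protocol):
        # aggregate per-image detection counts into TPF, FPs/image and F-measure.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class ImageResult:
            n_lesions: int          # ground-truth lesions in the image
            n_true_positives: int   # detections matched to a lesion
            n_false_positives: int  # detections that matched no lesion

        def detection_metrics(results: List[ImageResult]):
            total_lesions = sum(r.n_lesions for r in results)
            tp = sum(r.n_true_positives for r in results)
            fp = sum(r.n_false_positives for r in results)
            tpf = tp / total_lesions if total_lesions else 0.0      # True Positive Fraction (recall)
            fps_per_image = fp / len(results) if results else 0.0   # False Positives per image
            precision = tp / (tp + fp) if (tp + fp) else 0.0
            f_measure = 2 * precision * tpf / (precision + tpf) if (precision + tpf) else 0.0
            return tpf, fps_per_image, f_measure

        # Example: three images, one annotated lesion each.
        print(detection_metrics([ImageResult(1, 1, 0), ImageResult(1, 1, 2), ImageResult(1, 0, 1)]))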

    Breast Ultrasound Region of Interest Detection and Lesion Localisation

    © 2020 Elsevier B.V. In current breast ultrasound computer-aided diagnosis systems, the radiologist preselects a region of interest (ROI) as an input for computerised breast ultrasound image analysis. This task is time consuming and there is inconsistency among human experts. Researchers attempting to automate the process of obtaining the ROIs have been relying on image processing and conventional machine learning methods. We propose the use of a deep learning method for breast ultrasound ROI detection and lesion localisation. We use the most accurate object detection deep learning framework – Faster-RCNN with Inception-ResNet-v2 – as our deep learning network. Due to the lack of datasets, we use transfer learning and propose a new 3-channel artificial RGB method to improve the overall performance. We evaluate and compare the performance of our proposed methods on two datasets (namely, Dataset A and Dataset B), i.e. within the individual datasets and on a composite dataset. We report the lesion detection results with two types of analysis: (1) detected point (centre of the segmented region or the detected bounding box) and (2) Intersection over Union (IoU). Our results demonstrate that the proposed methods achieved comparable results on detected point, but with notable improvement on IoU. In addition, our proposed 3-channel artificial RGB method improves the recall on Dataset A. Finally, we outline some future directions for this research.
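
    For reference, the Intersection over Union criterion mentioned above can be computed for axis-aligned bounding boxes as in the sketch below; the (x1, y1, x2, y2) corner convention is an assumption made for illustration.

        # Minimal sketch: IoU between two axis-aligned boxes given as (x1, y1, x2, y2).
        def iou(box_a, box_b):
            ax1, ay1, ax2, ay2 = box_a
            bx1, by1, bx2, by2 = box_b
            # Intersection rectangle (empty if the boxes do not overlap).
            ix1, iy1 = max(ax1, bx1), max(ay1, by1)
            ix2, iy2 = min(ax2, bx2), min(ay2, by2)
            inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
            area_a = (ax2 - ax1) * (ay2 - ay1)
            area_b = (bx2 - bx1) * (by2 - by1)
            union = area_a + area_b - inter
            return inter / union if union > 0 else 0.0

        # Example: predicted lesion box vs. ground-truth box (illustrative values).
        print(iou((10, 10, 50, 40), (20, 15, 60, 45)))  # ~0.45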

    A Survey on Deep Learning in Medical Image Analysis

    Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared in the last year. We survey the use of deep learning for image classification, object detection, segmentation, registration, and other tasks and provide concise overviews of studies per application area. Open challenges and directions for future research are discussed. Comment: the revised survey includes an expanded discussion section and a reworked introductory section on common deep architectures, and adds missed papers from before Feb 1st, 2017.

    Deep learning in medical imaging and radiation therapy

    Peer reviewed. Full text: https://deepblue.lib.umich.edu/bitstream/2027.42/146980/1/mp13264_am.pdf and https://deepblue.lib.umich.edu/bitstream/2027.42/146980/2/mp13264.pdf

    Needle and Biopsy Robots: a Review

    Purpose of the review: Robotics is a rapidly advancing field, and its introduction in healthcare can have a multitude of benefits for clinical practice. In particular, applications that depend on the radiologist's accuracy and precision, such as percutaneous interventions, may profit. This paper provides an overview of recent robot-assisted percutaneous solutions. Recent findings: Percutaneous interventions are relatively simple, and the quality of the procedure improves considerably with the introduction of robotics owing to increased accuracy and precision. The success of the procedure is heavily dependent on the ability to merge pre- and intraoperative images, as an accurate estimate of the current target location allows the robot's capabilities to be exploited. Summary: Despite much research, the application of robotics in some branches of healthcare is not yet commonplace. Recent advances in percutaneous robotic solutions and imaging are highlighted, as they will pave the way to more widespread implementation of robotics in clinical practice.
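
    As a minimal sketch of the image-fusion step described above (mapping a target localised in a preoperative image into the intraoperative robot frame), the code below applies a rigid transform consisting of a rotation and a translation; the transform values are made up, and estimating them is assumed to be handled by a separate registration step.

        # Minimal sketch: map a preoperative target point into the intraoperative
        # (robot) frame with a rigid transform x' = R @ x + t. Estimating R and t
        # is assumed to be done by a separate image-registration step.
        import numpy as np

        def map_target(target_xyz, rotation, translation):
            """Apply x' = R @ x + t to a 3D target position (e.g. in millimetres)."""
            return rotation @ np.asarray(target_xyz, dtype=float) + translation

        # Example with made-up values: 10-degree rotation about z plus a small shift.
        theta = np.deg2rad(10.0)
        R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])
        t = np.array([2.0, -1.5, 0.5])
        print(map_target([30.0, 12.0, 45.0], R, t))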

    The feasibility of conducting manual image segmentation of 3D sonographic images of axillary lymph nodes

    The current standards of axillary lymph node staging following cancer diagnosis involve significant ionizing radiation, morbidity, and cost. While sonography has not yet been used in this capacity, three-dimensional ultrasound has been used in the modeling of breast tumors with some success. This study was interested in determining whether 3D ultrasound could be used to manually segment axillary lymph node images in order to determine a reliable volume of possible later clinical significance. Fourteen volunteers were recruited and one to three of their axillary lymph nodes were imaged using a 4D16L GE transducer specially designed to make volume measurements. Each node was then manually segmented using a deformable snake by two licensed sonographers and one untrained student, and their volume results were compared. Between the licensed sonographers, correlations of r = 0.9 were achieved for nodes 1, 2, and 3. Between the student and Sonographer A, correlations of r = 0.6, 0.9, and 0.9, respectively, were achieved, while with Sonographer B the correlations were r = 0.5, 0.9, and 0.9. This demonstrated the feasibility and validity of this technology, and also indicated the importance of increased operator training. Three-dimensional ultrasound should not be ruled out as a non-ionizing, less invasive alternative in lymph node staging. Advisor: Dr. Kevin Evans. No embargo.
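
    The inter-rater agreement reported above can be reproduced in outline as a Pearson correlation between the volumes measured by two readers; the sketch below uses made-up volume values purely for illustration, not the study's data.

        # Minimal sketch: Pearson correlation r between node volumes segmented by
        # two readers. The volumes below are illustrative, not study data.
        import numpy as np

        def pearson_r(volumes_reader_a, volumes_reader_b):
            a = np.asarray(volumes_reader_a, dtype=float)
            b = np.asarray(volumes_reader_b, dtype=float)
            return float(np.corrcoef(a, b)[0, 1])

        # Example: volumes (cm^3) of the same nodes measured by two readers.
        reader_a = [0.42, 0.55, 0.31, 0.60, 0.48]
        reader_b = [0.40, 0.58, 0.35, 0.57, 0.50]
        print(pearson_r(reader_a, reader_b))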

    Machine Learning in Robotic Ultrasound Imaging: Challenges and Perspectives

    This article reviews recent advances in intelligent robotic ultrasound (US) imaging systems. We commence by presenting the commonly employed robotic mechanisms and control techniques in robotic US imaging, along with their clinical applications. Subsequently, we focus on the deployment of machine learning techniques in the development of robotic sonographers, emphasizing crucial developments aimed at enhancing the intelligence of these systems. The methods for achieving autonomous action reasoning are categorized into two sets of approaches: those relying on implicit environmental data interpretation and those using explicit interpretation. Throughout this exploration, we also discuss practical challenges, including the scarcity of medical data, the need for a deeper understanding of the physical aspects involved, and effective data representation approaches. Moreover, we conclude by highlighting the open problems in the field and analyzing different possible perspectives on how the community could move forward in this research area. Comment: Accepted by Annual Review of Control, Robotics, and Autonomous Systems.

    Automated Deformable Mapping Methods to Relate Corresponding Lesions in 3D X-ray and 3D Ultrasound Breast Images

    Mammography is the current standard imaging method for detecting breast cancer, using x-rays to produce 2D images of the breast. However, with mammography alone it is difficult to determine whether a lesion is benign or malignant, and sensitivity for detecting lesions in dense breasts is reduced. Ultrasound imaging used in conjunction with mammography has made valuable contributions to lesion characterization by differentiating between solid and cystic lesions. Conventional breast ultrasound has high false-positive rates; however, it has shown an improved ability to detect lesions in dense breasts. Breast ultrasound is typically performed freehand to produce anterior-to-posterior 2D images in a different geometry (supine) than mammography (upright). This difference in geometries is likely responsible for the finding that, at least 10% of the time, lesions found in the ultrasound images do not correspond with lesions found in mammograms. To solve this problem, additional imaging techniques must be investigated to aid a radiologist in identifying corresponding lesions in the two modalities and to ensure early detection of a potential cancer. This dissertation describes and validates automated deformable mapping methods to register and relate corresponding lesions between multi-modality images acquired using 3D mammography (Digital Breast Tomosynthesis (DBT) and dedicated breast Computed Tomography (bCT)) and 3D ultrasound (Automated Breast Ultrasound (ABUS)). The methodology involves the use of finite element modeling and analysis to simulate the differences in compression and breast orientation to better align lesions acquired from images from these modalities. Preliminary studies were performed using several multimodality compressible breast phantoms to determine breast lesion registrations between: i) cranio-caudal (CC) and mediolateral oblique (MLO) DBT views and ABUS, ii) simulated bCT and DBT (CC and MLO views), and iii) simulated bCT and ABUS. Distances between the centers of mass (dCOM) of corresponding lesions were used to assess the deformable mapping method. These phantom studies showed the potential to apply this technique to real breast lesions, with mean dCOM registration values as low as 4.9 ± 2.4 mm for DBT (CC view) mapped to ABUS, 9.3 ± 2.8 mm for DBT (MLO view) mapped to ABUS, 4.8 ± 2.4 mm for bCT mapped to ABUS, 5.0 ± 2.2 mm for bCT mapped to DBT (CC view), and 4.7 ± 2.5 mm for bCT mapped to DBT (MLO view). All of the phantom studies showed that using external fiducial markers helped improve the registration capability of the deformable mapping algorithm. An IRB-approved proof-of-concept study was performed with patient volunteers to validate the deformable registration method on 5 patient datasets, with a total of up to 7 lesions, for DBT (CC and MLO views) to ABUS registration. The resulting dCOMs using the deformable method showed statistically significant improvements over rigid registration techniques, with a mean dCOM of 11.6 ± 5.3 mm for DBT (CC view) mapped to ABUS and a mean dCOM of 12.3 ± 4.8 mm for DBT (MLO view) mapped to ABUS. The present work demonstrates the potential for using deformable registration techniques to relate corresponding lesions in 3D x-ray and 3D ultrasound images. This methodology should improve a radiologist's characterization of breast lesions, which can reduce patient callbacks, misdiagnoses, additional patient dose, and unnecessary biopsies. Additionally, this technique can save a radiologist time in navigating 3D image volumes, and the one-to-one lesion correspondence between modalities can aid in the early detection of breast malignancies.
    PhD, Nuclear Engineering & Radiological Sciences, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/150042/1/canngree_1.pdf
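
    The dCOM figure of merit used above, the distance between the centers of mass of corresponding lesions after mapping, can be computed along the lines of the sketch below; binary lesion masks on a common voxel grid and a known voxel spacing are assumptions made for illustration.

        # Minimal sketch: dCOM as the Euclidean distance between the centers of
        # mass of a lesion in two registered volumes. Binary masks on a common
        # voxel grid with a known voxel spacing (mm) are assumed.
        import numpy as np

        def center_of_mass_mm(mask, voxel_spacing_mm):
            idx = np.argwhere(mask)                      # (N, 3) voxel indices of the lesion
            return idx.mean(axis=0) * np.asarray(voxel_spacing_mm, dtype=float)

        def dcom_mm(mask_a, mask_b, voxel_spacing_mm):
            return float(np.linalg.norm(center_of_mass_mm(mask_a, voxel_spacing_mm)
                                        - center_of_mass_mm(mask_b, voxel_spacing_mm)))

        # Example: two small synthetic lesions offset by a few voxels.
        a = np.zeros((40, 40, 40), dtype=bool)
        b = np.zeros((40, 40, 40), dtype=bool)
        a[10:14, 10:14, 10:14] = True
        b[12:16, 11:15, 10:14] = True
        print(dcom_mm(a, b, voxel_spacing_mm=(0.5, 0.5, 1.0)), "mm")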