
    Ordinal HyperPlane Loss

    This research presents the development of a new framework for analyzing ordered class data, commonly called “ordinal class” data. The focus of the work is the development of classifiers (predictive models) that predict classes from available data. Rating scales, medical classification scales, socio-economic scales, meaningful groupings of continuous data, facial emotional intensity, and facial age estimation are examples of ordinal data for which data scientists may be asked to develop predictive classifiers. It is possible to treat ordinal classification like any other classification problem with more than two classes, but specifying a model with this strategy does not fully utilize the ordering information of the classes. Alternatively, the researcher may choose to treat the ordered classes as though they were continuous values. This strategy imposes the strong assumption that the real “distance” between two adjacent classes is equal to the distance between any two other adjacent classes (e.g., a rating of ‘0’ versus ‘1’ on an 11-point scale is the same distance as a ‘9’ versus a ‘10’). For Deep Neural Networks (DNNs), the problem of predicting k ordinal classes is typically addressed by performing k-1 binary classifications. These models may be estimated within a single DNN and require an evaluation strategy to determine the class prediction. Another common option is to treat ordinal classes as continuous values for regression and then adjust the cutoff points that represent the class boundaries differentiating one class from another. This research reviews a novel loss function called Ordinal Hyperplane Loss (OHPL) that is specifically designed for data with ordinal classes. OHPLnet has been demonstrated to be a significant advancement in predicting ordinal classes for industry-standard structured datasets. The loss function also enables deep learning techniques to be applied to the ordinal classification problem for unstructured data. By minimizing OHPL, a deep neural network learns to map data to an optimal space in which the distance between points and their class centroids is minimized while a nontrivial ordering relationship among classes is maintained. The research reported in this document advances OHPL from a minimally viable loss function to a more complete deep learning methodology. New analysis strategies were developed and tested that improve model performance as well as the consistency of the algorithm in developing classification models. In the applications chapters, a new algorithm variant is introduced that enables OHPLall to be used when large data records impose a severe limitation on batch size during training of the associated Deep Neural Network.
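    The centroid-and-ordering idea behind OHPL can be illustrated with a short sketch. The code below is a simplified, hypothetical PyTorch rendering, not the authors' OHPL or OHPLall implementation: the batch-wise centroids, the squared-distance pull term, and the hinge penalty on a one-dimensional projection of the centroids are all assumptions made for illustration.

        import torch

        def ohpl_style_loss(embeddings, labels, num_classes, margin=1.0):
            """Illustrative ordinal-centroid loss (not the official OHPL code).

            embeddings: (batch, dim) outputs of the network's last hidden layer.
            labels:     (batch,) integer ordinal classes in {0, ..., num_classes - 1}.
            """
            # Batch centroid of each class (zero placeholder if a class is absent
            # from the batch, a simplification a full implementation would handle).
            centroids = []
            for c in range(num_classes):
                mask = labels == c
                centroids.append(embeddings[mask].mean(dim=0) if mask.any()
                                 else torch.zeros(embeddings.size(1)))
            centroids = torch.stack(centroids)                     # (num_classes, dim)

            # Pull every point toward the centroid of its own class.
            pull = ((embeddings - centroids[labels]) ** 2).sum(dim=1).mean()

            # Keep centroids ordered along an assumed 1-D projection (first
            # coordinate), with a hinge margin between adjacent classes.
            proj = centroids[:, 0]
            order_penalty = torch.relu(margin - (proj[1:] - proj[:-1])).mean()
            return pull + order_penalty

        # Example: 32 samples, 8-D embeddings, 5 ordinal classes.
        emb = torch.randn(32, 8, requires_grad=True)
        lab = torch.randint(0, 5, (32,))
        ohpl_style_loss(emb, lab, num_classes=5).backward()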

    Segmentation and classification of lung nodules from Thoracic CT scans: methods based on dictionary learning and deep convolutional neural networks.

    Lung cancer is a leading cause of cancer death in the world. Key to the survival of patients is early diagnosis. Studies have demonstrated that screening high-risk patients with Low-dose Computed Tomography (CT) is invaluable for reducing morbidity and mortality. Computer Aided Diagnosis (CADx) systems can assist radiologists and care providers in reading and analyzing lung CT images to segment, classify, and keep track of nodules for signs of cancer. In this thesis, we propose a CADx system for this purpose. To predict lung nodule malignancy, we propose a new deep learning framework that combines Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) to learn the best in-plane and inter-slice visual features for diagnostic nodule classification. Since a nodule's volumetric growth and shape variation over a period of time may reveal information regarding the malignancy of the nodule, a dictionary learning based approach is separately proposed to segment the nodule's shape at two time points from two scans taken one year apart. The output of a CNN classifier trained to learn the visual appearance of malignant nodules is then combined with the derived measures of shape change and volumetric growth in assigning a probability of malignancy to the nodule. Due to the limited number of available CT scans of benign and malignant nodules in the image database from the National Lung Screening Trial (NLST), we chose to initially train a deep neural network on the larger LUNA16 Challenge database, which was built for the purpose of eliminating false positives from detected nodules in thoracic CT scans. Discriminative features learned in this application were transferred to predict malignancy. The algorithm for segmenting nodule shapes in serial CT scans utilizes a sparse combination of training shapes (SCoTS). This algorithm captures a sparse representation of a shape in the input data through a linear span of previously delineated shapes in a training repository. The model updates the shape prior over level set iterations and captures variability in shapes by a sparse combination of the training data. The level set evolution is therefore driven by a data term as well as a term capturing valid prior shapes. During evolution, the influence of the shape prior is adjusted based on shape reconstruction, with the assigned weight determined from the degree of sparsity of the representation. The discriminative nature of the sparse representation affords us the opportunity to compare nodules' variations at consecutive time points and to predict malignancy. The proposed segmentation algorithm has been validated experimentally on 542 3-D lung nodules from the LIDC-IDRI database, which includes radiologist-delineated nodule boundaries. The effectiveness of the proposed deep learning and dictionary learning architectures for malignancy prediction has been demonstrated on CT data from 370 biopsied subjects collected from the NLST database. Each subject in this database had at least two serial CT scans at two separate time points one year apart. The proposed RNN CAD system achieved an ROC Area Under the Curve (AUC) of 0.87 when validated on CT data from nodules at the second time point, while the dictionary learning based method achieved 0.83; when nodule shape change and appearance were combined, classifier performance improved to AUC = 0.89.
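    The CNN-plus-RNN part of the framework, a 2-D CNN encoding each axial slice and a recurrent network aggregating the slice features into a malignancy score, can be sketched as follows. This is a minimal illustration: the layer sizes, the GRU choice, and the slice count are assumptions, not the architecture used in the thesis.

        import torch
        import torch.nn as nn

        class SliceCNN(nn.Module):
            """Small 2-D CNN that encodes one axial CT slice into a feature vector."""
            def __init__(self, feat_dim=64):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.fc = nn.Linear(32, feat_dim)

            def forward(self, x):                       # x: (batch, 1, H, W)
                return self.fc(self.features(x).flatten(1))

        class NoduleCNNRNN(nn.Module):
            """CNN per slice + GRU across slices -> probability of malignancy."""
            def __init__(self, feat_dim=64, hidden=32):
                super().__init__()
                self.cnn = SliceCNN(feat_dim)
                self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
                self.head = nn.Linear(hidden, 1)

            def forward(self, volume):                  # volume: (batch, slices, 1, H, W)
                b, s = volume.shape[:2]
                feats = self.cnn(volume.flatten(0, 1)).view(b, s, -1)
                _, h = self.rnn(feats)                  # h: (1, batch, hidden)
                return torch.sigmoid(self.head(h[-1]))

        # Example: a batch of 2 nodules, 5 slices of 32 x 32 pixels each.
        probs = NoduleCNNRNN()(torch.randn(2, 5, 1, 32, 32))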

    Explainable artificial intelligence (XAI) in deep learning-based medical image analysis

    With the increase in deep learning-based methods, the call for explainability of such methods grows, especially in high-stakes decision-making areas such as medical image analysis. This survey presents an overview of eXplainable Artificial Intelligence (XAI) used in deep learning-based medical image analysis. A framework of XAI criteria is introduced to classify deep learning-based medical image analysis methods. Papers on XAI techniques in medical image analysis are then surveyed and categorized according to the framework and according to anatomical location. The paper concludes with an outlook on future opportunities for XAI in medical image analysis.

    RECENT CNN-BASED TECHNIQUES FOR BREAST CANCER HISTOLOGY IMAGE CLASSIFICATION

    Histology images are extensively used by pathologists to assess abnormalities and detect malignancy in breast tissues, and Convolutional Neural Networks (CNN) are by far the preferred models for image classification and interpretation. Based on these two facts, we surveyed the recent CNN-based methods for breast cancer histology image analysis. The survey focuses on two major issues usually faced by CNN-based methods, namely the design of an appropriate CNN architecture and the lack of a sufficiently large labelled dataset for training the model. Regarding the design of the CNN architecture, methods examining breast histology images adopt three main approaches: designing the CNN architecture manually from scratch, using pre-trained models, and adopting an automatic architecture design. Methods addressing the lack of labelled datasets are grouped into four categories: methods using pre-trained models, methods using data augmentation, methods adopting weakly supervised learning, and those adopting feedforward filter learning. Research works from each category and their reported performance are presented in this paper. We conclude the paper by indicating some future research directions related to the analysis of histology images.
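    The two remedies the survey encounters most often, reusing a pre-trained model and augmenting the training data, can be illustrated with a short torchvision sketch. The four-class head and the specific augmentations are assumptions chosen for the example, not a method from any surveyed paper.

        import torch.nn as nn
        from torchvision import models, transforms

        # Data augmentation typical for histology patches (choices are illustrative).
        train_tf = transforms.Compose([
            transforms.RandomHorizontalFlip(),
            transforms.RandomVerticalFlip(),
            transforms.RandomRotation(90),
            transforms.ColorJitter(brightness=0.1, contrast=0.1),
            transforms.ToTensor(),
        ])

        # Pre-trained ImageNet backbone, re-headed for an assumed four-class
        # histology task (e.g. normal / benign / in situ / invasive).
        model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        model.fc = nn.Linear(model.fc.in_features, 4)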

    Automatic Pulmonary Nodule Detection in CT Scans Using Convolutional Neural Networks Based on Maximum Intensity Projection

    Accurate pulmonary nodule detection is a crucial step in lung cancer screening. Computer-aided detection (CAD) systems are not routinely used by radiologists for pulmonary nodule detection in clinical practice despite their potential benefits. Maximum intensity projection (MIP) images improve the detection of pulmonary nodules in radiological evaluation with computed tomography (CT) scans. Inspired by this clinical methodology of radiologists, we aim to explore the feasibility of applying MIP images to improve the effectiveness of automatic lung nodule detection using convolutional neural networks (CNNs). We propose a CNN-based approach that takes MIP images of different slab thicknesses (5 mm, 10 mm, 15 mm) and 1 mm axial section slices as input. Such an approach augments the two-dimensional (2-D) CT slice images with more representative spatial information that helps discriminate nodules from vessels through their morphologies. Our proposed method achieves a sensitivity of 92.67% with 1 false positive per scan and a sensitivity of 94.19% with 2 false positives per scan for lung nodule detection on 888 scans in the LIDC-IDRI dataset. The use of thick MIP images helps the detection of small pulmonary nodules (3 mm-10 mm) and results in fewer false positives. Experimental results show that utilizing MIP images can increase sensitivity and lower the number of false positives, which demonstrates the effectiveness and significance of the proposed MIP-based CNN framework for automatic pulmonary nodule detection in CT scans. The proposed method also shows the potential for CNNs to benefit from incorporating this clinical procedure into nodule detection.
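    Computing an axial MIP over slabs of different thicknesses is straightforward; the NumPy sketch below shows one way the multi-channel input described above could be assembled, assuming 1 mm slice spacing. It is an illustration of the idea, not the paper's preprocessing code.

        import numpy as np

        def axial_mip(volume, center_idx, thickness_mm, spacing_mm=1.0):
            """Maximum intensity projection over a slab centred on one axial slice.

            volume:       CT volume shaped (slices, H, W), e.g. in Hounsfield units.
            thickness_mm: slab thickness (5, 10, or 15 mm in the paper).
            spacing_mm:   slice spacing; 1 mm is assumed here.
            """
            half = max(1, int(round(thickness_mm / spacing_mm / 2)))
            lo = max(0, center_idx - half)
            hi = min(volume.shape[0], center_idx + half + 1)
            return volume[lo:hi].max(axis=0)            # (H, W) MIP image

        # Example: stack the 1 mm slice with 5/10/15 mm MIPs as CNN input channels.
        ct = np.random.randint(-1000, 400, size=(120, 512, 512)).astype(np.int16)
        channels = np.stack([ct[60]] + [axial_mip(ct, 60, t) for t in (5, 10, 15)])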

    Deep Learning in Breast Cancer Imaging: A Decade of Progress and Future Directions

    Since 2020, breast cancer has had the highest incidence rate worldwide among all malignancies. Breast imaging plays a significant role in early diagnosis and intervention to improve the outcome of breast cancer patients. In the past decade, deep learning has shown remarkable progress in breast cancer imaging analysis, holding great promise for interpreting the rich information and complex context of breast imaging modalities. Considering the rapid improvement of deep learning technology and the increasing severity of breast cancer, it is critical to summarize past progress and identify the future challenges to be addressed. In this paper, we provide an extensive survey of deep learning-based breast cancer imaging research, covering studies on mammography, ultrasound, magnetic resonance imaging, and digital pathology images over the past decade. The major deep learning methods, publicly available datasets, and applications in imaging-based screening, diagnosis, treatment response prediction, and prognosis are described in detail. Drawing on the findings of this survey, we present a comprehensive discussion of the challenges and potential avenues for future research in deep learning-based breast cancer imaging.

    Radiomics analysis in ovarian cancer: A narrative review

    Ovarian cancer (OC) is the second most common gynecological malignancy, accounting for about 14,000 deaths in 2020 in the US. The identification of tools for proper screening, early diagnosis, and prognosis of OC is still lagging. The application of methods such as radiomics to medical images such as ultrasound scan (US), computed tomography (CT), magnetic resonance imaging (MRI), or positron emission tomography (PET) in OC may help to realize so-called “precision medicine” by developing new quantification metrics that link qualitative and/or quantitative imaging data to clinical diagnostic endpoints. This narrative review aims to summarize the applications of radiomics as a support in the management of a complex pathology such as ovarian cancer. We give an insight into the current evidence on radiomics applied to the different imaging methods.
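    As a hedged illustration of the kind of quantitative metrics radiomics extracts from a segmented lesion, the sketch below computes a few first-order features from an image region. The feature set and histogram binning are minimal assumptions for the example, not a pipeline from any of the reviewed studies.

        import numpy as np

        def first_order_features(image, mask):
            """A few first-order, radiomics-style features of a segmented lesion.

            image: 2-D or 3-D intensity array (US, CT, MRI, PET, ...).
            mask:  boolean array of the same shape marking the lesion voxels.
            """
            vals = image[mask].astype(float)
            counts, _ = np.histogram(vals, bins=32)
            p = counts[counts > 0] / counts.sum()       # discrete intensity distribution
            return {
                "mean": vals.mean(),
                "std": vals.std(),
                "skewness": ((vals - vals.mean()) ** 3).mean() / (vals.std() ** 3 + 1e-9),
                "entropy": float(-(p * np.log2(p)).sum()),
                "volume_voxels": int(mask.sum()),
            }

        # Example on a synthetic lesion region.
        img = np.random.normal(40, 10, size=(64, 64, 64))
        roi = np.zeros(img.shape, dtype=bool)
        roi[20:40, 20:40, 20:40] = True
        print(first_order_features(img, roi))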

    PREDICTION OF RECURRENCE AND MORTALITY OF ORAL TONGUE CANCER USING ARTIFICIAL NEURAL NETWORK (A case study of 5 hospitals in Finland and 1 hospital from Sao Paulo, Brazil)

    Cancer is a dreadful disease that has caused the death of millions of people. It is characterized by an uncontrolled growth of cells that form lumps or masses of tissue known as tumours. It is therefore a concern to everyone, as these tumours often release hormones that have a negative impact on the body. Data mining approaches, statistical methods and machine learning algorithms have been proposed for effective cancer data classification. Artificial Neural Networks (ANN) have been used in this thesis for the prediction of recurrence and mortality of oral tongue cancer in patients. ANN was also used to examine the diagnostic and prognostic factors, with the aim of determining which of these factors influence the prediction of recurrence and mortality of oral tongue cancer. Three different ANN techniques were applied in the learning and testing phases in order to find the most effective one: the Elman, Feedforward, and Layer Recurrent neural networks. The Elman neural network was not able to make acceptable predictions of the recurrence or the mortality of tongue cancer based on the data. In contrast, the Feedforward neural network captured the relationship between the prognostic factors and correctly predicted recurrence; however, it failed to predict mortality based on the patients' data. The Layer Recurrent neural network was very effective and successfully predicted both the recurrence and the mortality of oral tongue cancer in patients. The constructed Layer Recurrent neural network was then used to investigate the correlation between the prognostic factors. It was found that, of the 11 prognostic factors in the data sheet, only 5 had a considerable impact on recurrence and mortality: grade, depth, budding, modified stage, and gender. Time in months and disease-free months were also used to train the network.
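    A feedforward network of the kind compared in the thesis can be sketched in a few lines of PyTorch. The five inputs follow the influential factors named above (grade, depth, budding, modified stage, gender); the layer sizes, the encoding, and the two-output head for recurrence and mortality are illustrative assumptions, not the configuration used in the study.

        import torch
        import torch.nn as nn

        # Inputs: five numerically encoded prognostic factors; outputs: assumed
        # probabilities of recurrence and of mortality.
        feedforward_net = nn.Sequential(
            nn.Linear(5, 16), nn.ReLU(),
            nn.Linear(16, 8), nn.ReLU(),
            nn.Linear(8, 2), nn.Sigmoid(),
        )

        # Example: one hypothetical, already-encoded patient record.
        probs = feedforward_net(torch.tensor([[2.0, 5.0, 1.0, 3.0, 0.0]]))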