
    Context-aware stacked convolutional neural networks for classification of breast carcinomas in whole-slide histopathology images

    Automated classification of histopathological whole-slide images (WSI) of breast tissue requires analysis at very high resolutions with a large contextual area. In this paper, we present context-aware stacked convolutional neural networks (CNN) for classification of breast WSIs into normal/benign, ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (IDC). We first train a CNN using high pixel resolution patches to capture cellular-level information. The feature responses generated by this model are then fed as input to a second CNN, stacked on top of the first. Training of this stacked architecture with large input patches enables learning of fine-grained (cellular) details and global interdependence of tissue structures. Our system is trained and evaluated on a dataset containing 221 WSIs of H&E stained breast tissue specimens. The system achieves an AUC of 0.962 for the binary classification of non-malignant and malignant slides and obtains a three-class accuracy of 81.3% for classification of WSIs into normal/benign, DCIS, and IDC, demonstrating its potential for routine diagnostics.
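    To make the stacked design concrete, the minimal PyTorch sketch below shows one way such a two-stage model could be wired: a first CNN is pre-trained on high-resolution patches, then frozen, and its feature responses feed a second CNN that classifies a larger contextual region into the three classes. The PatchCNN/ContextCNN names, layer sizes, freezing choice, and the 768x768 input are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Stage one: learns cellular-level features from high-resolution patches."""
    def __init__(self, out_channels=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, out_channels, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.features(x)

class ContextCNN(nn.Module):
    """Stage two: stacked on top of stage one, it sees the feature responses over a
    much larger input region and models tissue-level context (assumed design)."""
    def __init__(self, patch_cnn, num_classes=3):
        super().__init__()
        self.patch_cnn = patch_cnn
        for p in self.patch_cnn.parameters():   # freeze stage one after pre-training (one possible choice)
            p.requires_grad = False
        self.context = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, num_classes),        # normal/benign, DCIS, IDC
        )

    def forward(self, x):
        with torch.no_grad():
            f = self.patch_cnn(x)               # cellular-level feature responses
        return self.context(f)

# a large contextual input region rather than a small patch (size is illustrative)
model = ContextCNN(PatchCNN())
logits = model(torch.randn(1, 3, 768, 768))
print(logits.shape)  # torch.Size([1, 3])
```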

    Efficient breast cancer classification network with dual squeeze and excitation in histopathological images.

    Medical image analysis methods for mammograms, ultrasound, and magnetic resonance imaging (MRI) cannot provide the underlying features at the cellular level needed to understand the cancer microenvironment, which makes them unsuitable for breast cancer subtype classification. In this paper, we propose a convolutional neural network (CNN)-based breast cancer classification method for hematoxylin and eosin (H&E) whole slide images (WSIs). The proposed method incorporates fused mobile inverted bottleneck convolutions (FMB-Conv) and mobile inverted bottleneck convolutions (MBConv) with a dual squeeze and excitation (DSE) network to accurately classify breast cancer tissue into binary (benign and malignant) classes and eight subtypes using histopathology images. To that end, a pre-trained EfficientNetV2 network is used as a backbone with a modified DSE block that combines the spatial and channel-wise squeeze and excitation layers to highlight important low-level and high-level abstract features. Our method outperformed the ResNet101, InceptionResNetV2, and EfficientNetV2 networks on the publicly available BreakHis dataset for binary and multi-class breast cancer classification in terms of precision, recall, and F1-score at multiple magnification levels.
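    The idea of combining channel-wise and spatial squeeze-and-excitation can be sketched in PyTorch as below. The DualSqueezeExcitation class, the reduction ratio, and the additive fusion of the two branches are assumptions made for illustration, not the paper's exact DSE block or its placement inside the EfficientNetV2 backbone.

```python
import torch
import torch.nn as nn

class DualSqueezeExcitation(nn.Module):
    """Recalibrates a feature map with a channel-wise SE branch and a spatial SE
    branch, then merges the two recalibrated maps (additive fusion assumed)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        # channel-wise SE: squeeze spatially, excite per channel
        self.channel_se = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1), nn.Sigmoid(),
        )
        # spatial SE: squeeze channels, excite per spatial location
        self.spatial_se = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.channel_se(x) + x * self.spatial_se(x)

# example: recalibrate a feature map from a backbone stage (shapes are illustrative)
dse = DualSqueezeExcitation(channels=256)
out = dse(torch.randn(2, 256, 14, 14))
print(out.shape)  # torch.Size([2, 256, 14, 14])
```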

    Breast Cancer Detection Based on Simplified Deep Learning Technique With Histopathological Image Using BreaKHis Database

    Presented here are the results of an investigation conducted to determine the effectiveness of deep learning (DL)-based systems that utilize transfer learning for detecting breast cancer in histopathological images. It is shown that DL models not specifically developed for breast cancer detection can be trained using transfer learning to effectively detect breast cancer in histopathological images. The outcome of the analysis enables the selection of the best DL architecture for detecting cancer with high accuracy, which should help pathologists achieve early diagnoses of breast cancer and administer appropriate treatment. The experimental work used the BreaKHis database, consisting of 7,909 histopathological images from 82 clinical breast cancer patients. The DL training strategy applies image processing techniques to extract diverse feature patterns, followed by transfer learning with deep convolutional networks such as ResNet, ResNeXt, SENet, Dual Path Networks, DenseNet, NASNet, and Wide ResNet. Comparison with recent literature shows that ResNeXt-50, ResNeXt-101, DPN-131, DenseNet-169, and NASNet-A achieve accuracies of 99.8%, 99.5%, 99.675%, 99.725%, and 99.4%, respectively, and outperform previous studies.
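    The transfer-learning setup described above can be illustrated with the PyTorch/torchvision sketch below, which fine-tunes an ImageNet-pretrained ResNeXt-50 for binary (benign/malignant) classification. The frozen-layer choice, learning rate, input size, and dummy batch are assumptions for illustration, not the study's training protocol.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained backbone and replace its classifier head
# with a two-class (benign/malignant) output layer.
model = models.resnext50_32x4d(weights=models.ResNeXt50_32X4D_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)

# Freeze early layers and fine-tune only the last block and the head
# (one common transfer-learning choice, assumed here).
for name, param in model.named_parameters():
    if not (name.startswith("layer4") or name.startswith("fc")):
        param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 patches.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```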