Improving Prostate Cancer Detection with Breast Histopathology Images
Deep neural networks have brought significant advances to the machine
learning-based analysis of digital pathology images, including prostate
tissue images. With the help of transfer learning, the classification and
segmentation performance of neural network models has been further improved.
However, due to the absence of large, extensively annotated, publicly available
prostate histopathology datasets, several previous studies employ datasets from
well-studied computer vision tasks, such as ImageNet. In this work, we
propose a transfer learning scheme from breast histopathology images to improve
prostate cancer detection performance. We validate our approach on annotated
prostate whole slide images by using a publicly available breast histopathology
dataset as pre-training. We show that the proposed cross-cancer approach
outperforms transfer learning from the ImageNet dataset. (Comment: 9 pages, 2 figures)
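The pre-training idea can be illustrated with a small NumPy toy (not the paper's actual pipeline, which fine-tunes deep networks on whole slide images): a logistic regression is pre-trained on an abundant "source" task and then warm-started on a related "target" task with few labels. All data and names here are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_logreg(X, y, w=None, epochs=300, lr=0.5):
    """Logistic regression via gradient descent; `w` allows warm-starting."""
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def acc(w, X, y):
    """Accuracy of the linear decision rule X @ w > 0."""
    return float(((X @ w > 0) == (y > 0.5)).mean())

# "Source" domain (stand-in for the large breast-histopathology set): many labels.
w_true = rng.normal(size=16)
Xs = rng.normal(size=(1000, 16))
ys = (Xs @ w_true > 0).astype(float)

# "Target" domain (stand-in for prostate WSIs): related decision rule, few labels.
Xt = rng.normal(size=(30, 16))
yt = (Xt @ (w_true + 0.1 * rng.normal(size=16)) > 0).astype(float)

w_src = train_logreg(Xs, ys)                            # pre-train on source
w_ft = train_logreg(Xt, yt, w=w_src.copy(), epochs=20)  # fine-tune on target
w_cold = train_logreg(Xt, yt, epochs=20)                # target-only baseline
```

With only 30 target labels, the warm-started model starts from a decision boundary already close to the target task, which is the same intuition behind pre-training on a related cancer domain instead of ImageNet.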
Deep Convolutional Neural Networks for Breast Cancer Histology Image Analysis
Breast cancer is one of the main causes of cancer death worldwide. Early
diagnosis significantly increases the chances of correct treatment and
survival, but the process is tedious and often leads to disagreement between
pathologists. Computer-aided diagnosis systems have shown potential for improving
the diagnostic accuracy. In this work, we develop a computational approach
based on deep convolutional neural networks for breast cancer histology image
classification. A hematoxylin and eosin (H&E) stained breast histology
microscopy image dataset is provided as part of the ICIAR 2018 Grand Challenge
on Breast Cancer Histology Images. Our approach utilizes several deep neural
network architectures and a gradient boosted trees classifier. For the 4-class
classification task, we report 87.2% accuracy. For the 2-class task of
detecting carcinomas, we report 93.8% accuracy, 97.3% AUC, and
sensitivity/specificity of 96.5%/88.0% at the high-sensitivity operating
point. To our knowledge, this
approach outperforms other common methods in automated histopathological image
classification. The source code for our approach is made publicly available at
https://github.com/alexander-rakhlin/ICIAR2018 (Comment: 8 pages, 4 figures)
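The two-stage design (deep features feeding a boosted-tree classifier) can be sketched minimally. The toy below, which assumes nothing from the paper's actual code, implements gradient boosting with depth-1 regression trees on the logistic loss; in the paper's pipeline the input rows would be descriptors extracted by pretrained CNNs, whereas here they are synthetic stand-in features:

```python
import numpy as np

def fit_stump(X, r):
    """Least-squares fit of a depth-1 regression tree to pseudo-residuals r."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all():          # degenerate split, nothing on the right
                continue
            lmean, rmean = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lmean) ** 2).sum() + ((r[~left] - rmean) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, t, lmean, rmean)
    return best[1:]  # (feature, threshold, left_value, right_value)

def stump_predict(stump, X):
    j, t, lv, rv = stump
    return np.where(X[:, j] <= t, lv, rv)

def gboost_fit(X, y, rounds=30, lr=0.3):
    """Gradient boosting for binary log-loss with stump base learners."""
    F = np.zeros(len(y))  # raw scores (logits)
    stumps = []
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-F))
        stump = fit_stump(X, y - p)        # pseudo-residuals of the log-loss
        F += lr * stump_predict(stump, X)
        stumps.append(stump)
    return stumps

def gboost_predict(stumps, X, lr=0.3):
    F = sum(lr * stump_predict(s, X) for s in stumps)
    return (F > 0).astype(int)

# stand-in for CNN feature vectors; label depends on the first feature
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)
model = gboost_fit(X, y)
acc = float((gboost_predict(model, X) == y).mean())
```

Boosting over a fixed feature representation is attractive here because the tree ensemble can combine the outputs of several network architectures without retraining them.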
Structure Preserving Stain Normalization of Histopathology Images Using Self Supervised Semantic Guidance
© 2020, Springer Nature Switzerland AG. Although generative adversarial network (GAN) based style transfer is the state of the art in histopathology color-stain normalization, it does not explicitly integrate the structural information of tissues. We propose a self-supervised approach that incorporates semantic guidance into a GAN-based stain normalization framework and preserves detailed structural information. Our method does not require manual segmentation maps, a significant advantage over existing methods. We integrate semantic information at different layers between a pre-trained semantic network and the stain color normalization network. The proposed scheme outperforms other color normalization methods, leading to better classification and segmentation performance.
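For contrast with the GAN-based approach, one of the classical baselines it competes against is simple statistics matching, in the spirit of Reinhard normalization (which works in the lαβ color space; the sketch below applies the same idea directly in RGB for brevity and is only an illustrative baseline, not the paper's method):

```python
import numpy as np

def match_channel_stats(src, ref):
    """Map each channel of `src` to have the mean/std of `ref`.

    A classical stain-normalization baseline: unlike the GAN approach,
    it uses only global color statistics and no structural information.
    """
    src = src.astype(float)
    ref = ref.astype(float)
    out = np.empty_like(src)
    for c in range(3):
        s_mu, s_sd = src[..., c].mean(), src[..., c].std()
        r_mu, r_sd = ref[..., c].mean(), ref[..., c].std()
        # standardize the source channel, then rescale to the reference stats
        out[..., c] = (src[..., c] - s_mu) / (s_sd + 1e-8) * r_sd + r_mu
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because the transform is global, it can distort locally meaningful color variation, which is precisely the failure mode that structure-preserving, semantically guided normalization aims to avoid.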
Bringing Open Data to Whole Slide Imaging
Supplementary information associated with Besson et al. (2019) ECDP 2019
Faced with the need to support a growing number of whole slide imaging (WSI) file formats, our team has extended a long-standing community file format (OME-TIFF) for use in digital pathology. The format makes use of the core TIFF specification to store multi-resolution (or "pyramidal") representations of a single slide in a flexible, performant manner. Here we describe the structure of this format and its performance characteristics, as well as open-source library support for reading and writing pyramidal OME-TIFFs.
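The pyramidal layout can be illustrated with a short sketch: each level halves the resolution of the previous one, so a viewer can fetch an appropriately sized rendering without decoding the full slide. In an OME-TIFF the sub-resolutions are stored as SubIFDs beneath the full-resolution image; the NumPy toy below (a hypothetical helper, not part of any OME library) only mimics the level structure via repeated 2x2 mean downsampling:

```python
import numpy as np

def build_pyramid(img, min_size=64):
    """Build multi-resolution levels by repeated 2x2 mean downsampling.

    Illustrates the level structure a pyramidal OME-TIFF stores as SubIFDs;
    real files also tile and compress each level.
    """
    levels = [img.astype(float)]
    while min(levels[-1].shape[:2]) // 2 >= min_size:
        a = levels[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2   # trim odd edges
        a = a[:h, :w].reshape(h // 2, 2, w // 2, 2, *a.shape[2:]).mean(axis=(1, 3))
        levels.append(a)
    return levels

# e.g. a 512x512 RGB region yields levels of size 512, 256, 128, and 64
pyramid = build_pyramid(np.ones((512, 512, 3)))
```

Real slides are gigapixel-scale, so storing these precomputed levels is what makes interactive zooming and panning practical.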
Artificial Intelligence-based methods in head and neck cancer diagnosis: an overview
Background
This paper reviews recent literature employing Artificial Intelligence/Machine Learning (AI/ML) methods for diagnostic evaluation of head and neck cancers (HNC) using automated image analysis.
Methods
Electronic database searches using MEDLINE via OVID, EMBASE and Google Scholar were conducted to retrieve articles using AI/ML for diagnostic evaluation of HNC (2009–2020). No restrictions were placed on the AI/ML method or imaging modality used.
Results
In total, 32 articles were identified. HNC sites included the oral cavity (n = 16), nasopharynx (n = 3), oropharynx (n = 3), larynx (n = 2), salivary glands (n = 2), sinonasal (n = 1) and multiple sites (n = 5). Imaging modalities included histological (n = 9), radiological (n = 8), hyperspectral (n = 6), endoscopic/clinical (n = 5), infrared thermal (n = 1) and optical (n = 1). Clinicopathologic/genomic data were used in two studies. Traditional ML methods were employed in 22 studies (69%), deep learning (DL) in eight studies (25%) and a combination of the two in two studies (6%).
Conclusions
There is an increasing volume of studies exploring the role of AI/ML in aiding HNC detection across a range of imaging modalities. These methods can achieve high accuracy, in some cases exceeding human judgement in predictive tasks. Large-scale, multi-centric, prospective studies are required to support deployment into clinical practice.
Uberwasted App for Reporting and Collecting Waste Using Location Based and Deep Learning Technologies
Waste has become one of the main sources of air pollution. In this context, the Uberwasted app is designed and developed as a mobile app that allows volunteers to report waste and take part in its collection, using location-based technology and deep learning algorithms. First, a waste data management system was built to store waste photos and descriptions submitted through the app. Waste is then classified with a convolutional neural network model, "ResNet34", and the collection site is reported to the network of volunteers via location-based technology. As a proof of concept, several typical implementation results are illustrated.