
    Histopathological image analysis: a review

    Over the past decade, dramatic increases in computational power and improvements in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole-slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging, which complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state of the art in CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology-related problems being pursued in the United States and Europe.

    Ensemble Learning of Tissue Components for Prostate Histopathology Image Grading

    Ensemble learning is an effective machine learning approach that improves prediction performance by fusing several single-classifier models. In computer-aided diagnosis (CAD) systems, machine learning has become one of the dominant solutions for tissue image diagnosis and grading. One problem with a single-classifier model that combines features from multiple tissue components into dense feature vectors is overfitting. In this paper, an ensemble learning approach for multi-component tissue image classification is proposed. Prostate cancer Hematoxylin and Eosin (H&E) histopathology images from HUKM were used to test the proposed ensemble approach for diagnosis and Gleason grading. The experimental results on several prostate classification tasks, namely benign vs. Grade 3, benign vs. Grade 4, and Grade 3 vs. Grade 4, show that the proposed ensemble significantly outperforms the previous typical CAD and the naïve approach that directly combines the texture features of all tissue components into dense feature vectors for a single classifier.
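
    The abstract does not specify the fusion rule beyond combining several single-classifier models, so the following is a minimal Python sketch of the contrast it draws: one SVM trained per tissue-component feature group and fused by averaging class probabilities, versus the naïve approach of a single SVM on the concatenated dense feature vector. The data, feature names and component split are synthetic placeholders, not the HUKM dataset.

# Hypothetical sketch: per-tissue-component classifiers fused by soft voting,
# contrasted with a single classifier on concatenated (dense) features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 600
# Pretend each sample has texture features from three tissue components.
components = {"lumen": 20, "nuclei": 30, "stroma": 25}
y = rng.integers(0, 2, size=n)                       # 0 = benign, 1 = Grade 3 (toy labels)
X_parts = {name: rng.normal(y[:, None], 1.0, size=(n, d))
           for name, d in components.items()}
X_dense = np.hstack(list(X_parts.values()))          # naive dense feature vector

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# Naive approach: one SVM on the concatenated features.
naive = SVC(probability=True).fit(X_dense[idx_tr], y[idx_tr])
acc_naive = accuracy_score(y[idx_te], naive.predict(X_dense[idx_te]))

# Ensemble: one SVM per tissue component, fused by averaging class probabilities.
probas = []
for name, X in X_parts.items():
    clf = SVC(probability=True).fit(X[idx_tr], y[idx_tr])
    probas.append(clf.predict_proba(X[idx_te]))
y_pred = np.mean(probas, axis=0).argmax(axis=1)
acc_ensemble = accuracy_score(y[idx_te], y_pred)

print(f"naive dense-feature SVM: {acc_naive:.2f}, per-component ensemble: {acc_ensemble:.2f}")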

    Computational Histological Staining and Destaining of Prostate Core Biopsy RGB Images with Generative Adversarial Neural Networks

    Histopathology tissue samples are widely available in two states: paraffin-embedded unstained and non-paraffin-embedded stained whole slide RGB images (WSRI). Hematoxylin and eosin (H&E) is one of the principal stains in histology but suffers from several shortcomings related to tissue preparation, staining protocols, slowness and human error. We report two novel approaches for training machine learning models for the computational H&E staining and destaining of prostate core biopsy RGB images. The staining model uses a conditional generative adversarial network that learns hierarchical non-linear mappings between whole slide RGB image (WSRI) pairs of prostate core biopsies before and after H&E staining. The trained staining model can then generate computationally H&E-stained prostate core WSRIs using previously unseen non-stained biopsy images as input. The destaining model, by learning mappings between an H&E-stained WSRI and a non-stained WSRI of the same biopsy, can computationally destain previously unseen H&E-stained images. Structural and anatomical details of prostate tissue and the colors, shapes, geometries and locations of nuclei, stroma, vessels, glands and other cellular components were generated by both models, with structural similarity indices of 0.68 (staining) and 0.84 (destaining). The proposed staining and destaining models can enable computational H&E staining and destaining of WSRI biopsies without additional equipment or devices. Accepted for publication at the 2018 IEEE International Conference on Machine Learning and Applications (ICMLA).
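
    As a small illustration of the evaluation reported above, the sketch below computes a structural similarity index between a computationally stained patch and the corresponding real H&E-stained patch using scikit-image. The file names are hypothetical and the authors' exact evaluation protocol (patch size, colour handling) may differ.

# Minimal sketch of the evaluation step: structural similarity (SSIM) between a
# computationally stained image and the corresponding real H&E-stained image.
# File names are placeholders; the paper's exact protocol may differ.
from skimage import io
from skimage.metrics import structural_similarity

generated = io.imread("generated_stained_patch.png")   # output of the staining model
reference = io.imread("real_stained_patch.png")        # ground-truth H&E-stained patch

# channel_axis=-1 treats the last axis as colour channels (scikit-image >= 0.19);
# data_range=255 assumes 8-bit images.
ssim = structural_similarity(generated, reference, channel_axis=-1, data_range=255)
print(f"SSIM between generated and real stain: {ssim:.2f}")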

    Identification of Individual Glandular Regions Using LCWT and Machine Learning Techniques

    A new approach for the segmentation of gland units in histological images is proposed with the aim of contributing to the improvement of prostate cancer diagnosis. Clustering methods on several colour spaces are applied to each sample in order to generate a binary mask of the different tissue components. From the mask of lumen candidates, the Locally Constrained Watershed Transform (LCWT) is applied as a novel gland segmentation technique never before used on this type of images. 500 random gland candidates, both benign and pathological, are selected to evaluate the LCWT technique, achieving a Dice coefficient of 0.85. Several shape and textural descriptors in combination with contextual features and a fractal analysis are applied, in a novel way, on different colour spaces, yielding a total of 297 features to discern between artefacts and true glands. The most relevant features are then selected by an exhaustive statistical analysis in terms of independence between variables and dependence with the class. 3,200 artefacts, 3,195 benign glands and 3,000 pathological glands are obtained from a data set of 1,468 images at 10x magnification. A careful strategy of data partitioning is implemented to robustly address the classification problem between artefacts and glands. Both linear and non-linear approaches are considered using machine learning techniques based on Support Vector Machines (SVM) and feedforward neural networks, achieving values of sensitivity, specificity and accuracy of 0.92, 0.97 and 0.95, respectively.
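
    The Locally Constrained Watershed Transform has no off-the-shelf implementation in common Python libraries, so the sketch below illustrates the analogous idea with the standard marker-based watershed from scikit-image: lumen candidates are extracted by thresholding and used as seeds to grow gland-candidate regions. The thresholding heuristic and file name are placeholders; the published method differs in the transform used and in how lumen candidates are obtained.

# Hedged sketch of the gland-candidate segmentation idea: a marker-based watershed
# grown from lumen-candidate seeds. The standard skimage watershed is used here as
# a stand-in for the paper's LCWT, so details differ from the published method.
from scipy import ndimage as ndi
from skimage import io, color, filters, measure, segmentation

rgb = io.imread("he_patch_10x.png")              # hypothetical H&E patch at 10x
gray = color.rgb2gray(rgb)

# Lumen candidates: bright, roughly unstained regions (very rough heuristic).
lumen_mask = gray > filters.threshold_otsu(gray)
lumen_mask = ndi.binary_opening(lumen_mask, iterations=2)
markers = measure.label(lumen_mask)

# Grow each lumen marker over an edge-strength landscape to delimit gland candidates.
edges = filters.sobel(gray)
gland_labels = segmentation.watershed(edges, markers=markers, mask=gray < 0.98)

print(f"{gland_labels.max()} gland candidates segmented")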

    Systems pathology by multiplexed immunohistochemistry and whole-slide digital image analysis

    The paradigm of molecular histopathology is shifting from single-marker immunohistochemistry towards multiplexed detection of markers to better understand complex pathological processes. However, no existing systems allow multiplexed IHC (mIHC) with high-resolution whole-slide tissue imaging and analysis while providing feasible throughput for routine use. We present an mIHC platform combining fluorescent and chromogenic staining with automated whole-slide imaging and integrated whole-slide image analysis, enabling simultaneous detection of six protein markers and nuclei, and automatic quantification and classification of hundreds of thousands of cells in situ in formalin-fixed paraffin-embedded tissues. In the first proof-of-concept, we detected immune cells at cell-level resolution (n = 128,894 cells) in human prostate cancer and analysed T cell subpopulations in different tumour compartments (epithelium vs. stroma). In the second proof-of-concept, we demonstrated automatic classification of epithelial cell populations (n = 83,558) and glands (benign vs. cancer) in prostate cancer with simultaneous analysis of androgen receptor (AR) and alpha-methylacyl-CoA racemase (AMACR) expression at cell-level resolution. We conclude that the open-source combination of 8-plex mIHC detection, whole-slide image acquisition and analysis provides a robust tool allowing quantitative, spatially resolved whole-slide tissue cytometry directly in formalin-fixed human tumour tissues for improved characterization of histology and the tumour microenvironment.
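
    A purely illustrative sketch of the kind of cell-level analysis described above: per-cell marker intensities exported by an image analysis step are gated into T-cell phenotypes and counted per tissue compartment. The column names, thresholds and CSV file are hypothetical placeholders, not the platform's actual gating strategy.

# Illustrative sketch (not the published pipeline): gate per-cell marker intensities
# into T-cell subpopulations and split counts by tissue compartment.
import pandas as pd

cells = pd.read_csv("per_cell_measurements.csv")   # one row per segmented cell

cd3_pos = cells["CD3_mean_intensity"] > 0.2        # hypothetical gating thresholds
cd8_pos = cells["CD8_mean_intensity"] > 0.2

cells["phenotype"] = "other"
cells.loc[cd3_pos & ~cd8_pos, "phenotype"] = "CD3+CD8- T cell"
cells.loc[cd3_pos & cd8_pos, "phenotype"] = "CD3+CD8+ T cell"

# Spatially resolved summary: counts per phenotype in epithelium vs. stroma.
summary = cells.groupby(["compartment", "phenotype"]).size().unstack(fill_value=0)
print(summary)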

    Deep learning applications in the prostate cancer diagnostic pathway

    Prostate cancer (PCa) is the second most frequently diagnosed cancer in men worldwide and the fifth leading cause of cancer death in men, with an estimated 1.4 million new cases in 2020 and 375,000 deaths. The risk factors most strongly associated with PCa are advancing age, family history, race, and mutations of the BRCA genes. Since these risk factors are not preventable, early and accurate diagnosis is a key objective of the PCa diagnostic pathway. In the UK, clinical guidelines recommend multiparametric magnetic resonance imaging (mpMRI) of the prostate for use by radiologists to detect, score, and stage lesions that may correspond to clinically significant PCa (CSPCa), prior to confirmatory biopsy and histopathological grading. Computer-aided diagnosis (CAD) of PCa using artificial intelligence algorithms holds a currently unrealized potential to improve upon the diagnostic accuracy achievable by radiologist assessment of mpMRI, improve reporting consistency between radiologists, and reduce reporting time. In this thesis, we build and evaluate deep learning-based CAD systems for the PCa diagnostic pathway, which address gaps identified in the literature. First, we introduce a novel patient-level classification framework, PCF, which uses a stacked ensemble of convolutional neural networks (CNNs) and support vector machines (SVMs) to assign a probability of having CSPCa to patients, using mpMRI and clinical features. Second, we introduce AutoProstate, a deep learning-powered framework for automated PCa assessment and reporting; AutoProstate uses biparametric MRI and clinical data to populate an automatic diagnostic report containing segmentations of the whole prostate, prostatic zones, and candidate CSPCa lesions, as well as several derived characteristics that are clinically valuable. Finally, as automatic segmentation algorithms have not yet reached the robustness required for clinical use, we introduce interactive click-based segmentation applications for the whole prostate and prostatic lesions, with potential uses in diagnosis, active surveillance progression monitoring, and treatment planning.
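
    A hedged sketch of the stacked CNN + SVM idea behind a framework like PCF: a small CNN scores mpMRI slices, the slice scores are pooled per patient and concatenated with clinical features, and an SVM produces the patient-level CSPCa probability. The architecture, pooling, clinical features and data below are illustrative stand-ins, not the thesis implementation.

# Hedged sketch of a stacked CNN + SVM patient-level classifier. All choices here
# (tiny CNN, max/mean pooling of slice scores, two clinical features, toy data)
# are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
from sklearn.svm import SVC

class SliceCNN(nn.Module):
    """Tiny CNN that maps a single-channel MRI slice to a lesion-suspicion score."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, x):                              # x: (batch, 1, H, W)
        return torch.sigmoid(self.head(self.features(x).flatten(1)))

cnn = SliceCNN().eval()

def patient_feature_vector(volume, clinical):
    """Pool slice-wise CNN scores and append clinical features (e.g. PSA, age)."""
    with torch.no_grad():
        slices = torch.from_numpy(volume).float().unsqueeze(1)   # (n_slices, 1, H, W)
        scores = cnn(slices).squeeze(1).numpy()
    return np.concatenate([[scores.max(), scores.mean()], clinical])

# Toy training data: random "volumes" plus two clinical features per patient.
rng = np.random.default_rng(0)
X = np.stack([patient_feature_vector(rng.normal(size=(20, 64, 64)),
                                     rng.normal(size=2)) for _ in range(40)])
y = rng.integers(0, 2, size=40)                        # 1 = clinically significant PCa (toy)

svm = SVC(probability=True).fit(X, y)                  # second-stage (stacked) classifier
print("CSPCa probability:", svm.predict_proba(X[:1])[0, 1])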

    Automatic classification of prostate cancer Gleason scores from biparametric MRI using deep convolutional neural networks

    Prostate cancer is one of the most common types of cancer in the world. To reduce the number of deaths it causes, effective diagnostic methods that detect clinically significant cases early enough are of paramount importance. The current diagnostic protocols include, among other methods, magnetic resonance imaging, which can be used to assess whether a patient suffers from prostate cancer and whether any cancer lesions are clinically significant. However, the images are difficult to interpret, and thus inter-reader reliability is not very good. To address this problem, in this thesis machine learning models are trained to automatically segment and classify prostate cancer lesions from magnetic resonance images. The problem proved to be difficult even for computers, at least with the relatively small data set available. The highest Dice similarity coefficients for the Gleason score groups used approached 0.4, which is not enough to replace the work of professionals or even provide meaningful help for doctors. In conclusion, the task of automatic segmentation and classification of prostate cancer lesions remains an open problem. Improving the performance to a useful level would likely require a noticeably larger dataset or at least a model that better incorporates the knowledge of trained professionals.
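
    For reference, the Dice similarity coefficient quoted above measures the overlap between a predicted lesion mask and the reference annotation; a minimal implementation is sketched below on toy masks.

# Minimal sketch of the evaluation metric used above: the Dice similarity
# coefficient between a predicted lesion mask and the reference annotation.
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2*|A ∩ B| / (|A| + |B|) for binary masks; 1.0 if both are empty."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy example: two overlapping square "lesions" on a 64x64 slice.
pred = np.zeros((64, 64)); pred[10:30, 10:30] = 1
truth = np.zeros((64, 64)); truth[15:35, 15:35] = 1
print(f"Dice = {dice_coefficient(pred, truth):.2f}")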