13 research outputs found

    Computer vision enables short- and long-term analysis of <i>Lophelia pertusa</i> polyp behaviour and colour from an underwater observatory

    An array of sensors, including an HD camera mounted on a Fixed Underwater Observatory (FUO), was used to monitor a cold-water coral (Lophelia pertusa) reef in the Lofoten-Vesterålen area from April to November 2015. Image processing and deep learning enabled the extraction of time series describing changes in coral colour and polyp activity (feeding). The image data were analysed together with data from the other sensors from the same period to provide new insights into the short- and long-term dynamics of polyp features. The results indicate that diurnal variations and tidal currents influenced polyp activity by controlling the food supply. On a longer time scale, the coral's tissue colour changed from white in the spring to slightly red during the summer months, which can be explained by a seasonal change in food supply. Our work shows that, using an effective integrative computational approach, the image time series is a new and rich source of information for understanding and monitoring the dynamics of underwater environments, thanks to the high temporal resolution and coverage enabled by FUOs.
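The colour time series described in this abstract can be sketched in a few lines. Everything below (the frame arrays, a precomputed coral mask, the function name) is an illustrative assumption: the paper's actual pipeline uses deep learning to locate the polyps, which is not reproduced here.

```python
import numpy as np

def colour_time_series(frames, mask):
    """Mean RGB of the coral region in each frame.

    frames: iterable of HxWx3 uint8 arrays (one per time step)
    mask:   HxW boolean array marking coral pixels
    (names are illustrative, not from the paper)
    """
    series = []
    for frame in frames:
        pixels = frame[mask]            # N x 3 coral pixels
        series.append(pixels.mean(axis=0))
    return np.array(series)             # T x 3 mean colour per frame

# toy usage: two synthetic frames, "coral" in the top-left corner
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
white = np.full((4, 4, 3), 255, dtype=np.uint8)
reddish = white.copy()
reddish[..., 1:] = 180                  # lower G and B -> slightly red
ts = colour_time_series([white, reddish], mask)
```

Plotted over months of frames, such a series would show the white-to-red seasonal shift the study reports.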

    Principles of Bioimage Informatics: Focus on Machine Learning of Cell Patterns

    Abstract. The field of bioimage informatics concerns the development and use of methods for computational analysis of biological images. Traditionally, analysis of such images has been done manually. Manual annotation is, however, slow, expensive, and often highly variable from one expert to another. Furthermore, with modern automated microscopes, hundreds to thousands of images can be collected per hour, making manual analysis infeasible. This field borrows from the pattern recognition and computer vision literature (which contain many techniques for image processing and recognition), but has its own unique challenges and tradeoffs. Fluorescence microscopy images represent perhaps the largest class of biological images for which automation is needed. For this modality, typical problems include cell segmentation, classification of phenotypical response, or decisions regarding differentiated responses (treatment vs. control setting). This overview focuses on the problem of subcellular location determination as a running example, but the techniques discussed are often applicable to other problems.

    An Adaptive Tissue Characterisation Network for Model-Free Visualisation of Dynamic Contrast-Enhanced Magnetic Resonance Image Data

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) has become an important source of information to aid cancer diagnosis. Nevertheless, due to the multi-temporal nature of the three-dimensional volume data obtained from DCE-MRI, evaluation of the image data is a challenging task and tools are required to support the human expert. We investigate an approach for automatic localization and characterization of suspicious lesions in DCE-MRI data. It applies an artificial neural network (ANN) architecture which combines unsupervised and supervised techniques for voxel-by-voxel classification of temporal kinetic signals. The algorithm is easy to implement, allows for fast training and application even for huge data sets and can be directly used to augment the display of DCE-MRI data. To demonstrate that the system provides a reasonable assessment of kinetic signals, the outcome is compared with the results obtained from the model-based three-time-points (3TP) technique, which represents a clinical standard protocol for analysing breast cancer lesions. The evaluation based on the DCE-MRI data of 12 cases indicates that, although the ANN is trained with imprecisely labeled data, the approach leads to an outcome conforming with 3TP without presupposing an explicit model of the underlying physiological process.
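The per-voxel workflow this abstract describes can be illustrated with a deliberately simplified stand-in. The paper combines unsupervised and supervised ANN techniques; below, a nearest-prototype rule replaces the trained network, and the prototype curves and labels are invented purely for illustration.

```python
import numpy as np

# Invented prototype intensity-time curves (t = 4 time points).
prototypes = np.array([
    [0.0, 0.2, 0.3, 0.35],   # slow, persistent enhancement
    [0.0, 0.8, 0.7, 0.5],    # fast wash-in followed by wash-out
])
labels = ["benign-like", "malignant-like"]

def classify_signal(signal):
    """Assign an intensity-time curve to its nearest prototype."""
    d = np.linalg.norm(prototypes - signal, axis=1)
    return labels[int(np.argmin(d))]

def classify_volume(volume):
    """volume: X x Y x Z x T signal intensities; returns X x Y x Z labels."""
    x, y, z, t = volume.shape
    flat = volume.reshape(-1, t)
    return np.array([classify_signal(s) for s in flat]).reshape(x, y, z)

# toy volume: one voxel with a wash-out curve, one flat voxel
vol = np.zeros((1, 1, 2, 4))
vol[0, 0, 0] = [0.0, 0.75, 0.65, 0.45]
result = classify_volume(vol)
```

The same voxel-by-voxel loop is what allows the output to be overlaid directly on the DCE-MRI display, as the abstract notes.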

    Pseudo-color visualizations of DCE-MR image series for MR mammography

    In recent years, dynamic contrast-enhanced magnetic resonance (DCE-MR) imaging has become a valuable tool for the detection, diagnosis and management of breast cancer. Several criteria for describing morphologic and dynamic characteristics of suspiciously enhancing tissue regions have been collected in the ACR BI-RADS MRI lexicon. However, evaluation of these criteria is a challenging task for human observers due to the huge amount and the multi-temporal nature of the image data. Therefore, computer-aided diagnosis (CAD) tools based on artificial neural networks (ANN) or pharmacokinetic models receive growing attention from the radiologic community. In DCE-MR imaging, each voxel is associated with a vector s = (s1, . . . , st) reflecting the temporal variation of the local signal intensity after intravenous administration of a contrast agent (Gd-DTPA). Due to changes in their vascular structure, benign and malignant tissue expose characteristic intensity-time curves (ITC). These curves enable radiologists to infer information about the tissue state from the image data, a time-consuming task owing to the heterogeneity of cancerous tissue. To aid evaluation of DCE-MR image series, we propose a pseudo-color visualization of the temporal information based on ANNs. An ANN is trained with labeled ITCs sampled from a number of histologically verified training cases to classify each temporal signal s_x,y,z as being indicative of malignant (m), normal (n) or benign (b) tissue according to the returned posterior probabilities p(m|s_x,y,z), p(n|s_x,y,z) and p(b|s_x,y,z). Pseudo-color visualizations of unseen image series are computed by displaying suspiciously enhancing voxels with RGB colors reflecting the ANN-based signal assessment: bright red, green and blue voxels indicate high p(m|s_x,y,z), p(n|s_x,y,z) and p(b|s_x,y,z) values, respectively. In this way, the temporal characteristics of tissue regions are revealed, enabling radiologists to assess the architecture of lesions by means of a single 3D color image.
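The pseudo-colouring rule the abstract describes (posteriors p(m|s), p(n|s), p(b|s) mapped to the red, green and blue channels) is simple to sketch. The posteriors are assumed to be given here; the paper obtains them from a trained ANN, which is not reproduced.

```python
import numpy as np

def pseudo_colour(p_m, p_n, p_b):
    """Map per-voxel posteriors (each in [0, 1]) to N x 3 uint8 RGB.

    p_m, p_n, p_b: arrays of posterior probabilities for malignant,
    normal and benign tissue (illustrative inputs, assumed precomputed).
    """
    rgb = np.stack([p_m, p_n, p_b], axis=-1)   # posteriors -> R, G, B
    return (255 * np.clip(rgb, 0.0, 1.0)).astype(np.uint8)

# a voxel the classifier deems almost certainly malignant -> bright red
colours = pseudo_colour(np.array([0.9]), np.array([0.05]), np.array([0.05]))
```

Because the three channels sum to the probability mass, ambiguous voxels come out as mixed, dimmer colours, which is what lets a radiologist read uncertainty directly off the image.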

    Automatic Segmentation of Unstained Living Cells in Bright-Field Microscope Images

    Tscherepanow M, Zöllner F, Hillebrand M, Kummert F. Automatic Segmentation of Unstained Living Cells in Bright-Field Microscope Images. In: Perner P, Salvetti O, eds. Proceedings of the International Conference on Mass-Data Analysis of Images and Signals (MDA). Berlin: Springer; 2008: 158-172.
    The automatic subcellular localisation of proteins in living cells is a critical step in determining their function. The evaluation of fluorescence images constitutes a common method of localising these proteins. For this, additional knowledge about the position of the considered cells within an image is required. In an automated system, it is advantageous to recognise these cells in bright-field microscope images taken in parallel with the corresponding fluorescence micrographs. Unfortunately, currently available cell recognition methods are only of limited use within the context of protein localisation, since they frequently require microscopy techniques that yield images of higher contrast (e.g. phase-contrast microscopy or additional dyes) or can only be employed at magnifications that are too low. Therefore, this article introduces a novel approach to the robust automatic recognition of unstained living cells in bright-field microscope images. Here, the focus is on the automatic segmentation of the cells.
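To make the low-contrast problem concrete, here is a generic baseline that is emphatically NOT the paper's method: unstained cells in bright-field images differ only slightly from the background, so a naive intensity threshold is the kind of weak baseline the article's segmentation approach improves upon. The image data and threshold rule are invented for illustration.

```python
import numpy as np

def naive_cell_mask(image):
    """Mark pixels darker than the background estimate as 'cell'.

    In bright-field images the background dominates, so its median is a
    crude background estimate; unstained cells are only slightly darker,
    which is exactly why this naive rule is fragile in practice.
    """
    background = np.median(image)
    return image < 0.9 * background

img = np.full((8, 8), 200.0)       # bright, uniform background
img[2:5, 2:5] = 120.0              # a darker, cell-like blob
mask = naive_cell_mask(img)
```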