Automated artemia length measurement using U-shaped fully convolutional networks and second-order anisotropic Gaussian kernels
The brine shrimp Artemia, a small crustacean zooplankton organism, is universally used as live prey for larval fish and shrimps in aquaculture. In Artemia studies, it would be highly desirable to have access to automated techniques for obtaining length information from Artemia images. However, this problem has so far not been addressed in the literature. Moreover, conventional image-based length measurement approaches cannot be readily transferred to measure the Artemia length, due to the distortion of non-rigid bodies, the variation over growth stages, and the interference from the antennae and other appendages. To address this problem, we compile a dataset containing 250 images as well as the corresponding label maps of length measuring lines. We propose an automated Artemia length measurement method using U-shaped fully convolutional networks (UNet) and second-order anisotropic Gaussian kernels. For a given Artemia image, the designed UNet model is used to extract a length measuring line structure, and, subsequently, the second-order Gaussian kernels are employed to transform the length measuring line structure into a thin measuring line. For comparison, we also follow conventional fish length measurement approaches and develop a non-learning-based method using mathematical morphology and polynomial curve fitting. We evaluate the proposed method and the competing methods on 100 test images taken from the compiled dataset. Experimental results show that the proposed method can accurately measure the length of Artemia objects in images, obtaining a mean absolute percentage error of 1.16%.
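A second-order anisotropic Gaussian kernel of the kind used here responds strongly to thin elongated line structures. The following sketch builds such a kernel with NumPy; the kernel size, sigmas, rotation handling, and zero-mean normalization are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def second_order_anisotropic_gaussian(size=21, sigma_u=1.5, sigma_v=4.0, theta=0.0):
    """Second-order derivative (along u) of an anisotropic Gaussian, rotated by theta.

    sigma_u < sigma_v makes the kernel elongated along v, so its peak
    response occurs on thin line structures oriented at angle theta.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates by theta
    u = x * np.cos(theta) + y * np.sin(theta)
    v = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(u**2 / (2 * sigma_u**2) + v**2 / (2 * sigma_v**2)))
    # Analytic second derivative of the Gaussian with respect to u
    kernel = (u**2 / sigma_u**4 - 1.0 / sigma_u**2) * g
    # Remove the DC component so flat image regions give no response
    return kernel - kernel.mean()

# A small filter bank over orientations; the per-pixel maximum response
# across orientations highlights the measuring line at any local angle.
bank = [second_order_anisotropic_gaussian(theta=t)
        for t in np.linspace(0, np.pi, 8, endpoint=False)]
```

In a full pipeline one would convolve the UNet output with each oriented kernel and take the orientation-wise maximum before thinning; the orientation count of 8 here is an arbitrary choice.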
Cutting out the middleman: measuring nuclear area in histopathology slides without segmentation
The size of nuclei in histological preparations from excised breast tumors is
predictive of patient outcome (large nuclei indicate poor outcome).
Pathologists take into account nuclear size when performing breast cancer
grading. In addition, the mean nuclear area (MNA) has been shown to have
independent prognostic value. The straightforward approach to measuring nuclear
size is by performing nuclei segmentation. We hypothesize that given an image
of a tumor region with known nuclei locations, the area of the individual
nuclei and region statistics such as the MNA can be reliably computed directly
from the image data by employing a machine learning model, without the
intermediate step of nuclei segmentation. Towards this goal, we train a deep
convolutional neural network model that is applied locally at each nucleus
location, and can reliably measure the area of the individual nuclei and the
MNA. Furthermore, we show how such an approach can be extended to perform
combined nuclei detection and measurement, which is reminiscent of
granulometry.
Comment: Conditionally accepted for MICCAI 201
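The segmentation-free idea above — crop a fixed-size patch at each known nucleus location, regress an area from it, then aggregate into the MNA — can be sketched as follows. The trivial intensity-sum regressor stands in for the paper's deep CNN and is purely a hypothetical placeholder.

```python
import numpy as np

def extract_patch(image, center, size=32):
    """Crop a fixed-size patch centered on a nucleus location (zero-padded at borders)."""
    half = size // 2
    padded = np.pad(image, half, mode="constant")
    r, c = center  # (row, col) in original image coordinates
    return padded[r:r + size, c:c + size]

def predict_area(patch):
    # Stand-in regressor: the paper trains a deep CNN for this step; here we
    # use the summed foreground intensity as a hypothetical proxy for area.
    return float(patch.sum())

def mean_nuclear_area(image, centers):
    """MNA: average the per-nucleus area predictions over all known locations."""
    return float(np.mean([predict_area(extract_patch(image, c))
                          for c in centers]))
```

The point of the structure is that no per-pixel nucleus mask is ever produced: each patch maps directly to a scalar area, and region statistics such as the MNA follow by simple aggregation.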
Intima-Media Thickness: Setting a Standard for a Completely Automated Method of Ultrasound Measurement
The intima-media thickness (IMT) of the common carotid artery is a widely used clinical marker of severe cardiovascular diseases. IMT is usually measured manually on longitudinal B-mode ultrasound images. Many computer-based techniques for IMT measurement have been proposed to overcome the limits of manual segmentation. Most of these, however, require a certain degree of user interaction. In this paper we describe a new completely automated layers extraction (CALEXia) technique for the segmentation and IMT measurement of the carotid wall in ultrasound images. CALEXia is based on an integrated approach consisting of feature extraction, line fitting, and classification that enables the automated tracing of the carotid adventitial walls. IMT is then measured by relying on a fuzzy K-means classifier. We tested CALEXia on a database of 200 images and compared its performance to that of a previously developed methodology based on signal analysis (CULEXsa). Three trained operators manually segmented the images, and the average profiles were taken as the ground truth. The average errors of CALEXia for the lumen-intima (LI) and media-adventitia (MA) interface tracings were 1.46 ± 1.51 pixels (0.091 ± 0.093 mm) and 0.40 ± 0.87 pixels (0.025 ± 0.055 mm), respectively. The corresponding errors for CULEXsa were 0.55 ± 0.51 pixels (0.035 ± 0.032 mm) and 0.59 ± 0.46 pixels (0.037 ± 0.029 mm). The IMT measurement error was 0.87 ± 0.56 pixels (0.054 ± 0.035 mm) for CALEXia and 0.12 ± 0.14 pixels (0.01 ± 0.01 mm) for CULEXsa. Thus, CALEXia showed limited performance in segmenting the LI interface, but outperformed CULEXsa on the MA interface and in the number of images correctly processed (10 for CALEXia and 16 for CULEXsa). Since the two methods rely on complementary strategies, we anticipate fusing them for further IMT measurement improvement.
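Once the LI and MA interfaces have been traced, the IMT itself is just the mean depth difference between the two profiles, converted to millimeters. A minimal sketch, assuming both interfaces are sampled at the same columns; the default pixel spacing of 0.0625 mm is an assumption consistent with the pixel/mm pairs quoted in the abstract (e.g. 0.40 px ≈ 0.025 mm):

```python
import numpy as np

def imt_from_profiles(li, ma, mm_per_pixel=0.0625):
    """Mean intima-media thickness from traced interface profiles.

    li, ma : interface row coordinates (in pixels) along the vessel,
             sampled at the same columns; MA lies deeper than LI.
    Returns (mean thickness in pixels, mean thickness in mm).
    """
    li = np.asarray(li, dtype=float)
    ma = np.asarray(ma, dtype=float)
    thickness_px = ma - li           # per-column wall thickness
    imt_px = float(thickness_px.mean())
    return imt_px, imt_px * mm_per_pixel
```

Averaging per-column distances rather than measuring at a single point makes the estimate robust to local tracing noise in either interface.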
Fuzzy-based Propagation of Prior Knowledge to Improve Large-Scale Image Analysis Pipelines
Many automatically analyzable scientific questions are well-posed and offer a
variety of information about the expected outcome a priori. Although often
being neglected, this prior knowledge can be systematically exploited to make
automated analysis operations sensitive to a desired phenomenon or to evaluate
extracted content with respect to this prior knowledge. For instance, the
performance of processing operators can be greatly enhanced by a more focused
detection strategy and the direct information about the ambiguity inherent in
the extracted data. We present a new concept for the estimation and propagation
of uncertainty involved in image analysis operators. This allows using simple
processing operators that are suitable for analyzing large-scale 3D+t
microscopy images without compromising the result quality. On the foundation of
fuzzy set theory, we transform available prior knowledge into a mathematical
representation and extensively use it to enhance the result quality of various
processing operators. All presented concepts are illustrated on a typical
bioimage analysis pipeline comprised of seed point detection, segmentation,
multiview fusion and tracking. Furthermore, the functionality of the proposed
approach is validated on a comprehensive simulated 3D+t benchmark data set that
mimics embryonic development and on large-scale light-sheet microscopy data of
a zebrafish embryo. The general concept introduced in this contribution
represents a new approach to efficiently exploit prior knowledge to improve the
result quality of image analysis pipelines. In particular, the automated analysis
of terabyte-scale microscopy data will benefit from sophisticated and efficient
algorithms that enable a quantitative and fast readout. The generality of the
concept, however, makes it also applicable to practically any other field with
processing strategies that are arranged as linear pipelines.
Comment: 39 pages, 12 figures
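A common way to turn prior knowledge into the fuzzy representation described above is a trapezoidal membership function: detections whose properties fall in the fully plausible range get weight 1, with linear ramps to 0 outside it. The sketch below applies this to a hypothetical prior on expected object radius; the breakpoint values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def trapezoidal_membership(x, a, b, c, d):
    """Fuzzy membership: 0 below a, rises to 1 on [a, b],
    stays 1 on [b, c], falls back to 0 on [c, d]."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / (b - a), 0.0, 1.0)
    fall = np.clip((d - x) / (d - c), 0.0, 1.0)
    return np.minimum(rise, fall)

# Hypothetical prior: seed points are plausible nuclei if their estimated
# radius lies in [3, 8] pixels, with soft transition zones [2, 3] and [8, 10].
radii = np.array([1.0, 4.0, 6.0, 12.0])
weights = trapezoidal_membership(radii, 2.0, 3.0, 8.0, 10.0)
```

Downstream operators (segmentation, fusion, tracking) can then propagate these weights instead of making hard keep/discard decisions, which is the core idea of making simple operators uncertainty-aware.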