Globally Optimal Cell Tracking using Integer Programming
We propose a novel approach to automatically tracking cell populations in
time-lapse images. To account for cell occlusions and overlaps, we introduce a
robust method that generates an over-complete set of competing detection
hypotheses. We then perform detection and tracking simultaneously on these
hypotheses by solving to optimality an integer program with only one type of
flow variables. This eliminates the need for heuristics to handle missed
detections due to occlusions and complex morphology. We demonstrate the
effectiveness of our approach on a range of challenging sequences consisting of
clumped cells and show that it outperforms state-of-the-art techniques.
Comment: Engin Türetken and Xinchao Wang contributed equally to this work.
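The flow formulation above can be pictured on a toy instance. This is a minimal sketch, not the paper's solver: the scores, the conflict sets, and the exhaustive search below are illustrative stand-ins for the actual integer program over flow variables.

```python
from itertools import product

# Toy instance: two frames, each with competing detection hypotheses.
# Hypotheses 0 and 1 in frame 1 overlap (conflict); likewise 2 and 3 in frame 2.
# A flow variable f[(i, j)] = 1 iff hypothesis i (frame 1) links to j (frame 2).
scores = {(0, 2): 0.9, (0, 3): 0.2, (1, 2): 0.4, (1, 3): 0.8}
conflicts = [{0, 1}, {2, 3}]  # at most one hypothesis per conflict set

def solve(scores, conflicts):
    """Exhaustive stand-in for the integer program: pick link (flow)
    variables that maximize total score, subject to each hypothesis being
    used at most once and each conflict set contributing at most one
    selected hypothesis."""
    best, best_links = 0.0, []
    edges = list(scores)
    for bits in product([0, 1], repeat=len(edges)):
        links = [e for e, b in zip(edges, bits) if b]
        used = [h for e in links for h in e]
        if len(used) != len(set(used)):      # each hypothesis used at most once
            continue
        if any(len(set(used) & c) > 1 for c in conflicts):
            continue
        s = sum(scores[e] for e in links)
        if s > best:
            best, best_links = s, links
    return best, best_links
```

The conflict sets are what makes the hypothesis set "over-complete": mutually exclusive interpretations of the same image region coexist as candidates, and the optimizer, rather than a heuristic, decides which survive.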
Particle detection and tracking in fluorescence time-lapse imaging: a contrario approach
This paper proposes a probabilistic approach for the detection and tracking
of particles in fluorescence time-lapse imaging. In the presence of very
noisy, poor-quality data, particles and trajectories can be
characterized by an a contrario model, which estimates the probability of
observing the structures of interest in random data. This approach, first
introduced in the modeling of human visual perception and then successfully
applied in many image processing tasks, leads to algorithms that require
neither a prior learning stage nor tedious parameter tuning and are very
robust to noise. Comparative evaluations against a well-established baseline
show that the proposed approach outperforms the state of the art.
Comment: Published in Journal of Machine Vision and Applications
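The a contrario principle mentioned above can be made concrete with a number-of-false-alarms (NFA) computation. A minimal sketch, assuming a binomial background model; the parameter names and the meaningfulness threshold are illustrative, not taken from the paper:

```python
from math import comb

def nfa(n_tests, n, k, p):
    """Number of False Alarms: the expected count, under a random (noise)
    model, of tests at least as structured as the observation.
    n_tests: number of candidate structures tested,
    n: points examined per candidate, k: points that fit the structure,
    p: probability that a random point fits by chance."""
    tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    return n_tests * tail

# A candidate trajectory is "epsilon-meaningful" if nfa(...) < epsilon
# (commonly epsilon = 1): structures too unlikely to arise in pure noise
# are kept, with no training stage and a single detection threshold.
```

The key property is that the only tunable quantity is the NFA threshold itself, which matches the abstract's claim of no learning stage and no tedious parameter tuning.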
Cell Segmentation and Tracking using CNN-Based Distance Predictions and a Graph-Based Matching Strategy
The accurate segmentation and tracking of cells in microscopy image sequences
is an important task in biomedical research, e.g., for studying the development
of tissues, organs or entire organisms. However, the segmentation of touching
cells in images with a low signal-to-noise ratio is still a challenging
problem. In this paper, we present a method for the segmentation of touching
cells in microscopy images. Using a novel representation of cell borders,
inspired by distance maps, our method is able to utilize not only touching
cells but also close cells in the training process. Furthermore, this
representation is notably robust to annotation errors and shows promising
results for the segmentation of microscopy images containing cell types that
are underrepresented in, or absent from, the training data. For the prediction of the
proposed neighbor distances, an adapted U-Net convolutional neural network
(CNN) with two decoder paths is used. In addition, we adapt a graph-based cell
tracking algorithm to evaluate our proposed method on the task of cell
tracking. The adapted tracking algorithm includes a movement estimation in the
cost function to re-link tracks with missing segmentation masks over a short
sequence of frames. Our combined tracking-by-detection method proved its
potential in the IEEE ISBI 2020 Cell Tracking Challenge
(http://celltrackingchallenge.net/), where, as team KIT-Sch-GE, we achieved
multiple top-three rankings, including two top performances, using a single
segmentation model for the diverse data sets.
Comment: 25 pages, 14 figures, methods of the team KIT-Sch-GE for the IEEE
ISBI 2020 Cell Tracking Challenge
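The movement estimation in the tracking cost function can be illustrated in miniature. A hedged sketch, assuming equal detection counts per frame and squared-distance costs; the brute-force assignment below is a stand-in for the actual graph-based matching strategy:

```python
from itertools import permutations

def match_cells(prev, velocities, curr):
    """Toy tracking-by-detection matcher: each cell's next position is
    predicted from its last position plus an estimated movement, and the
    current frame's detections are assigned to the prediction-minimizing
    permutation. Assumes len(prev) == len(curr)."""
    preds = [(x + vx, y + vy) for (x, y), (vx, vy) in zip(prev, velocities)]
    def cost(assign):
        return sum((px - curr[j][0])**2 + (py - curr[j][1])**2
                   for (px, py), j in zip(preds, assign))
    return min(permutations(range(len(curr))), key=cost)

# Two cells moving right and down; detections arrive in shuffled order.
prev = [(0, 0), (10, 0)]
velocities = [(2, 0), (0, 2)]
curr = [(10, 2), (2, 0)]
```

Predicting positions before matching is what lets a tracker re-link tracks across frames with missing masks: the prediction keeps advancing even when a detection is absent.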
Robot introspection through learned hidden Markov models
In this paper we describe a machine learning approach for acquiring a model of a robot's behaviour from raw sensor data. We are interested in automating the acquisition of behavioural models to provide a robot with an introspective capability. We assume that the behaviour of a robot in achieving a task can be modelled as a finite stochastic state transition system. Beginning with data recorded by a robot in the execution of a task, we use unsupervised learning techniques to estimate a hidden Markov model (HMM) that can be used both for predicting and explaining the behaviour of the robot in subsequent executions of the task. We demonstrate that it is feasible to automate the entire process of learning a high-quality HMM from the data recorded by the robot during execution of its task.

The learned HMM can be used both for monitoring and controlling the behaviour of the robot. The ultimate purpose of our work is to learn models for the full set of tasks associated with a given problem domain, and to integrate these models with a generative task planner. We want to show that these models can be used successfully in controlling the execution of a plan. However, this paper does not develop the planning and control aspects of our work, focussing instead on the learning methodology and the evaluation of a learned model. The essential property of the models we seek to construct is that the most probable trajectory through a model, given the observations made by the robot, accurately diagnoses, or explains, the behaviour that the robot actually performed when making these observations.

In the work reported here we consider a navigation task. We explain the learning process, the experimental setup and the structure of the resulting learned behavioural models. We then evaluate the extent to which explanations proposed by the learned models accord with a human observer's interpretation of the behaviour exhibited by the robot in its execution of the task.
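The "most probable trajectory through a model, given the observations" is exactly what the Viterbi algorithm computes. A minimal sketch: the states, observations, and all probabilities below are invented for illustration; a real system would learn them (e.g. via Baum-Welch) from the robot's recorded sensor data.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state trajectory given an observation
    sequence -- the 'explanation' role the learned HMM plays for
    robot behaviour. Probabilities are plain dicts."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, pr = max(((r, V[-1][r] * trans_p[r][s]) for r in states),
                           key=lambda t: t[1])
            col[s] = pr * emit_p[s][o]
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Invented two-state behaviour model: cruising vs. obstacle avoidance,
# observed through coarse motion commands ('fwd' / 'turn').
states = ('cruise', 'avoid')
start_p = {'cruise': 0.8, 'avoid': 0.2}
trans_p = {'cruise': {'cruise': 0.9, 'avoid': 0.1},
           'avoid':  {'cruise': 0.5, 'avoid': 0.5}}
emit_p = {'cruise': {'fwd': 0.9, 'turn': 0.1},
          'avoid':  {'fwd': 0.2, 'turn': 0.8}}
```

Running the decoder on a short command sequence yields the state sequence that best "explains" what the robot did, which is the diagnostic property the abstract asks of its learned models.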
EllipTrack: A Global-Local Cell-Tracking Pipeline for 2D Fluorescence Time-Lapse Microscopy
Time-lapse microscopy provides an unprecedented opportunity to monitor single-cell dynamics. However, tracking cells for long periods remains a technical challenge, especially for multi-day, large-scale movies with rapid cell migration, high cell density, and drug treatments that alter cell morphology/behavior. Here, we present EllipTrack, a global-local cell-tracking pipeline optimized for tracking such movies. EllipTrack first implements a global track-linking algorithm to construct tracks that maximize the probability of cell lineages. Tracking mistakes are then corrected with a local track-correction module in which tracks generated by the global algorithm are systematically examined and amended if a more probable alternative can be found. Through benchmarking, we show that EllipTrack outperforms state-of-the-art cell trackers and generates nearly error-free cell lineages for multiple large-scale movies. In addition, EllipTrack can adapt to time- and cell-density-dependent changes in cell migration speeds and requires minimal training datasets. EllipTrack is available at https://github.com/tianchengzhe/EllipTrack.
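The global-local idea can be shown in miniature: a local correction pass that tests small edits to globally proposed tracks and keeps any edit that raises the overall score. This is a toy sketch, not EllipTrack's module; the swap move and the scoring function are illustrative assumptions.

```python
def local_correction(tracks, score):
    """Hill-climbing local track correction: after a global linker
    proposes tracks, repeatedly test local edits (here, swapping the
    detections two tracks claim in one frame) and keep any edit that
    raises the total track score. `score` rates one track."""
    improved = True
    while improved:
        improved = False
        for a in range(len(tracks)):
            for b in range(a + 1, len(tracks)):
                for t in range(min(len(tracks[a]), len(tracks[b]))):
                    ta, tb = list(tracks[a]), list(tracks[b])
                    ta[t], tb[t] = tb[t], ta[t]
                    if score(ta) + score(tb) > score(tracks[a]) + score(tracks[b]):
                        tracks[a], tracks[b] = ta, tb
                        improved = True
    return tracks
```

Scoring a track by (negative) total positional jump, two tracks whose identities were swapped at one frame get untangled by this loop, which mirrors the abstract's "amended if a more probable alternative can be found".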
Automated Deep Lineage Tree Analysis Using a Bayesian Single Cell Tracking Approach
Single-cell methods are beginning to reveal the intrinsic heterogeneity in cell populations, arising from the interplay of deterministic and stochastic processes. However, it remains challenging to quantify single-cell behaviour from time-lapse microscopy data, owing to the difficulty of extracting reliable cell trajectories and lineage information over long time-scales and across several generations. Therefore, we developed a hybrid deep learning and Bayesian cell tracking approach to reconstruct lineage trees from live-cell microscopy data. We implemented a residual U-Net model coupled with a classification CNN to allow accurate instance segmentation of the cell nuclei. To track the cells over time and through cell divisions, we developed a Bayesian cell tracking methodology that uses input features from the images to enable the retrieval of multi-generational lineage information from a corpus of thousands of hours of live-cell imaging data. Using our approach, we extracted 20,000+ fully annotated single-cell trajectories from over 3,500 h of video footage, organised into multi-generational lineage trees spanning up to eight generations and fourth-cousin distances. Benchmarking tests, including lineage tree reconstruction assessments, demonstrate that our approach yields high-fidelity results with our data, with minimal requirement for manual curation. To demonstrate the robustness of our minimally supervised cell tracking methodology, we retrieve cell cycle durations and their extended inter- and intra-generational family relationships in 5,000+ fully annotated cell lineages. We observe vanishing cycle-duration correlations across ancestral relatives, yet reveal correlated cycling between cells sharing the same generation in extended lineages. These findings expand the depth and breadth of investigated cell lineage relationships, using approximately two orders of magnitude more data than previous studies of cell-cycle heritability, which relied on semi-manual lineage data analysis.
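The Bayesian track-building step rests on an ordinary posterior update over association hypotheses. A minimal sketch, with the hypothesis names and all probabilities invented for illustration rather than taken from the paper:

```python
def posterior(likelihoods, priors):
    """Bayes update over association hypotheses for one candidate link
    (e.g. the cell moved, divided, or was lost), a stand-in for the
    paper's Bayesian track builder. Inputs: per-hypothesis likelihoods
    of the observed image features and prior rates. Output: normalized
    posterior."""
    joint = {h: likelihoods[h] * priors[h] for h in priors}
    z = sum(joint.values())  # evidence (normalizing constant)
    return {h: v / z for h, v in joint.items()}
```

In a full tracker the likelihoods would come from image-derived features (motion, appearance, division cues), and the maximum-posterior hypothesis at each step assembles trajectories into multi-generational lineage trees.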
A graph-based cell tracking algorithm with few manually tunable parameters and automated segmentation error correction
Automatic cell segmentation and tracking makes it possible to gain quantitative insights into the processes driving cell migration. To investigate new data with minimal manual effort, cell tracking algorithms should be easy to apply and should reduce manual curation time by providing automatic correction of segmentation errors. Current cell tracking algorithms, however, are either easy to apply to new data sets but lack automatic segmentation error correction, or have a vast set of parameters that needs either manual tuning or annotated data for parameter tuning. In this work, we propose a tracking algorithm with only a few manually tunable parameters and automatic segmentation error correction. Moreover, no training data is needed. We compare the performance of our approach to three well-performing tracking algorithms from the Cell Tracking Challenge on data sets with simulated, degraded segmentation, including false negatives and over- and under-segmentation errors. Our tracking algorithm can correct false negatives, over- and under-segmentation errors, as well as a mixture of the aforementioned segmentation errors. On data sets with under-segmentation errors or a mixture of segmentation errors, our approach performs best. Moreover, without requiring additional manual tuning, our approach ranks several times in the top 3 in the 6th edition of the Cell Tracking Challenge.
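Under-segmentation correction of the kind described can be sketched as follows: a detection claimed by several predicted cell positions is flagged and duplicated. This is a toy stand-in for the paper's correction step; the radius test and the duplication (rather than an actual mask split) are illustrative simplifications.

```python
def correct_undersegmentation(preds, detections, radius):
    """Toy automatic error correction: if a single detection lies within
    `radius` of two or more predicted cell positions, treat it as an
    under-segmented merge of those cells and duplicate it once per
    nearby prediction (a real pipeline would re-split the mask)."""
    corrected = []
    for d in detections:
        near = [p for p in preds
                if (p[0] - d[0])**2 + (p[1] - d[1])**2 <= radius**2]
        corrected.extend([d] * max(1, len(near)))
    return corrected
```

Because the check is driven by the tracker's own predictions, it needs no training data and only one manually chosen parameter (the gating radius), in the spirit of the abstract.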