Globally Optimal Cell Tracking using Integer Programming
We propose a novel approach to automatically tracking cell populations in
time-lapse images. To account for cell occlusions and overlaps, we introduce a
robust method that generates an over-complete set of competing detection
hypotheses. We then perform detection and tracking simultaneously on these
hypotheses by solving to optimality an integer program with a single type of
flow variable. This eliminates the need for heuristics to handle missed
detections due to occlusions and complex morphology. We demonstrate the
effectiveness of our approach on a range of challenging sequences consisting of
clumped cells and show that it outperforms state-of-the-art techniques.
Comment: Engin Türetken and Xinchao Wang contributed equally to this work.
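The core idea above is that linking detections across frames is posed as a single global optimisation rather than a sequence of greedy, per-frame choices. The paper's actual formulation is an integer program over an over-complete hypothesis graph solved to optimality; the toy sketch below (all names and costs are illustrative, not from the paper) shows the same principle on the simplest case, exhaustively searching all one-to-one assignments between two consecutive frames and returning the minimum-cost one:

```python
from itertools import permutations

def globally_optimal_linking(cost):
    """Exhaustively search all one-to-one assignments of detections
    between two consecutive frames and return the minimum-cost one.
    Illustrative stand-in for the paper's integer program, which
    additionally handles many frames and competing hypotheses."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[i][j] for i, j in enumerate(perm))
        if c < best_cost:
            best_cost, best_perm = c, perm
    return list(best_perm), best_cost

# Hypothetical pairwise linking costs (e.g. squared centroid distances)
# between three detections in frame t (rows) and frame t+1 (columns).
cost = [
    [1.0, 4.0, 9.0],
    [4.0, 1.0, 4.0],
    [9.0, 4.0, 1.0],
]
links, total = globally_optimal_linking(cost)
# Here the diagonal assignment is optimal: each cell links to its
# nearest successor, at total cost 3.0.
```

Brute force is exponential in the number of detections; the point of the integer-programming formulation is that an ILP solver finds the same global optimum efficiently on realistic problem sizes.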
Generative modeling of living cells with SO(3)-equivariant implicit neural representations
Data-driven cell tracking and segmentation methods in biomedical imaging
require diverse and information-rich training data. In cases where the number
of training samples is limited, synthetic computer-generated data sets can be
used to improve these methods. This requires the synthesis of cell shapes as
well as corresponding microscopy images using generative models. To synthesize
realistic living cell shapes, the shape representation used by the generative
model should be able to accurately represent fine details and changes in
topology, which are common in cells. These requirements are not met by 3D voxel
masks, which are restricted in resolution, and polygon meshes, which do not
easily model processes like cell growth and mitosis. In this work, we propose
to represent living cell shapes as level sets of signed distance functions
(SDFs) which are estimated by neural networks. We optimize a fully-connected
neural network to provide an implicit representation of the SDF value at any
point in a 3D+time domain, conditioned on a learned latent code that is
disentangled from the rotation of the cell shape. We demonstrate the
effectiveness of this approach on cells that exhibit rapid deformations
(Platynereis dumerilii), cells that grow and divide (C. elegans), and cells
that have growing and branching filopodial protrusions (A549 human lung
carcinoma cells). A quantitative evaluation using shape features, Hausdorff
distance, and Dice similarity coefficients of real and synthetic cell shapes
shows that our model can generate topologically plausible complex cell shapes
in 3D+time with high similarity to real living cell shapes. Finally, we show
how microscopy images of living cells that correspond to our generated cell
shapes can be synthesized using an image-to-image model.
Comment: Medical Image Analysis 2023 (Submitted).
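The key property claimed for SDFs above is that the cell surface is the zero level set of a continuous function, so topology changes such as mitosis come for free. The paper estimates the SDF with a neural network conditioned on a latent code; the analytic toy below (all shapes and parameters are invented for illustration) shows why the representation handles a split naturally, modelling a dividing cell as the union (pointwise minimum) of two sphere SDFs whose centres drift apart over time:

```python
import math

def sphere_sdf(p, center, radius):
    """Signed distance to a sphere: negative inside, zero on the surface,
    positive outside."""
    return math.dist(p, center) - radius

def dividing_cell_sdf(p, t):
    """Toy 3D+time shape: one cell at t=0 splits into two as t -> 1.
    The union of two spheres is the pointwise min of their SDFs; the
    cell surface at time t is the zero level set of this function."""
    offset = 1.5 * t  # the two daughter cells drift apart over time
    d1 = sphere_sdf(p, (-offset, 0.0, 0.0), 1.0)
    d2 = sphere_sdf(p, (offset, 0.0, 0.0), 1.0)
    return min(d1, d2)

# At t=0 the midpoint lies inside the single cell (negative SDF);
# at t=1 the daughters have separated and the midpoint is outside
# (positive SDF) -- a topology change with no remeshing required.
inside_before = dividing_cell_sdf((0.0, 0.0, 0.0), 0.0)
outside_after = dividing_cell_sdf((0.0, 0.0, 0.0), 1.0)
```

A polygon mesh would need explicit surgery (cutting and restitching faces) to represent the same split, which is exactly the limitation the abstract cites for mesh representations.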
Methods for Spatio-Temporal Analysis of Embryo Cleavage In Vitro
Automated or semiautomated time-lapse analysis of early stage embryo images during the cleavage stage can give insight into the timing of mitosis, the regularity of division timing and pattern, and cell lineage. Simultaneous monitoring of molecular processes enables the study of connections between genetic expression and cell physiology and development. The study of live embryos places new demands not only on the hardware and embryo-holding equipment but also, indirectly, on analytical software and data analysis, as four-dimensional video sequencing of embryos easily produces large quantities of data. The ability to continuously film and automatically analyze growing embryos gives new insights into temporal embryo development through the study of morphokinetics as well as morphology. Until recently, this was not possible except through a tedious manual process. In recent years, several methods have been developed that enable this dynamic monitoring of live embryos. Here we describe three methods with variations in hardware and software analysis and give examples of the outcomes. Together, these methods open a window to new information in developmental embryology, as embryo division pattern and lineage are studied in vivo.
Multi-StyleGAN: Towards Image-Based Simulation of Time-Lapse Live-Cell Microscopy
Time-lapse fluorescent microscopy (TLFM) combined with predictive
mathematical modelling is a powerful tool to study the inherently dynamic
processes of life on the single-cell level. Such experiments are costly,
complex and labour intensive. A complementary approach, and a step towards in
silico experimentation, is to synthesise the imagery itself. Here, we propose
Multi-StyleGAN as a descriptive approach to simulate time-lapse fluorescence
microscopy imagery of living cells, based on a past experiment. This novel
generative adversarial network synthesises a multi-domain sequence of
consecutive timesteps. We showcase Multi-StyleGAN on imagery of multiple live
yeast cells in microstructured environments and train on a dataset recorded in
our laboratory. The simulation captures underlying biophysical factors and time
dependencies, such as cell morphology, growth, physical interactions, as well
as the intensity of a fluorescent reporter protein. An immediate application is
to generate additional training and validation data for feature extraction
algorithms or to aid and expedite development of advanced experimental
techniques such as online monitoring or control of cells.
Code and dataset is available at
https://git.rwth-aachen.de/bcs/projects/tp/multi-stylegan.
Comment: revised -- accepted to MICCAI 2021. (Tim Prangemeier and Christoph Reich contributed equally.)