Recent advances in imaging and high-performance computing have made it
possible to image the entire human brain at the cellular level. This provides
the basis for studying the brain's multi-scale architecture, from its
subdivision into brain areas and nuclei, cortical layers, columns, and cell
clusters, down to single-cell morphology. Methods for brain mapping and cell
segmentation exploit such images to enable rapid and automated analysis of
cytoarchitecture and cell distribution in complete series of histological
sections. However, inevitable processing artifacts caused by missing sections,
tears in the tissue, or staining variations remain the primary source of gaps
in the resulting image data. To address this, we
aim to provide a model that can fill in missing information in a reliable way,
following the true cell distribution at different scales. Inspired by the
recent success in image generation, we propose a denoising diffusion
probabilistic model (DDPM), trained on light-microscopic scans of cell-body
stained sections. We extend this model with the RePaint method to impute
missing image data or replace corrupted regions. We show that our trained DDPM
generates highly realistic image content for this purpose, yielding plausible
cell statistics and cytoarchitectonic patterns. We validate its outputs using
two established downstream task models trained on the same data.

Comment: Submitted to ISBI-202