Review: Deep learning in electron microscopy
Deep learning is transforming most areas of science and technology, including electron microscopy. This review paper offers a practical perspective aimed at developers with limited familiarity with deep learning. For context, we review popular applications of deep learning in electron microscopy. Following this, we discuss the hardware and software needed to get started with deep learning and to interface with electron microscopes. We then review neural network components, popular architectures, and their optimization. Finally, we discuss future directions of deep learning in electron microscopy.
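As an illustration of the kind of workflow the review targets (this sketch is not taken from the paper; the network, data shapes, and hyperparameters are arbitrary placeholders), a small convolutional network can be trained in PyTorch to map noisy micrographs to clean ones:

    import torch
    from torch import nn

    # Tiny convolutional denoiser; real applications would use larger
    # architectures and batches of experimental or simulated image pairs.
    model = nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),
    )
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Stand-in data: random tensors in place of (noisy, clean) micrograph pairs.
    noisy = torch.rand(16, 1, 64, 64)
    clean = torch.rand(16, 1, 64, 64)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(noisy), clean)   # mean-squared error between output and target
        loss.backward()                       # backpropagate gradients
        optimizer.step()                      # update network weights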
Single atom imaging with time-resolved electron microscopy
Developments in scanning transmission electron microscopy (STEM) have opened up new possibilities for time-resolved imaging at the atomic scale. However, rapid imaging of single atom dynamics brings with it a new set of challenges, particularly regarding noise and the interaction between the electron beam and the specimen. This thesis develops a set of analytical tools for capturing atomic motion and analyzing the dynamic behaviour of materials at the atomic scale.
Machine learning is increasingly playing an important role in the analysis of electron microscopy data. In this light, new unsupervised learning tools are developed here for noise removal under low-dose imaging conditions and for identifying the motion of surface atoms. The scope for real-time processing and analysis is also explored, which is of rising importance as electron microscopy datasets grow in size and complexity.
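As a rough illustration of what an unsupervised denoiser can look like (a generic patch-based PCA scheme, assumed here purely for demonstration and not the specific tools developed in the thesis), overlapping patches can be projected onto their leading principal components and averaged back into the frame:

    import numpy as np

    def pca_patch_denoise(img, patch=8, n_components=8):
        """Denoise a 2D image by keeping only the leading principal
        components of its overlapping patches (illustrative, unsupervised)."""
        h, w = img.shape
        patches = np.stack([img[i:i + patch, j:j + patch].ravel()
                            for i in range(h - patch + 1)
                            for j in range(w - patch + 1)])
        mean = patches.mean(axis=0)
        u, s, vt = np.linalg.svd(patches - mean, full_matrices=False)
        denoised = (u[:, :n_components] * s[:n_components]) @ vt[:n_components] + mean
        # Scatter the filtered patches back onto the image grid, averaging overlaps.
        out = np.zeros_like(img, dtype=float)
        count = np.zeros_like(img, dtype=float)
        k = 0
        for i in range(h - patch + 1):
            for j in range(w - patch + 1):
                out[i:i + patch, j:j + patch] += denoised[k].reshape(patch, patch)
                count[i:i + patch, j:j + patch] += 1
                k += 1
        return out / count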
These advances in image processing and analysis are combined with computational modelling to uncover new chemical and physical insights into the motion of atoms adsorbed onto surfaces. Of particular interest are systems for heterogeneous catalysis, where the catalytic activity can depend intimately on the atomic environment. The study of Cu atoms on a graphene oxide support reveals that the atoms undergo anomalous diffusion as a result of spatial and energetic disorder present in the substrate. The investigation is extended to examine the structure and stability of small Cu clusters on graphene oxide, with atomistic modelling used to understand the significant role played by the substrate. Finally, the analytical methods are used to study the surface reconstruction of silicon alongside the electron beam-induced motion of adatoms on the surface.
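An anomalous-diffusion claim of this kind rests on how the mean-squared displacement scales with lag time. A minimal sketch of that analysis (illustrative only, operating on a hypothetical tracked trajectory and not the thesis's actual pipeline) is:

    import numpy as np

    def msd(track):
        """Mean-squared displacement of a (T, 2) array of positions
        sampled at equally spaced times, for lags up to T // 4."""
        lags = np.arange(1, len(track) // 4)
        return lags, np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                               for lag in lags])

    def anomalous_exponent(track, dt=1.0):
        """Fit MSD(t) ~ K * t**alpha; alpha = 1 is normal diffusion,
        alpha < 1 indicates sub-diffusion on a disordered substrate."""
        lags, m = msd(track)
        alpha, _ = np.polyfit(np.log(lags * dt), np.log(m), 1)
        return alpha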
Taken together, these studies demonstrate the materials insights that can be obtained with time-resolved STEM imaging, and highlight the importance of combining state-of-the-art imaging with computational analysis and atomistic modelling to quantitatively characterize the behaviour of materials with atomic resolution.

The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP7/2007–2013)/ERC grant agreement 291522–3DIMAGE, as well as from the European Union Seventh Framework Programme under Grant Agreement 312483-ESTEEM2 (Integrated Infrastructure Initiative - I3).
The SURE-LET approach to image denoising
Denoising is an essential step prior to any higher-level image-processing task such as segmentation or object tracking, because corruption by noise is inherent to any physical acquisition device. When the measurements are performed by photosensors, one usually distinguishes between two main regimes: in the first scenario, the measured intensities are sufficiently high and the noise is assumed to be signal-independent; in the second scenario, only a few photons are detected, which leads to a strong signal-dependent degradation. When the noise is considered signal-independent, it is often modeled as an additive independent (typically Gaussian) random variable, whereas otherwise the measurements are commonly assumed to follow independent Poisson laws, whose underlying intensities are the unknown noise-free measurements.

We first consider the reduction of additive white Gaussian noise (AWGN). Contrary to most existing denoising algorithms, our approach does not require an explicit prior statistical model of the unknown data. Our driving principle is the minimization of a purely data-adaptive unbiased estimate of the mean-squared error (MSE) between the processed and the noise-free data. In the AWGN case, such an MSE estimate was first proposed by Stein and is known as "Stein's unbiased risk estimate" (SURE). We further develop the original SURE theory and propose a general methodology for fast and efficient multidimensional image denoising, which we call the SURE-LET approach. While SURE allows the quantitative monitoring of the denoising quality, the flexibility and the low computational complexity of our approach are ensured by a linear parameterization of the denoising process, expressed as a linear expansion of thresholds (LET). We propose several pointwise, multivariate, and multichannel thresholding functions applied to arbitrary (in particular, redundant) linear transformations of the input data, with a special focus on multiscale signal representations.

We then transpose the SURE-LET approach to the estimation of Poisson intensities degraded by AWGN. The signal-dependent nature of the Poisson statistics leads to the derivation of a new unbiased MSE estimate, which we call "Poisson's unbiased risk estimate" (PURE) and which requires more adaptive transform-domain thresholding rules. Within a general PURE-LET framework, we first devise a fast interscale thresholding method restricted to the use of the (unnormalized) Haar wavelet transform. We then lift this restriction and show how the PURE-LET strategy can be used to design and optimize a wide class of nonlinear processing applied in an arbitrary (in particular, redundant) transform domain.

We finally apply some of the proposed denoising algorithms to real multidimensional fluorescence microscopy images. Such in vivo imaging often operates under low-illumination conditions and short exposure times; consequently, the random fluctuations of the measured fluorophore radiation are well described by a Poisson process degraded (or not) by AWGN. We validate this statistical measurement model experimentally and assess the performance of the PURE-LET algorithms against state-of-the-art denoising methods. Our solution turns out to be very competitive both qualitatively and computationally, allowing for fast and efficient denoising of the huge volumes of data that are now routinely produced in biomedical imaging.
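The practical appeal of the LET parameterization is that minimizing SURE over the expansion weights reduces to a small linear system. A self-contained sketch (assuming the data are coefficients of an orthonormal transform with known noise level, and using plain soft thresholds with illustrative threshold levels rather than the thesis's pointwise functions) is:

    import numpy as np

    def soft(y, t):
        """Soft threshold: one of the elementary denoisers in the expansion."""
        return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

    def sure_let(y, sigma, levels=(1.0, 2.0, 3.0)):
        """Denoise transform-domain coefficients y corrupted by AWGN of known
        standard deviation sigma. The estimate is a linear expansion of
        thresholds, f(y) = sum_k a_k F_k(y), whose weights a_k minimise
        Stein's unbiased risk estimate, a quadratic function of the weights."""
        basis = [y] + [soft(y, t * sigma) for t in levels]
        # Divergence of each F_k (identity: N; soft threshold: kept coefficients).
        div = np.array([y.size] + [np.count_nonzero(np.abs(y) > t * sigma)
                                   for t in levels], dtype=float)
        F = np.stack([f.ravel() for f in basis], axis=1)
        # Minimise ||F a - y||^2 + 2 sigma^2 div.a (SURE up to a constant): linear system.
        a = np.linalg.solve(F.T @ F, F.T @ y.ravel() - sigma ** 2 * div)
        return (F @ a).reshape(y.shape)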
Adorym: A multi-platform generic x-ray image reconstruction framework based on automatic differentiation
We describe and demonstrate an optimization-based x-ray image reconstruction framework called Adorym. Our framework provides a generic forward model, allowing one code framework to be used for a wide range of imaging methods ranging from near-field holography to fly-scan ptychographic tomography. By using automatic differentiation for optimization, Adorym has the flexibility to refine experimental parameters including probe positions, multiple hologram alignment, and object tilts. It is written with strong support for parallel processing, allowing large datasets to be processed on high-performance computing systems. We demonstrate its use on several experimental datasets to show improved image quality through parameter refinement.
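The mechanism behind such parameter refinement is that the entire forward model is differentiable, so experimental parameters can be optimized alongside the image itself. A toy sketch of that idea (illustrative only, using PyTorch with a simple blur-and-shift forward model; it is not Adorym's API, and it assumes the misalignment is small enough for gradient descent to find) is:

    import torch

    torch.manual_seed(0)
    n = 64
    yy, xx = torch.meshgrid(torch.arange(n, dtype=torch.float32),
                            torch.arange(n, dtype=torch.float32), indexing="ij")
    psf = torch.exp(-(((yy - n // 2) ** 2 + (xx - n // 2) ** 2) / 20.0))
    otf = torch.fft.fft2(torch.fft.ifftshift(psf / psf.sum()))
    fy = torch.fft.fftfreq(n).reshape(-1, 1)
    fx = torch.fft.fftfreq(n).reshape(1, -1)

    def forward(obj, shift):
        """Blur the object and translate it by a differentiable sub-pixel shift."""
        phase = -2 * torch.pi * (fy * shift[0] + fx * shift[1])
        ramp = torch.complex(torch.cos(phase), torch.sin(phase))
        return torch.fft.ifft2(otf * ramp * torch.fft.fft2(obj)).real

    # Simulated data: two blurred views of the object, the second misaligned.
    true_obj = torch.rand(n, n)
    true_shift = torch.tensor([2.0, -1.5])
    data = torch.stack([forward(true_obj, torch.zeros(2)),
                        forward(true_obj, true_shift)])

    obj = torch.zeros(n, n, requires_grad=True)      # unknown image
    shift = torch.zeros(2, requires_grad=True)       # unknown alignment parameter
    optimizer = torch.optim.Adam([obj, shift], lr=0.05)

    for step in range(800):
        optimizer.zero_grad()
        model = torch.stack([forward(obj, torch.zeros(2)), forward(obj, shift)])
        loss = torch.mean((model - data) ** 2)
        loss.backward()                              # all gradients via automatic differentiation
        optimizer.step()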
Mathematical Challenges in Electron Microscopy
Development of the electron microscope began nearly 100 years ago, and electron microscopy is now a mature imaging modality with many applications and vast potential for the future. The principal feature of electron microscopes is their resolution: they can be up to 1000 times more powerful than a visible-light microscope and resolve even the smallest atoms. Furthermore, electron microscopes are also sensitive to many material properties due to the very rich interactions between electrons and other matter. Because of these capabilities, electron microscopy is used in applications as diverse as drug discovery, computer chip manufacture, and the development of solar cells.
In parallel, the mathematical field of inverse problems has also evolved dramatically. Many new methods have been introduced to improve the recovery of unknown structures from indirect data, typically an ill-posed problem. In particular, sparsity-promoting functionals such as the total variation and its extensions have been shown to be very powerful for recovering accurate physical quantities from very little and/or poor-quality data. While sparsity-promoting reconstruction methods are powerful, they can also be slow, especially in a big-data setting. This trade-off fuels an ongoing cycle in which new numerical tools are found and ever more powerful models are developed.
The work presented in this thesis aims to marry the tools of inverse problems with the problems of electron microscopy: bringing state-of-the-art image-processing techniques to bear on challenges specific to electron microscopy, developing new optimisation methods for these problems, and modelling new inverse problems to extend the capabilities of existing microscopes. One focus is the application of a directional total variation to overcome the limited-angle problem in electron tomography; another is the proposal of a new inverse problem for the reconstruction of 3D strain tensor fields from electron microscopy diffraction data. The remaining contributions target numerical aspects of inverse problems, from new algorithms for non-convex problems to convex optimisation with adaptive meshes.

Cantab Capital Institute for Mathematics of Information
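As an example of the sparsity-promoting reconstruction described in this abstract (a generic sketch, not the thesis's directional total variation or its electron tomography operator: the forward operator here is simple subsampling, and the smoothing, weight, and step size are illustrative choices), a smoothed total-variation penalty can recover a piecewise-constant image from few, noisy measurements by plain gradient descent:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    x_true = np.zeros((n, n)); x_true[16:48, 16:48] = 1.0          # piecewise-constant phantom
    mask = (rng.random((n, n)) < 0.3).astype(float)                # keep 30% of the pixels
    b = mask * x_true + 0.05 * mask * rng.standard_normal((n, n))  # few, noisy measurements

    def grad_tv(x, eps):
        """Gradient of the smoothed isotropic total variation sum sqrt(|grad x|^2 + eps^2)."""
        dx = np.roll(x, -1, axis=1) - x
        dy = np.roll(x, -1, axis=0) - x
        mag = np.sqrt(dx ** 2 + dy ** 2 + eps ** 2)
        px, py = dx / mag, dy / mag
        return -(px - np.roll(px, 1, axis=1)) - (py - np.roll(py, 1, axis=0))

    x = np.zeros((n, n))
    lam, eps, step = 0.1, 0.1, 0.1
    for _ in range(1000):
        grad = mask * (mask * x - b) + lam * grad_tv(x, eps)       # data fidelity + TV penalty
        x -= step * grad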