Controlled wavelet domain sparsity for x-ray tomography
Tomographic reconstruction is an ill-posed inverse problem that calls for regularization. One possibility is to require sparsity of the unknown in an orthonormal wavelet basis, which can be achieved by variational regularization with a penalty term equal to the sum of the absolute values of the wavelet coefficients. The minimizer of the variational regularization functional can be computed iteratively, for example with a primal-dual fixed point algorithm, using a soft-thresholding operation. Choosing the soft-thresholding parameter mu > 0 is analogous to the notoriously difficult problem of picking the optimal regularization parameter in Tikhonov regularization. Here, a novel automatic method for choosing mu is introduced, based on a control algorithm that drives the sparsity of the reconstruction to an a priori known ratio of nonzero versus zero wavelet coefficients in the unknown.
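The sparsity-control idea can be sketched as a simple feedback loop. This is a hypothetical controller written for illustration, not the paper's exact algorithm: threshold the coefficients, measure the resulting nonzero ratio, and nudge mu multiplicatively until the ratio matches the a priori target.

```python
import numpy as np

def soft_threshold(w, mu):
    # Soft-thresholding: shrink each wavelet coefficient toward zero by mu.
    return np.sign(w) * np.maximum(np.abs(w) - mu, 0.0)

def tune_mu(w, target_ratio, mu=1.0, gain=0.5, iters=50):
    # Illustrative multiplicative feedback control: if too many
    # coefficients survive thresholding, raise mu; if too few, lower it.
    for _ in range(iters):
        ratio = np.mean(soft_threshold(w, mu) != 0.0)
        mu *= np.exp(gain * (ratio - target_ratio))
    return mu
```

Because the nonzero ratio decreases monotonically in mu, this fixed-point iteration settles where the reconstruction's sparsity equals the prescribed ratio.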
Spatiotemporal Besov Priors for Bayesian Inverse Problems
Fast development in science and technology has driven the need for statistical tools that capture special data features such as abrupt changes or sharp contrast. Many applications in data science seek spatiotemporal reconstruction from a sequence of time-dependent objects with discontinuities or singularities, e.g. dynamic computerized tomography (CT) images with edges. Traditional methods based on Gaussian processes (GP) may not provide satisfactory solutions, since they tend to offer over-smooth prior candidates. Recently, the Besov process (BP), defined by wavelet expansions with random coefficients, has been proposed as a more appropriate prior for this type of Bayesian inverse problem. While BP outperforms GP in imaging analysis by producing edge-preserving reconstructions, it does not automatically incorporate the temporal correlation inherent in dynamically changing images. In this paper, we generalize BP to the spatiotemporal domain (STBP) by replacing the random coefficients in the series expansion with stochastic time functions following the Q-exponential process, which governs the temporal correlation strength. Mathematical and statistical properties of STBP are carefully studied. A white-noise representation of STBP is also proposed to facilitate point estimation through maximum a posteriori (MAP) estimation and uncertainty quantification (UQ) by posterior sampling. Two limited-angle CT reconstruction examples and a highly non-linear inverse problem involving the Navier-Stokes equation demonstrate the advantage of the proposed STBP in preserving spatial features while accounting for temporal changes, compared with the classic STGP and a time-uncorrelated approach.
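A Besov prior of this kind can be sketched in one spatial dimension: draw wavelet detail coefficients level by level with a deterministic decay and heavy-tailed randomness (Laplace-distributed for p = 1, which gives the edge-preserving behaviour). This is a minimal Haar-wavelet illustration under assumed smoothness s and integrability p, not the paper's spatiotemporal construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def besov_prior_sample(J=8, s=1.0, p=1.0):
    # One draw from a 1D Besov-type prior: u = sum_{j,k} 2^{-j(s+1/2-1/p)} xi_{jk} psi_{jk}
    # with Laplace xi (the p = 1 case) and discrete Haar wavelets psi_{jk}.
    n = 2 ** J
    u = np.zeros(n)
    for j in range(J):
        decay = 2.0 ** (-j * (s + 0.5 - 1.0 / p))   # deterministic level decay
        xi = rng.laplace(size=2 ** j)               # heavy-tailed coefficients
        block = n // 2 ** j                         # support length of psi_{jk}
        half = block // 2
        amp = 2.0 ** (j / 2) / np.sqrt(n)           # unit Euclidean norm per wavelet
        for k in range(2 ** j):
            start = k * block
            u[start:start + half] += decay * xi[k] * amp
            u[start + half:start + block] -= decay * xi[k] * amp
    return u
```

Sampled paths are piecewise rough with occasional large jumps, in contrast to the uniformly smooth draws of a Gaussian process prior.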
Bayesian inversion in biomedical imaging
Biomedical imaging techniques have become a key technology for assessing the structure or function of living organisms in a non-invasive way. Besides innovations in instrumentation, the development of new and improved methods for processing and analysing the measured data has become a vital field of research. Building on traditional signal processing, this area nowadays also comprises mathematical modeling, numerical simulation and inverse problems. The latter describes the reconstruction of quantities of interest from measured data and a given generative model. Unfortunately, most inverse problems are ill-posed, which means that a robust and reliable reconstruction is not possible unless additional a-priori information on the quantity of interest is incorporated into the solution method. Bayesian inversion is a mathematical methodology to formulate and employ such a-priori information in computational schemes for solving the inverse problem.
This thesis presents an up-to-date overview of Bayesian inversion and exemplifies the presented concepts and algorithms in various numerical studies, including challenging biomedical imaging applications with experimental data. A particular focus is on using sparsity as a-priori information within the Bayesian framework.
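In the simplest linear-Gaussian setting, the role of a-priori information is explicit in closed form: the prior covariance enters the posterior directly. A minimal sketch (with a Gaussian rather than a sparsity prior, purely for illustration):

```python
import numpy as np

def gaussian_posterior(A, y, noise_var, prior_cov):
    # Linear-Gaussian Bayesian inversion: data y = A u + noise,
    # noise ~ N(0, noise_var * I), prior u ~ N(0, prior_cov).
    # The posterior is Gaussian with closed-form mean and covariance.
    precision = A.T @ A / noise_var + np.linalg.inv(prior_cov)
    cov = np.linalg.inv(precision)
    mean = cov @ (A.T @ y / noise_var)
    return mean, cov
```

The posterior mean balances data fidelity against the prior; a sparsity (e.g. Laplace) prior replaces the closed form with an optimisation or sampling problem, which is the setting this thesis studies.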
Advances in Trans-dimensional Geophysical Inference
This research presents a series of novel Bayesian trans-dimensional methods for geophysical inversion. A first example illustrates how Bayesian prior information obtained from theory and numerical experiments can be used to better inform a difficult multi-modal inversion of dispersion information from empirical Green's functions obtained from ambient-noise cross-correlation. This approach is an extension of existing partition-modelling schemes.
An entirely new class of trans-dimensional algorithm, called the trans-dimensional tree method, is introduced. This new method is shown to be more efficient at coupling to a forward model, more efficient at convergence, and more adaptable to different dimensions and geometries than existing approaches. The efficiency and flexibility of the trans-dimensional tree method are demonstrated in two different examples: (1) airborne electromagnetic (AEM) tomography in a 2D transect inversion, and (2) a fully non-linear inversion of ambient-noise tomography. In the latter example the resolution at depth has been significantly improved by inverting a contiguous band of frequencies jointly rather than as independent phase-velocity maps, allowing new insights into the crustal architecture beneath Iceland.
In a first test case for even larger-scale problems, an application of the trans-dimensional tree approach to a large global data set is presented. A global database of nearly 5 million multi-mode path-average Rayleigh wave phase-velocity observations has been used to construct global phase-velocity maps. Results are comparable to existing published phase-velocity maps; however, as the trans-dimensional approach adapts the resolution to the data, rather than imposing damping or smoothing constraints to stabilize the inversion, the recovered anomaly magnitudes are generally higher, with low uncertainties. While further investigation is needed, this early test case shows that trans-dimensional sampling can be applied to global-scale seismology problems and that previous analyses may, in some locales, underestimate the heterogeneity of the Earth.
Finally, in a further advancement of partition modelling with variable-order polynomials, a new method called trans-dimensional spectral elements has been developed. Previous applications involving variable-order polynomials have used polynomials that are both difficult to work with in a Bayesian framework and unstable at higher orders. By using the orthogonal polynomials typically employed in modern full-waveform solvers, the useful properties of this type of polynomial and its application in trans-dimensional inversion are demonstrated. Additionally, these polynomials can be used directly in complex differential solvers, and an example of this for 1D inversion of surface-wave dispersion curves is given.
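The core move of a trans-dimensional partition sampler can be sketched in one dimension: propose a "birth" (split a cell) or "death" (merge two cells) and accept or reject it Metropolis-style. This is an illustrative simplification, not the thesis's tree method; a real reversible-jump sampler also includes prior and proposal (Jacobian) terms in the acceptance ratio, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

def loglike(xs, ys, knots, vals, sigma=0.1):
    # Piecewise-constant predictor on the partition defined by knots:
    # each data point xs takes the value of the cell that contains it.
    idx = np.searchsorted(knots[1:-1], xs, side="right")
    return -0.5 * np.sum((ys - vals[idx]) ** 2) / sigma ** 2

def birth_death_step(xs, ys, knots, vals, tau=0.5):
    # One simplified trans-dimensional Metropolis step: birth inserts a
    # new cell boundary with a perturbed value; death removes one.
    if rng.random() < 0.5 or len(vals) == 1:                  # birth
        t = rng.uniform(knots[0], knots[-1])
        i = np.searchsorted(knots, t)                         # cell t falls in
        new_knots = np.insert(knots, i, t)
        new_vals = np.insert(vals, i, vals[i - 1] + tau * rng.standard_normal())
    else:                                                     # death
        i = rng.integers(1, len(knots) - 1)                   # interior knot
        new_knots = np.delete(knots, i)
        new_vals = np.delete(vals, i)                         # merge two cells
    dl = loglike(xs, ys, new_knots, new_vals) - loglike(xs, ys, knots, vals)
    if np.log(rng.random()) < dl:                             # accept/reject
        return new_knots, new_vals
    return knots, vals
```

Run on data with a step discontinuity, the sampler tends to place boundaries near the step, so the model complexity adapts to the data rather than being fixed in advance.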
Bayesian Variational Regularisation for Dark Matter Reconstruction with Uncertainty Quantification
Despite the great wealth of cosmological knowledge accumulated since the early 20th century, the nature of dark matter, which accounts for ~85% of the matter content of the Universe, remains elusive. Unfortunately, though dark matter is scientifically interesting, with implications for our fundamental understanding of the Universe, it cannot be directly observed. Instead, dark matter may be inferred from e.g. the optical distortion (lensing) of distant galaxies, which, at linear order, manifests as a perturbation to the apparent magnitude (convergence) and ellipticity (shearing). Ensemble observations of the shear are collected and leveraged to construct estimates of the convergence, which can be directly related to the universal dark-matter distribution. Imminent stage IV surveys are forecast to accrue an unprecedented quantity of cosmological information; a discriminative partition of this information is accessible through the convergence, and is disproportionately concentrated at high angular resolutions, where the echoes of cosmological evolution under gravity are most apparent. Capitalising on advances in probability concentration theory, this thesis merges the paradigms of Bayesian inference and optimisation to develop hybrid convergence-inference techniques which are scalable, statistically principled, and operate over the Euclidean plane, the celestial sphere, and the 3-dimensional ball. Such techniques can quantify the plausibility of inferences at one-millionth of the computational overhead of competing sampling methods. These Bayesian techniques are applied to the hotly debated Abell 520 merging cluster, concluding that observational catalogues contain insufficient information to determine the existence of dark-matter self-interactions. Further, these techniques were applied to all public lensing catalogues, recovering the then-largest global dark-matter mass map.
The primary methodological contributions of this thesis depend only on posterior log-concavity, paving the way towards a potentially revolutionary complete hybridisation with artificial-intelligence techniques. These next-generation techniques are the first to operate over the full 3-dimensional ball, laying the foundations for statistically principled universal dark-matter cartography and the cosmological insights such advances may provide.
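The planar (flat-sky) version of the shear-to-convergence step can be sketched with the standard Kaiser-Squires estimator. Sign and Fourier conventions vary between references, so treat this as an illustrative sketch rather than the thesis's spherical or 3D machinery.

```python
import numpy as np

def kaiser_squires(gamma, pixel=1.0):
    # Flat-sky Kaiser-Squires: recover convergence kappa from the complex
    # shear field gamma = gamma1 + i*gamma2 via the Fourier-space relation
    # gamma_hat = D * kappa_hat, where |D| = 1 for every nonzero mode.
    n1, n2 = gamma.shape
    k1 = np.fft.fftfreq(n1, d=pixel)[:, None]
    k2 = np.fft.fftfreq(n2, d=pixel)[None, :]
    ksq = k1 ** 2 + k2 ** 2
    ksq[0, 0] = 1.0                        # k = 0 (mean) mode is unconstrained
    D = ((k1 ** 2 - k2 ** 2) + 2j * k1 * k2) / ksq
    kappa = np.fft.ifft2(np.conj(D) * np.fft.fft2(gamma))
    return kappa.real                      # E-mode (physical) convergence
```

Because |D| = 1, the inversion is a pure phase rotation in Fourier space; the mean convergence is lost (the mass-sheet degeneracy), and on real masked catalogues this direct estimator is exactly what the thesis's Bayesian variational approach improves upon.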
A Survey on Deep Learning in Medical Image Analysis
Deep learning algorithms, in particular convolutional networks, have rapidly become a methodology of choice for analyzing medical images. This paper reviews the major deep learning concepts pertinent to medical image analysis and summarizes over 300 contributions to the field, most of which appeared in the last year. We survey the use of deep learning for image classification, object detection, segmentation, registration, and other tasks, and provide concise overviews of studies per application area. Open challenges and directions for future research are discussed.
Deep Learning in Cardiology
The medical field is creating large amounts of data that physicians are unable to decipher and use efficiently. Moreover, rule-based expert systems are inefficient at solving complicated medical tasks or at creating insights from big data. Deep learning has emerged as a more accurate and effective technology for a wide range of medical problems such as diagnosis, prediction and intervention. Deep learning is a representation learning method that consists of layers that transform the data non-linearly, thus revealing hierarchical relationships and structures. In this review we survey deep learning application papers that use structured data, signal and imaging modalities from cardiology. We discuss the advantages and limitations of applying deep learning in cardiology that also apply to medicine in general, while proposing certain directions as the most viable for clinical use.