Robust phase retrieval with the swept approximate message passing (prSAMP) algorithm
In phase retrieval, the goal is to recover a complex signal from the
magnitude of its linear measurements. While many well-known algorithms
guarantee deterministic recovery of the unknown signal using i.i.d. random
measurement matrices, they suffer serious convergence issues with some
ill-conditioned matrices. This happens, for example, in optical imagers that use
binary intensity-only spatial light modulators to shape the input wavefront.
The problem of ill-conditioned measurement matrices has also been a topic of
interest for compressed sensing researchers during the past decade. In this
paper, using recent advances in generic compressed sensing, we propose a new
phase retrieval algorithm that adapts well to both Gaussian i.i.d. and binary
matrices, for both sparse and dense input signals. This algorithm is also
robust to the strong noise levels found in some imaging applications.
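The prSAMP algorithm itself is not spelled out in the abstract, but the magnitude-only measurement model it addresses, together with the classic error-reduction (Gerchberg-Saxton-style) baseline that such methods improve upon, can be sketched as follows. The dimensions, the complex Gaussian matrix, and the iteration count are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 32, 256                                  # signal length, number of measurements
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
x_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = np.abs(A @ x_true)                          # magnitude-only measurements

A_pinv = np.linalg.pinv(A)

def error_reduction(x0, iters=200):
    """Alternate between the measured magnitudes and the range of A."""
    x = x0
    for _ in range(iters):
        z = A @ x
        phase = z / np.maximum(np.abs(z), 1e-12)  # current phase estimate
        x = A_pinv @ (y * phase)                  # least-squares step keeps the phases
    return x

x0 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
res0 = np.linalg.norm(np.abs(A @ x0) - y) / np.linalg.norm(y)
x_hat = error_reduction(x0)
res = np.linalg.norm(np.abs(A @ x_hat) - y) / np.linalg.norm(y)
print(res0, res)
```

Each iteration is a projection pair, so the residual is non-increasing; message-passing methods such as prSAMP replace these hard projections with probabilistic updates that tolerate ill-conditioned (e.g. binary) matrices.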
Four-dimensional tomographic reconstruction by time domain decomposition
Since the beginnings of tomography, the requirement that the sample must not
change during the acquisition of one tomographic rotation has remained in place. We
derived and successfully implemented a tomographic reconstruction method which
relaxes this decades-old requirement of static samples. In the presented
method, dynamic tomographic data sets are decomposed in the temporal domain
using basis functions, deploying an L1 regularization technique in which the
penalty is applied to the spatial and temporal derivatives. We implemented
the iterative algorithm for solving the regularization problem on modern GPU
systems to demonstrate its practical use.
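A minimal sketch of this kind of temporal decomposition with L1 regularization, for a single voxel's time series: the cosine basis, the sparsity pattern, and the ISTA solver below are illustrative assumptions (the paper's method also penalizes spatial derivatives and runs on GPUs):

```python
import numpy as np

rng = np.random.default_rng(1)
T, K = 200, 20                                   # time samples, number of basis functions
t = np.linspace(0, 1, T)
B = np.cos(np.pi * np.outer(t, np.arange(K)))    # cosine temporal basis (assumption)
B /= np.linalg.norm(B, axis=0)                   # normalize basis functions
c_true = np.zeros(K)
c_true[1], c_true[4] = 2.0, -1.0                 # sparse temporal coefficients
d = B @ c_true + 0.05 * rng.standard_normal(T)   # noisy time series of one voxel

lam = 0.2                                        # L1 penalty weight
L = np.linalg.norm(B, 2) ** 2                    # Lipschitz constant of the gradient
c = np.zeros(K)
for _ in range(500):                             # ISTA: gradient step + soft threshold
    z = c - B.T @ (B @ c - d) / L
    c = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

support = np.flatnonzero(np.abs(c) > 0.1)
print(support, c[support])
```

The soft-thresholding step is what makes the decomposition sparse in time: coefficients whose correlation with the residual stays below the penalty are driven exactly to zero.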
Compression and Conditional Emulation of Climate Model Output
Numerical climate model simulations run at high spatial and temporal
resolutions generate massive quantities of data. As our computing capabilities
continue to increase, storing all of the data is not sustainable, and thus it
is important to develop methods for representing the full datasets by smaller
compressed versions. We propose a statistical compression and decompression
algorithm based on storing a set of summary statistics as well as a statistical
model describing the conditional distribution of the full dataset given the
summary statistics. The statistical model can be used to generate realizations
representing the full dataset, along with characterizations of the
uncertainties in the generated data. Thus, the methods are capable of both
compression and conditional emulation of the climate models. Considerable
attention is paid to accurately modeling the original dataset--one year of
daily mean temperature data--particularly with regard to the inherent spatial
nonstationarity in global fields, and to determining the statistics to be
stored, so that the variation in the original data can be closely captured,
while allowing for fast decompression and conditional emulation on modest
computers.
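As a toy illustration of the compress-then-emulate idea (not the paper's statistical model, which captures spatial nonstationarity and much richer structure), one can store only per-site summary statistics and draw realizations from a conditional model given those summaries:

```python
import numpy as np

rng = np.random.default_rng(2)
days, sites = 365, 100
season = 10 + 5 * np.sin(2 * np.pi * np.arange(days) / 365)[:, None]
data = season + rng.standard_normal((days, sites))   # stand-in "model output"

# "Compression": keep only small per-site summary statistics.
mu = data.mean(axis=0)
sd = data.std(axis=0)

def emulate(n_days):
    # Conditional emulation: sample from the statistical model given the
    # stored summaries (here a crude independent-Gaussian assumption).
    return mu + sd * rng.standard_normal((n_days, sites))

surrogate = emulate(days)
print(surrogate.shape)
```

The surrogate is not the original data, but an ensemble of such draws characterizes the uncertainty left after discarding everything except the summaries, which is the emulation property the abstract describes.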
Enhancing the performance of Decoupled Software Pipeline through Backward Slicing
The rapidly increasing number of cores available in multicore processors does
not necessarily lead directly to a commensurate increase in performance:
programs written in conventional languages, such as C, need careful
restructuring, preferably automatically, before the benefits can be observed in
improved run-times. Even then, much depends upon the intrinsic capacity of the
original program for concurrent execution. The subject of this paper is the
performance gains from the combined effect of the complementary techniques of
the Decoupled Software Pipeline (DSWP) and (backward) slicing. DSWP extracts
thread-level parallelism from the body of a loop by breaking it into stages
which are then executed pipeline-style: in effect cutting across the control
chain. Slicing, on the other hand, cuts the program along the control chain,
teasing out finer threads that depend on different variables (or locations).
The main contribution of this paper
is to demonstrate that the application of DSWP followed by slicing offers
notable improvements over DSWP alone, especially when there is a loop-carried
dependence that prevents the application of the simpler DOALL optimization.
Experimental results show an improvement of a factor of ~1.6 for DSWP + slicing
over DSWP alone and a factor of ~2.4 for DSWP + slicing over the original
sequential code.
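The DSWP idea of splitting a loop with a loop-carried dependence into pipeline stages connected by queues can be illustrated in a few lines; the stage bodies and the use of Python threads are illustrative stand-ins for the compiler-generated threads the paper evaluates:

```python
import threading
import queue

def stage1(out_q, n):
    # Stage 1 keeps the loop-carried dependence (a running state),
    # which is what rules out the simpler DOALL transformation.
    state = 0
    for i in range(n):
        state = (state * 31 + i) % 1000
        out_q.put(state)
    out_q.put(None)                       # sentinel: end of stream

def stage2(in_q, results):
    # Stage 2 has no loop-carried dependence; it is the part that
    # slicing could further split into finer independent threads.
    while (v := in_q.get()) is not None:
        results.append(v * v)

q = queue.Queue(maxsize=64)               # inter-stage channel
results = []
t1 = threading.Thread(target=stage1, args=(q, 10))
t2 = threading.Thread(target=stage2, args=(q, results))
t1.start(); t2.start(); t1.join(); t2.join()
print(results[:3])                        # -> [0, 1, 1089]
```

While stage 1 computes iteration i+1 of the dependent chain, stage 2 is still squaring the value from iteration i, which is exactly the pipeline overlap DSWP exploits.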
Non-convex image reconstruction via Expectation Propagation
Tomographic image reconstruction can be mapped to a problem of finding
solutions to a large system of linear equations which maximize a function that
includes \textit{a priori} knowledge regarding features of typical images such
as smoothness or sharpness. This maximization can be performed with standard
local optimization tools when the function is concave, but it is generally
intractable for realistic priors, which are non-concave. We introduce a new
method to reconstruct images obtained from Radon projections by using
Expectation Propagation, which allows us to reframe the problem from a
Bayesian inference perspective. We show, by means of extensive simulations,
that, compared to state-of-the-art algorithms for this task, Expectation
Propagation, paired with very simple but non-log-concave priors, is often able
to reconstruct images with smaller error while using less
information per pixel. We provide estimates for the critical rate of
information per pixel above which recovery is error-free by means of
simulations on ensembles of phantom and real images. Comment: 12 pages, 6 figures
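The core EP operation, projecting an intractable "tilted" distribution onto a Gaussian by matching its first two moments, can be sketched in one dimension. The measurement value, noise level, and the non-log-concave prior exp(-sqrt(|x|)) below are illustrative assumptions standing in for the paper's image priors:

```python
import numpy as np

# One noisy linear measurement of a scalar x: y = x + noise, noise var s2.
y, s2 = 1.5, 0.5
x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]

# Tilted distribution: Gaussian likelihood times a non-log-concave
# sparsity prior p(x) ~ exp(-sqrt(|x|)) (assumption for illustration).
tilted = np.exp(-0.5 * (y - x) ** 2 / s2 - np.sqrt(np.abs(x)))
tilted /= tilted.sum() * dx               # normalize on the grid

m = (x * tilted).sum() * dx               # matched mean
v = ((x - m) ** 2 * tilted).sum() * dx    # matched variance
print(m, v)
```

EP repeats this moment-matching step factor by factor over the full system of Radon-projection equations, which is what lets it handle priors that defeat local concave-maximization tools.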