A Novel Convex Relaxation for Non-Binary Discrete Tomography
We present a novel convex relaxation and a corresponding inference algorithm
for the non-binary discrete tomography problem, that is, reconstructing
discrete-valued images from few linear measurements. In contrast to
state-of-the-art approaches that split the problem into a continuous reconstruction
problem for the linear measurement constraints and a discrete labeling problem
to enforce discrete-valued reconstructions, we propose a joint formulation that
addresses both problems simultaneously, resulting in a tighter convex
relaxation. For this purpose a constrained graphical model is set up and
evaluated using a novel relaxation optimized by dual decomposition. We
evaluate our approach experimentally and demonstrate its superiority both
mathematically (a tighter relaxation) and empirically in comparison to
previously proposed relaxations.
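The dual-decomposition idea underlying the approach can be illustrated on a toy problem: the continuous measurement-fitting subproblem and the discrete labeling subproblem are coupled through a dual variable and solved alternately. This is only a generic sketch with synthetic data, not the paper's tighter joint relaxation.

```python
import numpy as np

# Hypothetical toy: recover a discrete-valued signal x in {0,1,2}^n from
# linear measurements b = A @ x. Dual decomposition splits the task into
# (i) a continuous least-squares subproblem and (ii) a discrete labeling
# subproblem, coupled by a dual variable lam.
rng = np.random.default_rng(0)
n = 6
labels = np.array([0.0, 1.0, 2.0])
x_true = rng.choice(labels, size=n)
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))
b = A @ x_true

lam = np.zeros(n)            # dual variable enforcing x == z
step = 0.5
for _ in range(200):
    # continuous subproblem: min_x 0.5*||A x - b||^2 + lam^T x
    x = np.linalg.lstsq(A.T @ A, A.T @ b - lam, rcond=None)[0]
    # discrete subproblem: min_z -lam^T z with z_i restricted to labels
    z = labels[np.argmin(-np.outer(lam, labels), axis=1)]
    lam += step * (x - z)    # subgradient ascent on the dual

print(z)
```

In practice plain subgradient ascent oscillates on such nonconvex couplings; the paper's contribution is precisely a tighter joint relaxation that avoids this naive splitting.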
An Analysis of Finite Element Approximation in Electrical Impedance Tomography
We present a finite element analysis of electrical impedance tomography for
reconstructing the conductivity distribution from electrode voltage
measurements by means of Tikhonov regularization. Two popular choices of the
penalty term, i.e., the $H^1$-norm smoothness penalty and the total variation
seminorm penalty, are considered. A piecewise linear finite element method is
employed for discretizing the forward model, i.e., the complete electrode
model, the conductivity, and the penalty functional. The convergence of the
finite element approximations for the Tikhonov model on both polyhedral and
smooth curved domains is established. This provides rigorous justifications for
the ad hoc discretization procedures in the literature.
Comment: 20 pages
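The Tikhonov formulation analyzed in the paper can be sketched on a generic linearized inverse problem: minimize a data-fit term plus a smoothness penalty. The forward map and data below are synthetic placeholders standing in for the complete electrode model, not an actual EIT discretization.

```python
import numpy as np

# Minimal Tikhonov sketch: min_s ||J s - v||^2 + alpha * ||L s||^2,
# where L is a first-difference operator (a discrete smoothness penalty).
rng = np.random.default_rng(1)
n = 20
J = rng.standard_normal((15, n))                 # underdetermined "forward" map
sigma_true = np.sin(np.linspace(0, np.pi, n))    # smooth target conductivity
v = J @ sigma_true + 0.01 * rng.standard_normal(15)

# first-difference operator: rows look like [-1, 1, 0, ...]
L = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)
alpha = 1e-2

# normal equations of the regularized least-squares problem
sigma = np.linalg.solve(J.T @ J + alpha * L.T @ L, J.T @ v)
print(np.linalg.norm(sigma - sigma_true))
```

The penalty makes the otherwise underdetermined system uniquely solvable; the paper's contribution is proving that finite element discretizations of such formulations converge.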
Post-Reconstruction Deconvolution of PET Images by Total Generalized Variation Regularization
Improving the quality of positron emission tomography (PET) images, affected
by low resolution and high level of noise, is a challenging task in nuclear
medicine and radiotherapy. This work proposes a restoration method, achieved
after tomographic reconstruction of the images and targeting clinical
situations where raw data are often not accessible. Based on inverse problem
methods, our contribution introduces the recently developed total generalized
variation (TGV) norm to regularize PET image deconvolution. Moreover, we
stabilize this procedure with additional image constraints such as positivity
and photometry invariance. A criterion for updating and adjusting automatically
the regularization parameter in case of Poisson noise is also presented.
Experiments are conducted on both synthetic data and real patient images.
Comment: First published in the Proceedings of the 23rd European Signal
Processing Conference (EUSIPCO-2015) in 2015, published by EURASIP
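The paper regularizes the deconvolution with TGV; as a much simpler stand-in, the sketch below runs Richardson-Lucy deconvolution, the classic Poisson maximum-likelihood scheme. Its multiplicative updates automatically keep the image positive and, for a normalized PSF, approximately preserve total photometry, the two constraints the abstract mentions. The 1-D data are synthetic, not PET images.

```python
import numpy as np

# Richardson-Lucy deconvolution under Poisson noise (1-D toy example).
rng = np.random.default_rng(2)
n = 64
x_true = np.zeros(n)
x_true[20:25] = 10.0
x_true[40] = 30.0
psf = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
psf /= psf.sum()                                  # normalized (flux-preserving) PSF

blur = lambda u: np.convolve(u, psf, mode="same")
y = rng.poisson(blur(x_true)).astype(float)       # Poisson-noisy blurred data

x = np.full(n, y.mean())                          # positive initial guess
for _ in range(100):
    ratio = y / np.maximum(blur(x), 1e-12)
    # multiplicative update: x <- x * H^T(y / Hx); H^T is correlation
    x *= np.convolve(ratio, psf[::-1], mode="same")

print(x.sum(), y.sum())                           # total flux is preserved
```

Replacing the implicit maximum-likelihood prior with a TGV penalty, as the paper does, suppresses the noise amplification this plain scheme suffers at high iteration counts.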
A parametric level-set method for partially discrete tomography
This paper introduces a parametric level-set method for tomographic
reconstruction of partially discrete images. Such images consist of a
continuously varying background and an anomaly with a constant (known)
grey-value. We represent the geometry of the anomaly using a level-set
function, which we represent using radial basis functions. We pose the
reconstruction problem as a bi-level optimization problem in terms of the
background and coefficients for the level-set function. To constrain the
background reconstruction we impose smoothness through Tikhonov regularization.
The bi-level optimization problem is solved in an alternating fashion; in each
iteration we first reconstruct the background and subsequently update the
level-set function. We test our method on numerical phantoms and show that we
can successfully reconstruct the geometry of the anomaly, even from limited
data. On these phantoms, our method outperforms Total Variation reconstruction,
DART and P-DART.
Comment: Paper submitted to the 20th International Conference on Discrete
Geometry for Computer Imagery
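The parametric level-set representation described above can be sketched directly: the level-set function is a weighted sum of radial basis functions, and the image equals a smooth background except where the level set is positive, where it takes the known constant grey-value. Centers, widths, and the background below are illustrative choices, not the paper's phantoms.

```python
import numpy as np

# Image model: smooth background + constant-valued anomaly where phi > 0,
# with phi parameterized by Gaussian radial basis functions.
n = 50
xx, yy = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))

centers = np.array([[0.4, 0.5], [0.6, 0.5]])     # RBF centers (assumed)
weights = np.array([1.0, 0.8])                   # RBF coefficients (assumed)
width = 0.15

def phi(weights):
    """Level-set function as a weighted sum of Gaussian RBFs."""
    f = -0.5 * np.ones((n, n))                   # negative offset: background by default
    for (cx, cy), a in zip(centers, weights):
        f += a * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / width**2)
    return f

background = 0.2 + 0.1 * xx                      # continuously varying background
anomaly_value = 1.0                              # known constant grey-value
image = np.where(phi(weights) > 0, anomaly_value, background)
print(image.shape, image.max())
```

In the paper's bi-level scheme, the RBF coefficients and the background are the unknowns fitted alternately to the tomographic data; here they are fixed to show the representation only.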
Quantum state tomography with non-instantaneous measurements, imperfections and decoherence
Tomography of a quantum state is usually based on positive operator-valued
measure (POVM) and on their experimental statistics. Among the available
reconstructions, the maximum-likelihood (MaxLike) technique is an efficient
one. We propose an extension of this technique when the measurement process
cannot be simply described by an instantaneous POVM. Instead, the tomography
relies on a set of quantum trajectories and their measurement records. This
model includes the fact that, in practice, each measurement could be corrupted
by imperfections and decoherence, and could also be associated with the record
of continuous-time signals over a finite amount of time. The goal is then to
retrieve the quantum state that was present at the start of this measurement
process. The proposed extension relies on an explicit expression of the
likelihood function via the effective matrices appearing in quantum smoothing
and solutions of the adjoint quantum filter. It allows one to retrieve the initial
quantum state as in standard MaxLike tomography, but where the traditional POVM
operators are replaced by more general ones that depend on the measurement
record of each trajectory. It also provides, besides the MaxLike estimate of the
quantum state, confidence intervals for any observable. Such confidence
intervals are derived, like the MaxLike estimate, from an asymptotic expansion of
multi-dimensional Laplace integrals appearing in Bayesian Mean estimation. A
validation is performed on two sets of experimental data: photon(s) trapped in
a microwave cavity subject to quantum non-demolition measurements relying on
Rydberg atoms; heterodyne fluorescence measurements of a superconducting qubit.
Comment: 11 pages, 4 figures, submitted
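For contrast with the paper's trajectory-based extension, the standard instantaneous-POVM MaxLike reconstruction can be sketched with the classic iterative R-rho-R algorithm. The qubit POVM and simulated counts below are illustrative assumptions, not the experimental data sets discussed in the abstract.

```python
import numpy as np

# Iterative maximum-likelihood state tomography: rho <- R rho R (normalized),
# with R = sum_k (n_k / p_k) E_k for POVM elements E_k and counts n_k.
rng = np.random.default_rng(3)
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# POVM: +/- projectors of sigma_x, sigma_y, sigma_z, each basis weighted 1/3
povm = [(I2 + sgn * s) / 6 for s in (sx, sy, sz) for sgn in (+1, -1)]

rho_true = np.array([[0.85, 0.3], [0.3, 0.15]], dtype=complex)
probs = np.array([np.trace(E @ rho_true).real for E in povm])
counts = rng.multinomial(5000, probs)            # simulated measurement counts

rho = I2 / 2                                     # maximally mixed initial guess
for _ in range(300):
    p = np.array([np.trace(E @ rho).real for E in povm])
    R = sum((c / max(q, 1e-12)) * E for c, q, E in zip(counts, p, povm))
    rho = R @ rho @ R
    rho /= np.trace(rho).real                    # renormalize to unit trace

print(np.trace(rho).real)
```

The paper's generalization keeps this likelihood-maximization structure but replaces the fixed POVM elements with effective operators computed, via quantum smoothing and the adjoint filter, from each trajectory's continuous measurement record.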