(k,q)-Compressed Sensing for dMRI with Joint Spatial-Angular Sparsity Prior
Advanced diffusion magnetic resonance imaging (dMRI) techniques, like
diffusion spectrum imaging (DSI) and high angular resolution diffusion imaging
(HARDI), remain underutilized compared to diffusion tensor imaging because the
scan times needed to produce accurate estimations of fiber orientation are
significantly longer. To accelerate DSI and HARDI, recent methods from
compressed sensing (CS) exploit a sparse underlying representation of the data
in the spatial and angular domains to undersample in the respective k- and
q-spaces. State-of-the-art frameworks, however, impose sparsity in the spatial
and angular domains separately and involve the sum of the corresponding sparse
regularizers. In contrast, we propose a unified (k,q)-CS formulation which
imposes sparsity jointly in the spatial-angular domain to further increase
sparsity of dMRI signals and reduce the required subsampling rate. To
efficiently solve this large-scale global reconstruction problem, we introduce
a novel adaptation of the FISTA algorithm that exploits dictionary
separability. We show on phantom and real HARDI data that our approach achieves
significantly more accurate signal reconstructions than the state of the art
while sampling only 2-4% of the (k,q)-space, allowing for the potential of new
levels of dMRI acceleration.
Comment: To be published in the 2017 Computational Diffusion MRI Workshop of
MICCAI
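The dictionary-separability idea behind the adapted FISTA can be illustrated in a generic setting: when the global dictionary is a Kronecker product, its action on a coefficient matrix reduces to two small matrix products, so the huge operator is never formed. Below is a minimal sketch of FISTA with soft-thresholding for such a separable ℓ1 problem; it is not the authors' (k,q)-space implementation, and the function name and dimensions are illustrative.

```python
import numpy as np

def fista_separable(Y, Phi, Psi, lam=0.1, n_iter=200):
    """FISTA for min_X 0.5*||Phi @ X @ Psi.T - Y||_F^2 + lam*||X||_1.

    The Kronecker-structured dictionary (Psi kron Phi) is never formed:
    its action on vec(X) is computed as Phi @ X @ Psi.T instead."""
    # Lipschitz constant of the smooth part's gradient: product of
    # the two spectral norms, squared.
    L = (np.linalg.norm(Phi, 2) * np.linalg.norm(Psi, 2)) ** 2
    X = np.zeros((Phi.shape[1], Psi.shape[1]))
    Z, t = X.copy(), 1.0
    for _ in range(n_iter):
        G = Phi.T @ (Phi @ Z @ Psi.T - Y) @ Psi           # gradient at Z
        V = Z - G / L                                     # gradient step
        X_new = np.sign(V) * np.maximum(np.abs(V) - lam / L, 0.0)  # shrink
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Z = X_new + ((t - 1.0) / t_new) * (X_new - X)     # momentum step
        X, t = X_new, t_new
    return X
```

Because both factor matrices stay small, each iteration costs two dense matrix products rather than one product with the full Kronecker operator.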
Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)
The implicit objective of the biennial "international Traveling Workshop on
Interactions between Sparse models and Technology" (iTWIST) is to foster
collaboration between international scientific teams by disseminating ideas
through both specific oral/poster presentations and free discussions. For its
second edition, the iTWIST workshop took place in the medieval and picturesque
town of Namur in Belgium, from Wednesday August 27th till Friday August 29th,
2014. The workshop was conveniently located in "The Arsenal" building within
walking distance of both hotels and the town center. iTWIST'14 gathered about
70 international participants and featured 9 invited talks, 10 oral
presentations, and 14 posters on the following themes, all related to the
theory, application and generalization of the "sparsity paradigm":
Sparsity-driven data sensing and processing; Union of low dimensional
subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph
sensing/processing; Blind inverse problems and dictionary learning; Sparsity
and computational neuroscience; Information theory, geometry and randomness;
Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?;
Sparse machine learning and inference.
Comment: 69 pages, 24 extended abstracts, iTWIST'14 website:
http://sites.google.com/site/itwist1
Fast Image Recovery Using Variable Splitting and Constrained Optimization
We propose a new fast algorithm for solving one of the standard formulations
of image restoration and reconstruction which consists of an unconstrained
optimization problem where the objective includes an ℓ2 data-fidelity
term and a non-smooth regularizer. This formulation allows both wavelet-based
(with orthogonal or frame-based representations) and total-variation
regularization. Our approach is based on a variable splitting
to obtain an equivalent constrained optimization formulation, which is then
addressed with an augmented Lagrangian method. The proposed algorithm is an
instance of the so-called "alternating direction method of multipliers", for
which convergence has been proved. Experiments on a set of image restoration
and reconstruction benchmark problems show that the proposed algorithm is
faster than the current state-of-the-art methods.
Comment: Submitted; 11 pages, 7 figures, 6 tables
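The variable-splitting strategy described above can be made concrete with a generic alternating direction method of multipliers (ADMM) for the ℓ2-plus-ℓ1 problem. This is a hedged sketch of the splitting x = z, not the paper's full wavelet/TV machinery; the function name and parameters are illustrative.

```python
import numpy as np

def admm_l1(A, y, lam=0.1, rho=1.0, n_iter=300):
    """ADMM for min_x 0.5*||A @ x - y||_2^2 + lam*||x||_1 via the split x = z.

    The x-update solves (A.T A + rho I) x = A.T y + rho (z - u); the
    Cholesky factor is computed once and reused every iteration."""
    n = A.shape[1]
    AtA, Aty = A.T @ A, A.T @ y
    Lc = np.linalg.cholesky(AtA + rho * np.eye(n))   # Lc @ Lc.T factorization
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                                  # scaled dual variable
    for _ in range(n_iter):
        rhs = Aty + rho * (z - u)
        x = np.linalg.solve(Lc.T, np.linalg.solve(Lc, rhs))   # x-update
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # prox of ||.||_1
        u = u + x - z                                # dual ascent step
    return z
```

The same template handles frame-based or TV regularizers by replacing the soft-threshold with the corresponding proximal operator, which is the structure the augmented Lagrangian formulation exploits.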
Separable Cosparse Analysis Operator Learning
The ability of having a sparse representation for a certain class of signals
has many applications in data analysis, image processing, and other research
fields. Among sparse representations, the cosparse analysis model has recently
gained increasing interest. Many signals exhibit a multidimensional structure,
e.g. images or three-dimensional MRI scans. Most data analysis and learning
algorithms use vectorized signals and thereby do not account for this
underlying structure. The drawback of not taking the inherent structure into
account is a dramatic increase in computational cost. We propose an algorithm
for learning a cosparse analysis operator that adheres to the preexisting
structure of the data, and thus allows for a very efficient implementation.
This is achieved by enforcing a separable structure on the learned operator.
Our learning algorithm is able to deal with multidimensional data of arbitrary
order. We evaluate our method on volumetric data, using three-dimensional
MRI scans as an example.
Comment: 5 pages, 3 figures, accepted at EUSIPCO 201
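The efficiency gain from a separable structure is easy to see in code: a Kronecker-structured analysis operator applied to a 2-D signal costs two small matrix products instead of one product with the full operator. A minimal 2-D sketch follows (illustrative names; the same identity extends mode by mode to 3-D volumes such as MRI scans).

```python
import numpy as np

def apply_separable(X, Om1, Om2):
    """Apply the separable analysis operator (Om2 kron Om1) to vec(X)
    without forming the Kronecker product, using the identity
    (Om2 kron Om1) vec(X) = vec(Om1 @ X @ Om2.T)
    with column-major (Fortran-order) vectorization."""
    return Om1 @ X @ Om2.T
```

For an n-by-n image and n-by-n factors this is O(n^3) work instead of the O(n^4) needed to multiply by the explicit n^2-by-n^2 Kronecker matrix, which is the "very efficient implementation" a separable operator permits.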
Precise Phase Transition of Total Variation Minimization
Characterizing the phase transitions of convex optimizations in recovering
structured signals or data is of central importance in compressed sensing,
machine learning and statistics. The phase transitions of many convex
optimization signal recovery methods, such as ℓ1 minimization and nuclear
norm minimization, are well understood after recent years of research. However,
rigorously characterizing the phase transition of total variation (TV)
minimization in recovering sparse-gradient signals is still open. In this paper,
we fully characterize the phase transition curve of the TV minimization. Our
proof builds on Donoho, Johnstone and Montanari's conjectured phase transition
curve for the TV approximate message passing algorithm (AMP), together with the
linkage between the minimax mean square error of a denoising problem and the
high-dimensional convex geometry for TV minimization.
Comment: 6 pages
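To experiment with the recovery setting studied here, a sparse-gradient (piecewise-constant) signal can be reconstructed from Gaussian measurements with TV regularization. The sketch below uses a generic ADMM solver with the split z = Dx; it is not the TV-AMP algorithm analyzed in the paper, and all names and parameter values are illustrative.

```python
import numpy as np

def tv_admm(A, y, lam=0.01, rho=1.0, n_iter=2000):
    """ADMM for min_x 0.5*||A @ x - y||_2^2 + lam*||D @ x||_1,
    where D is the 1-D finite-difference operator (anisotropic TV)."""
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)            # (n-1) x n difference matrix
    AtA, Aty, DtD = A.T @ A, A.T @ y, D.T @ D
    Minv = np.linalg.inv(AtA + rho * DtD)     # small n: invert once, reuse
    x = np.zeros(n)
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)                       # scaled dual variable
    for _ in range(n_iter):
        x = Minv @ (Aty + rho * D.T @ (z - u))              # x-update
        v = D @ x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # shrink Dx
        u = u + D @ x - z                                   # dual ascent
    return x
```

Sweeping the number of measurements m for a fixed number of jumps and recording the success rate is exactly how the phase transition curve characterized in the paper is traced out empirically.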