Optimization with Sparsity-Inducing Penalties
Sparse estimation methods are aimed at using or obtaining parsimonious
representations of data or models. They were first dedicated to linear variable
selection but numerous extensions have now emerged such as structured sparsity
or kernel selection. It turns out that many of the related estimation problems
can be cast as convex optimization problems by regularizing the empirical risk
with appropriate non-smooth norms. The goal of this paper is to present from a
general perspective optimization tools and techniques dedicated to such
sparsity-inducing penalties. We cover proximal methods, block-coordinate
descent, reweighted ℓ1-penalized techniques, working-set and homotopy
methods, as well as non-convex formulations and extensions, and provide an
extensive set of experiments to compare various algorithms from a computational
point of view.
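As a minimal illustration of the proximal methods the abstract mentions (a sketch, not the paper's own code), the following applies ISTA to the lasso, where the proximal step for the ℓ1 norm is entrywise soft-thresholding; the problem sizes and regularization weight are arbitrary choices for a toy variable-selection example.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||_2^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth data term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy variable-selection problem: recover a 3-sparse vector from 80 noisy
# linear measurements of a 200-dimensional signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[[3, 50, 120]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = ista(A, b, lam=0.5)
```

The non-smooth norm enters the iteration only through its proximal operator, which is what makes this family of methods applicable to the wider class of sparsity-inducing penalties the paper surveys.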
A mixed regularization approach for sparse simultaneous approximation of parameterized PDEs
We present and analyze a novel sparse polynomial technique for the
simultaneous approximation of parameterized partial differential equations
(PDEs) with deterministic and stochastic inputs. Our approach treats the
numerical solution as a jointly sparse reconstruction problem through the
reformulation of the standard basis pursuit denoising, where the set of jointly
sparse vectors is infinite. To achieve global reconstruction of sparse
solutions to parameterized elliptic PDEs over both physical and parametric
domains, we combine the standard measurement scheme developed for compressed
sensing in the context of bounded orthonormal systems with a novel mixed-norm
based regularization method that exploits both energy and sparsity. In
addition, we are able to prove that, with minimal sample complexity, error
estimates comparable to the best s-term and quasi-optimal approximations are
achievable, while requiring only a priori bounds on polynomial truncation error
with respect to the energy norm. Finally, we perform extensive numerical
experiments on several high-dimensional parameterized elliptic PDE models to
demonstrate the superior recovery properties of the proposed approach.
Comment: 23 pages, 4 figures
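The jointly sparse reconstruction described above has a simple finite-dimensional analogue: the multiple-measurement-vector problem with a mixed ℓ2,1 penalty, where several signals share one support. The sketch below (an illustration under that simplification, not the paper's method, which treats an infinite set of jointly sparse vectors) solves it by proximal gradient with row-wise group soft-thresholding.

```python
import numpy as np

def prox_l21(X, t):
    # Proximal operator of the mixed l2,1 norm: each row of X is shrunk
    # jointly, which drives whole rows to zero and enforces a shared support.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0) * X

def mmv_ista(A, B, lam, n_iter=500):
    # Proximal gradient for min_X 0.5*||AX - B||_F^2 + lam*||X||_{2,1}:
    # the columns of X are recovered simultaneously with one sparsity pattern.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    X = np.zeros((A.shape[1], B.shape[1]))
    for _ in range(n_iter):
        X = prox_l21(X - A.T @ (A @ X - B) / L, lam / L)
    return X

# Jointly recover 4 signals (columns) that share 2 active rows.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 100))
X_true = np.zeros((100, 4))
X_true[5] = [2.0, -1.0, 1.5, 0.5]
X_true[40] = [-1.0, 2.0, 0.5, 1.0]
B = A @ X_true
X_hat = mmv_ista(A, B, lam=0.5)
```

The mixed norm couples the columns, so far fewer measurements per signal suffice than if each column were recovered independently with a plain ℓ1 penalty.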
Proceedings of the second "international Traveling Workshop on Interactions between Sparse models and Technology" (iTWIST'14)
The implicit objective of the biennial "international Traveling Workshop on
Interactions between Sparse models and Technology" (iTWIST) is to foster
collaboration between international scientific teams by disseminating ideas
through both specific oral/poster presentations and free discussions. For its
second edition, the iTWIST workshop took place in the medieval and picturesque
town of Namur in Belgium, from Wednesday August 27th till Friday August 29th,
2014. The workshop was conveniently located in "The Arsenal" building within
walking distance of both hotels and town center. iTWIST'14 gathered about
70 international participants and featured 9 invited talks, 10 oral
presentations, and 14 posters on the following themes, all related to the
theory, application and generalization of the "sparsity paradigm":
Sparsity-driven data sensing and processing; Union of low dimensional
subspaces; Beyond linear and convex inverse problems; Matrix/manifold/graph
sensing/processing; Blind inverse problems and dictionary learning; Sparsity
and computational neuroscience; Information theory, geometry and randomness;
Complexity/accuracy tradeoffs in numerical methods; Sparsity? What's next?;
Sparse machine learning and inference.
Comment: 69 pages, 24 extended abstracts, iTWIST'14 website:
http://sites.google.com/site/itwist1