Optimization with Sparsity-Inducing Penalties
Sparse estimation methods are aimed at using or obtaining parsimonious
representations of data or models. They were first dedicated to linear variable
selection but numerous extensions have now emerged such as structured sparsity
or kernel selection. It turns out that many of the related estimation problems
can be cast as convex optimization problems by regularizing the empirical risk
with appropriate non-smooth norms. The goal of this paper is to present from a
general perspective optimization tools and techniques dedicated to such
sparsity-inducing penalties. We cover proximal methods, block-coordinate
descent, reweighted-ℓ2 penalized techniques, working-set and homotopy
methods, as well as non-convex formulations and extensions, and provide an
extensive set of experiments to compare various algorithms from a computational
point of view.
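As an illustration of the proximal methods the abstract surveys, here is a minimal sketch of proximal gradient descent (ISTA) for the ℓ1-regularized least-squares problem; the function names and the toy data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    # Proximal gradient for min_w 0.5*||Xw - y||^2 + lam*||w||_1:
    # alternate a gradient step on the smooth term with the ell_1 prox.
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy sparse recovery problem (assumed data, for illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = ista(X, y, lam=0.5)
print(np.nonzero(np.abs(w_hat) > 0.1)[0])
```

The non-smooth ℓ1 penalty is handled entirely through its proximal operator (soft-thresholding), which is what produces exact zeros in the estimate.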
The other War on Terror revealed: global governmentality and the Financial Action Task Force's campaign against terrorist financing
Abstract. Despite initial fanfare surrounding its launch in the White House Rose Garden, the
War on Terrorist Finances (WOTF) has thus far languished as a sideshow, in the shadows of
military campaigns against terrorism in Afghanistan and Iraq. This neglect is unfortunate, for
the WOTF reflects the other multilateral cooperative dimension of the US-led 'war on terror',
quite contrary to conventional sweeping accusations of American unilateralism. Yet the
existing academic literature has been confined mostly to niche specialist journals dedicated to
technical, legalistic and financial regulatory aspects of the WOTF. Using the Financial Action
Task Force (FATF) as a case study, this article seeks to steer discussions on the WOTF onto
a broader theoretical IR perspective. Building upon emerging academic works that extend
Foucauldian ideas of governmentality to the global level, we examine the interwoven
overlapping national, regional and global regulatory practices emerging against terrorist
financing, and the implications for notions of government, regulation and sovereignty
Oracle Inequalities and Optimal Inference under Group Sparsity
We consider the problem of estimating a sparse linear regression vector β*
under a Gaussian noise model, for the purpose of both prediction and
model selection. We assume that prior knowledge is available on the sparsity
pattern, namely the set of variables is partitioned into prescribed groups,
only few of which are relevant in the estimation process. This group sparsity
assumption suggests considering the Group Lasso method as a means to
estimate β*. We establish oracle inequalities for the prediction and
estimation errors of this estimator. These bounds hold under a
restricted eigenvalue condition on the design matrix. Under a stronger
coherence condition, we derive bounds for the estimation error for mixed
(2,p)-norms with 1 ≤ p ≤ ∞. When p = ∞, this result implies
that a thresholded version of the Group Lasso estimator selects the sparsity
pattern of β* with high probability. Next, we prove that the rate of
convergence of our upper bounds is optimal in a minimax sense, up to a
logarithmic factor, for all estimators over a class of group sparse vectors.
Furthermore, we establish lower bounds for the prediction and
estimation errors of the usual Lasso estimator. Using this result, we
demonstrate that the Group Lasso can achieve an improvement in the prediction
and estimation properties as compared to the Lasso.
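The Group Lasso estimator discussed above can be sketched with proximal gradient descent, replacing elementwise soft-thresholding with the groupwise shrinkage that is the proximal operator of the mixed ℓ2,1 penalty. The group partition and toy data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def group_soft_threshold(w, groups, t):
    # Prox of t * sum_g ||w_g||_2: shrink each group's norm by t,
    # zeroing groups whose norm falls below the threshold.
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= t else (1.0 - t / norm) * w[g]
    return out

def group_lasso(X, y, groups, lam, n_iter=500):
    # Proximal gradient for min_w 0.5*||Xw - y||^2 + lam * sum_g ||w_g||_2.
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = group_soft_threshold(w - grad / L, groups, lam / L)
    return w

# Toy problem: variables partitioned into prescribed groups, only one relevant.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 6))
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
w_true = np.array([1.5, -2.0, 0.0, 0.0, 0.0, 0.0])   # only group 0 is active
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = group_lasso(X, y, groups, lam=1.0)
print([float(np.linalg.norm(w_hat[g])) for g in groups])
```

Because the penalty acts on whole-group norms, irrelevant groups are set to zero jointly, which is how the method exploits the prescribed sparsity pattern.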