
    A quasi-Newton proximal splitting method

    A new result in convex analysis on the calculation of proximity operators in certain scaled norms is derived. We describe efficient implementations of the proximity calculation for a useful class of functions; the implementations exploit the piecewise-linear nature of the dual problem. The second part of the paper applies this result to the acceleration of convex minimization problems, leading to an elegant quasi-Newton method. The optimization method compares favorably against state-of-the-art alternatives. The algorithm has extensive applications, including signal processing, sparse recovery, machine learning, and classification.
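    As a concrete illustration, the sketch below exercises the scaled-prox idea in its simplest diagonal form: the proximity operator of $\lambda\|\cdot\|_1$ in a metric $D = \mathrm{diag}(d)$ is still coordinate-wise soft-thresholding, with per-coordinate thresholds $\lambda/d_i$, which can be plugged into a preconditioned forward-backward loop. This is a hedged sketch, not the paper's algorithm: the choice $d = \mathrm{diag}(A^\top A)$ as a stand-in for the quasi-Newton metric, and all function names, are our assumptions.

```python
import numpy as np

def prox_l1_scaled(y, lam, d):
    # Prox of lam*||.||_1 in the metric <x, diag(d) x>: still a
    # coordinate-wise soft-thresholding, with thresholds lam/d_i.
    return np.sign(y) * np.maximum(np.abs(y) - lam / d, 0.0)

def diag_prox_gradient(A, b, lam, n_iter=500):
    # Forward-backward iteration for min 0.5||Ax-b||^2 + lam||x||_1,
    # preconditioned by diag(A^T A) as a crude stand-in for the paper's
    # quasi-Newton metric (no step-size safeguards; illustration only).
    d = np.sum(A * A, axis=0) + 1e-12
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = prox_l1_scaled(x - A.T @ (A @ x - b) / d, lam, d)
    return x
```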

    Iteration-Complexity of a Generalized Forward Backward Splitting Algorithm

    In this paper, we analyze the iteration-complexity of the Generalized Forward-Backward (GFB) splitting algorithm, as proposed in \cite{gfb2011}, for minimizing a large class of composite objectives $f + \sum_{i=1}^n h_i$ on a Hilbert space, where $f$ has a Lipschitz-continuous gradient and the $h_i$'s are simple (i.e., their proximity operators are easy to compute). We derive iteration-complexity bounds (pointwise and ergodic) for the inexact version of GFB to obtain an approximate solution based on an easily verifiable termination criterion. Along the way, we prove complexity bounds for relaxed and inexact fixed-point iterations built from compositions of nonexpansive averaged operators. These results apply more generally to GFB when used to find a zero of a sum of $n > 0$ maximal monotone operators and a co-coercive operator on a Hilbert space. The theoretical findings are exemplified with experiments on video processing.
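    A minimal sketch of a GFB-style iteration, under illustrative assumptions (uniform weights $w_i = 1/n$, fixed step $\gamma = 1/L$, relaxation $\lambda = 1$; function names and the test problem are ours, not from \cite{gfb2011}): each auxiliary variable takes a relaxed proximal step at a reflected point, and the primal iterate is the weighted average.

```python
import numpy as np

def gfb(grad_f, proxes, L, x0, n_iter=300, relax=1.0):
    # Generalized Forward-Backward sketch for min f(x) + sum_i h_i(x):
    # grad_f is the gradient of f (L-Lipschitz), and each entry of
    # `proxes` maps (point, scale t) to prox_{t*h_i}(point).
    n, w = len(proxes), 1.0 / len(proxes)    # uniform weights (assumption)
    gamma = 1.0 / L                          # step size in (0, 2/L)
    z = [x0.copy() for _ in range(n)]
    x = x0.copy()
    for _ in range(n_iter):
        g = grad_f(x)
        for i, prox in enumerate(proxes):
            z[i] += relax * (prox(2 * x - z[i] - gamma * g, gamma / w) - x)
        x = w * sum(z)
    return x

# Toy instance: 0.5||Ax-b||^2 + mu||x||_1 + indicator(x >= 0).
rng = np.random.default_rng(0)
A, b, mu = rng.standard_normal((30, 80)), rng.standard_normal(30), 0.5
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - mu * t, 0)  # prox of t*mu||.||_1
proj = lambda v, t: np.maximum(v, 0)                                # prox of the indicator
x = gfb(lambda x: A.T @ (A @ x - b), [soft, proj],
        L=np.linalg.norm(A, 2) ** 2, x0=np.zeros(80))
```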

    Model Consistency of Partly Smooth Regularizers

    This paper studies least-squares regression penalized with partly smooth convex regularizers. This class of functions is large and versatile, allowing one to promote solutions conforming to some notion of low complexity. Indeed, such regularizers force solutions of variational problems to belong to a low-dimensional manifold (the so-called model), which is stable under small perturbations of the function. This property is crucial for making the underlying low-complexity model robust to small noise. We show that a generalized "irrepresentable condition" implies stable model selection under small noise perturbations in the observations and the design matrix, when the regularization parameter is tuned proportionally to the noise level. This condition is shown to be almost necessary. We then show that it implies model consistency of the regularized estimator: with probability tending to one as the number of measurements increases, the regularized estimator belongs to the correct low-dimensional model manifold. This work unifies and generalizes several previous ones, where model consistency is known to hold for sparse, group-sparse, total variation, and low-rank regularizations.
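    For the classic sparse ($\ell_1$) case, the irrepresentable condition specializes to $\|A_{S^c}^\top A_S (A_S^\top A_S)^{-1}\,\mathrm{sign}(x_S)\|_\infty < 1$; a quick numerical check of that quantity might look like the sketch below (the $\ell_1$ specialization only, not the paper's general partly-smooth criterion; function name and test design are ours).

```python
import numpy as np

def l1_irrepresentability(A, support, signs):
    # Returns ||A_Sc^T A_S (A_S^T A_S)^{-1} sign(x_S)||_inf; values < 1
    # indicate stable support selection under small noise, when the
    # regularization parameter is tuned proportionally to the noise level.
    S = np.asarray(support)
    Sc = np.setdiff1d(np.arange(A.shape[1]), S)
    AS, ASc = A[:, S], A[:, Sc]
    return np.max(np.abs(ASc.T @ AS @ np.linalg.solve(AS.T @ AS, signs)))

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20)) / np.sqrt(50)   # random Gaussian design
print(l1_irrepresentability(A, support=[0, 3], signs=np.array([1.0, -1.0])))
```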

    Sparse Support Recovery with Non-smooth Loss Functions

    In this paper, we study the support recovery guarantees of underdetermined sparse regression using the $\ell_1$-norm as a regularizer and a non-smooth loss function for data fidelity. More precisely, we focus in detail on the cases of $\ell_1$ and $\ell_\infty$ losses, and contrast them with the usual $\ell_2$ loss. While these losses are routinely used to account for either sparse ($\ell_1$ loss) or uniform ($\ell_\infty$ loss) noise models, a theoretical analysis of their performance is still lacking. In this article, we extend the existing theory from the smooth $\ell_2$ case to these non-smooth cases. We derive a sharp condition which ensures that the support of the vector to recover is stable to small additive noise in the observations, as long as the loss constraint size is tuned proportionally to the noise level. A distinctive feature of our theory is that it also explains what happens when the support is unstable. While the support is not stable anymore, we identify an "extended support" and show that this extended support is stable to small additive noise. To exemplify the usefulness of our theory, we give a detailed numerical analysis of the support stability/instability of compressed sensing recovery with these different losses. This highlights different parameter regimes, ranging from total support stability to progressively increasing support instability.
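    To make the $\ell_\infty$-loss setting concrete: the constrained recovery $\min \|x\|_1$ s.t. $\|Ax - b\|_\infty \le \tau$ is a linear program, so it can be prototyped with off-the-shelf LP solvers. A sketch using scipy.optimize.linprog (the LP reformulation via $|x_i| \le u_i$ is standard; the function name, data, and $\tau$ below are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import linprog

def l1_recovery_linf(A, b, tau):
    # min ||x||_1  s.t.  ||Ax - b||_inf <= tau, as an LP over (x, u)
    # with |x_i| <= u_i and objective sum(u).
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])
    I, Z = np.eye(n), np.zeros((m, n))
    A_ub = np.block([[ I, -I],    #  x - u <= 0
                     [-I, -I],    # -x - u <= 0
                     [ A,  Z],    #  Ax <= b + tau
                     [-A,  Z]])   # -Ax <= tau - b
    b_ub = np.concatenate([np.zeros(2 * n), b + tau, tau - b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(None, None))
    return res.x[:n]

rng = np.random.default_rng(0)
A, x0 = rng.standard_normal((30, 60)), np.zeros(60)
x0[[2, 7]] = [1.5, -2.0]                          # 2-sparse ground truth
b = A @ x0 + 0.01 * rng.uniform(-1, 1, 30)        # uniform (bounded) noise
x_hat = l1_recovery_linf(A, b, tau=0.02)
```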

    Image Decomposition and Separation Using Sparse Representations: An Overview

    This paper gives essential insights into the use of sparsity and morphological diversity in image decomposition and source separation by reviewing our recent work in this field. The idea of morphologically decomposing a signal into its building blocks is an important problem in signal processing with far-reaching applications in science and technology. Starck et al. proposed a novel decomposition method, morphological component analysis (MCA), based on sparse representation of signals. MCA assumes that each (monochannel) signal is the linear mixture of several layers, the so-called morphological components, that are morphologically distinct, e.g., sines and bumps. The success of this method relies on two tenets: sparsity and morphological diversity. That is, each morphological component is sparsely represented in a specific transform domain, and that transform is highly inefficient in representing the other content in the mixture. Once such transforms are identified, MCA is an iterative thresholding algorithm that is capable of decoupling the signal content. Sparsity and morphological diversity have also been used as a novel and effective source of diversity for blind source separation (BSS), hence extending MCA to multichannel data. Building on these ingredients, we provide an overview of the generalized MCA introduced by the authors as a fast and efficient BSS method. We illustrate the application of these algorithms on several real examples, and conclude our tour by briefly describing our software toolboxes, available for download on the Internet, for sparse signal and image decomposition and separation.
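    The iterative-thresholding structure of MCA can be conveyed with a toy single-channel sketch, assuming two orthonormal dictionaries (a DCT for smooth oscillations, the identity for spikes) and an exponentially decreasing threshold; parameter values and names are illustrative, not from the papers reviewed here.

```python
import numpy as np
from scipy.fft import dct, idct

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def mca_two_parts(y, n_iter=100, lam_max=1.0, lam_min=0.01):
    # Alternate thresholded back-projections: each morphological component
    # is re-estimated from the residual left by the other, while the
    # threshold decreases from lam_max to lam_min.
    x_osc = np.zeros_like(y)    # sparse in the DCT (oscillatory content)
    x_spk = np.zeros_like(y)    # sparse in the identity (spiky content)
    for k in range(n_iter):
        lam = lam_max * (lam_min / lam_max) ** (k / (n_iter - 1))
        x_osc = idct(soft(dct(y - x_spk, norm='ortho'), lam), norm='ortho')
        x_spk = soft(y - x_osc, lam)
    return x_osc, x_spk

t = np.linspace(0, 1, 256)
y = np.sin(2 * np.pi * 8 * t)
y[[50, 180]] += 3.0                 # two spikes on top of a sine
osc, spikes = mca_two_parts(y)      # separates the two morphologies
```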

    Service Quality and Satisfaction Among Postgraduate Students at Universiti Utara Malaysia

    Student satisfaction is a vital measure of service quality in educational institutions. Evaluating a university's service quality provides important feedback that the university can use to assess and develop the services it offers its students. Education institutions that seek a competitive advantage should search for innovative and effective methods to acquire, maintain, and build stronger relationships with their students. The main purpose of this paper is to evaluate students' level of satisfaction with the services provided by Universiti Utara Malaysia. Furthermore, it aims to determine whether there is a significant relationship between the five dimensions of service quality (tangibility, reliability, responsiveness, assurance, and empathy) and students' satisfaction. The research was conducted by administering a questionnaire to 360 postgraduate students, local and international, currently studying at Universiti Utara Malaysia, with five-point Likert-scale items used as the instrument to gather the relevant data and information. Overall, the majority of students are satisfied with the facilities provided by the university, and the results indicated that all five dimensions of service quality were correlated with student satisfaction. The findings of this study provide the university with some directions to enhance its performance and increase its student numbers.
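    For illustration, the reported dimension-satisfaction correlations could be computed along these lines (the column names and placeholder responses below are hypothetical; actual scores would come from the administered questionnaire):

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

dims = ["tangibility", "reliability", "responsiveness", "assurance", "empathy"]

# Placeholder data: 360 respondents, 5-point Likert items. Real scores
# would be loaded from the survey described in the abstract.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(360, 6)),
                  columns=dims + ["satisfaction"])

for d in dims:
    r, p = pearsonr(df[d], df["satisfaction"])
    print(f"{d:15s} r = {r:+.3f}  p = {p:.3f}")
```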