
    Constrained Overcomplete Analysis Operator Learning for Cosparse Signal Modelling

    We consider the problem of learning a low-dimensional signal model from a collection of training samples. The mainstream approach would be to learn an overcomplete dictionary to provide good approximations of the training samples using sparse synthesis coefficients. This famous sparse model has a less well-known counterpart, in analysis form, called the cosparse analysis model. In this new model, signals are characterised by their parsimony in a transformed domain using an overcomplete (linear) analysis operator. We propose to learn an analysis operator from a training corpus using a constrained optimisation framework based on L1 optimisation. The reason for introducing a constraint into the optimisation framework is to exclude trivial solutions. Although there is no definitive answer as to which constraint is the most relevant, we investigate some conventional constraints from the model adaptation field and adopt the uniformly normalised tight frame (UNTF) constraint for this purpose. We then derive a practical learning algorithm, based on projected subgradients and the Douglas-Rachford splitting technique, and demonstrate its ability to robustly recover a ground truth analysis operator when provided with a clean training set of sufficient size. We also learn an analysis operator for images from noisy cosparse signals, which is a more realistic experiment. As the derived optimisation problem is not a convex program, such variational methods often yield only a local minimum. Some local optimality conditions are derived for two different settings, providing preliminary theoretical support for the well-posedness of the learning problem under appropriate conditions. Comment: 29 pages, 13 figures, accepted to be published in TS
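    The learning step described above can be pictured with a short sketch. The following is a minimal numpy illustration rather than the paper's algorithm: it keeps the projected-subgradient structure (an L1 subgradient step on the operator followed by a projection onto the UNTF constraint set) but replaces the Douglas-Rachford-based handling of the constraint with simple alternating projections; the training matrix X (signals as columns), the operator size p, the step-size schedule and the function names are all illustrative assumptions.

```python
import numpy as np

def project_untf(Omega, n_inner=10):
    """Approximate projection onto uniformly normalised tight frames (UNTF)
    by alternating between the set of tight frames and unit-norm rows."""
    p, d = Omega.shape
    for _ in range(n_inner):
        # Nearest tight frame (Frobenius sense): keep the polar factor U @ Vt
        # and rescale so that Omega^T Omega = (p / d) * I.
        U, _, Vt = np.linalg.svd(Omega, full_matrices=False)
        Omega = np.sqrt(p / d) * U @ Vt
        # Re-normalise each row to unit Euclidean norm.
        norms = np.linalg.norm(Omega, axis=1, keepdims=True)
        Omega = Omega / np.maximum(norms, 1e-12)
    return Omega

def learn_analysis_operator(X, p, n_iter=500, step=1e-2, seed=0):
    """Projected-subgradient sketch: minimise ||Omega @ X||_1 over
    p x d operators constrained (approximately) to the UNTF set."""
    d = X.shape[0]
    rng = np.random.default_rng(seed)
    Omega = project_untf(rng.standard_normal((p, d)))
    for k in range(n_iter):
        # Subgradient of the l1 objective with respect to Omega.
        grad = np.sign(Omega @ X) @ X.T
        Omega = project_untf(Omega - step / (1.0 + k) * grad)
    return Omega
```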

    Greedy-Like Algorithms for the Cosparse Analysis Model

    The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem -- the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed based on ℓ1 relaxation and a greedy approach. In this work we pursue this question further, and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods -- compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT) and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore empirically the performance of these algorithms by adopting a plain thresholding projection, demonstrating their good performance.
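    As a concrete illustration of one member of this family, the numpy sketch below implements an analysis IHT iteration with the plain thresholding projection mentioned above: the cosupport is chosen as the ell rows of the analysis operator with the smallest correlations, and the estimate is projected onto the corresponding cosparse subspace after each gradient step. The interface (y, M, Omega, ell), step size and iteration count are illustrative assumptions, not the paper's tuned choices.

```python
import numpy as np

def cosparse_projection(v, Omega, ell):
    """Plain thresholding projection: pick the cosupport as the ell rows of
    Omega with smallest |Omega v|, then project v onto their null space."""
    cosupport = np.argsort(np.abs(Omega @ v))[:ell]
    sub = Omega[cosupport]                       # rows defining the cosparse subspace
    # Orthogonal projection onto null(sub): v - sub^+ (sub v).
    return v - np.linalg.pinv(sub) @ (sub @ v), cosupport

def analysis_iht(y, M, Omega, ell, n_iter=200, step=None):
    """Analysis IHT sketch: gradient step on ||y - M x||^2 followed by a
    projection onto a cosparse subspace selected by plain thresholding."""
    if step is None:
        step = 1.0 / np.linalg.norm(M, 2) ** 2   # conservative fixed step size
    x = np.zeros(M.shape[1])
    for _ in range(n_iter):
        g = x + step * M.T @ (y - M @ x)         # gradient step on the data fit
        x, _ = cosparse_projection(g, Omega, ell)
    return x
```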

    A review of cosparse signal recovery methods applied to sound source localization

    This work compares several state-of-the-art methods for cosparse signal recovery in the context of sound source localization. We assess the performance of five cosparse recovery algorithms: Greedy Analysis Structured Pursuit, l1 and joint l1,2 minimization, Structured Analysis Iterative Hard Thresholding and Structured Analysis Hard Thresholding Pursuit. In addition, we evaluate the performance of these methods against the sparse synthesis paradigm, solved with the corresponding joint l1,2 minimization method. For this evaluation, the chosen showcase application is sound source localization from simulated measurements of the acoustic pressure field.
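    Of the compared methods, the l1 analysis minimization is the simplest to write down as a convex program. The CVXPY sketch below is only an illustration of that baseline: the measurement matrix A (e.g. a discretised propagation or sampling operator), the analysis operator Omega (e.g. a discretised wave operator) and the noise level eps are hypothetical inputs, not the setup used in the paper.

```python
import cvxpy as cp

def l1_analysis_recovery(y, A, Omega, eps):
    """l1 analysis minimization: find the field x with smallest ||Omega x||_1
    among those consistent with the measurements, ||A x - y||_2 <= eps."""
    x = cp.Variable(A.shape[1])
    objective = cp.Minimize(cp.norm(Omega @ x, 1))
    constraints = [cp.norm(A @ x - y, 2) <= eps]
    cp.Problem(objective, constraints).solve()
    return x.value
```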
