
    Optimization of a Geman-McClure like criterion for sparse signal deconvolution

    This paper deals with the problem of recovering a sparse unknown signal from a set of observations. The latter are obtained by convolution of the original signal and corruption with additive noise. We tackle the problem by minimizing a least-squares fit criterion penalized by a Geman-McClure-like potential. The resulting criterion is a rational function, which makes it possible to formulate its minimization as a generalized problem of moments, for which a hierarchy of semidefinite programming relaxations can be proposed. These convex relaxations yield a monotone sequence of values which converges to the global optimum. To overcome the computational limitations due to the large number of involved variables, a stochastic block-coordinate descent method is proposed. The algorithm has been implemented and shows promising results.
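    As a rough illustration of the criterion described above, the following sketch evaluates a least-squares fit penalized by a Geman-McClure-like potential; the kernel h, the scale delta, and the weight lam are illustrative assumptions, and the sketch does not implement the moment/SDP relaxation itself.

```python
import numpy as np

def geman_mcclure(x, delta):
    """Geman-McClure-like potential: smooth, bounded, sparsity-promoting."""
    return x**2 / (x**2 + delta**2)

def criterion(x, y, h, lam, delta):
    """Least-squares fit to the convolved observation plus the rational penalty."""
    residual = y - np.convolve(h, x, mode="same")
    return 0.5 * np.sum(residual**2) + lam * np.sum(geman_mcclure(x, delta))

# Illustrative toy problem: a sparse spike train blurred by a short kernel.
rng = np.random.default_rng(0)
x_true = np.zeros(64)
x_true[[10, 30, 50]] = [1.0, -0.8, 0.6]
h = np.array([0.25, 0.5, 0.25])
y = np.convolve(h, x_true, mode="same") + 0.01 * rng.standard_normal(64)
print(criterion(x_true, y, h, lam=0.1, delta=0.1))
```

    Note that the penalty is rational in x while the fit term is quadratic, which is precisely what lets the minimization be cast as a generalized moment problem.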

    Rational optimization for nonlinear reconstruction with approximate ℓ0 penalization

    Recovering a nonlinearly degraded signal in the presence of noise is a challenging problem. In this work, this problem is tackled by minimizing the sum of a nonconvex least-squares fit criterion and a penalty term. We assume that the nonlinearity of the model can be accounted for by a rational function. In addition, we suppose that the signal to be sought is sparse, so that a rational approximation of the ℓ0 pseudo-norm constitutes a suitable penalization. The resulting composite cost function belongs to the broad class of semi-algebraic functions. To find a globally optimal solution to such an optimization problem, it can be transformed into a generalized moment problem, for which a hierarchy of semidefinite programming relaxations can be built. Global optimality comes at the expense of an increased dimension and, to overcome computational limitations concerning the number of involved variables, the structure of the problem has to be carefully addressed. A situation of practical interest is when the nonlinear model consists of a convolutive transform followed by a componentwise nonlinear rational saturation. We then propose to use a sparse relaxation able to deal with up to several hundred optimized variables. In contrast with the naive approach consisting of linearizing the model, our experiments show that the proposed approach offers good performance.
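    A minimal sketch of the composite cost under the stated structure: a convolutive transform followed by a componentwise rational nonlinearity, plus a rational approximation of the ℓ0 pseudo-norm. The specific saturation u/(1+u²), the kernel h, and the scale delta are stand-ins; the paper's exact rational model may differ.

```python
import numpy as np

def l0_rational(x, delta):
    """Rational approximation of the ell-0 pseudo-norm: tends to 1 for |x| >> delta."""
    return x**2 / (x**2 + delta**2)

def saturate(u):
    """Illustrative componentwise rational nonlinearity (a stand-in for the
    saturation model; the exact form used in the paper may differ)."""
    return u / (1.0 + u**2)

def composite_cost(x, y, h, lam, delta):
    """Nonconvex fit through the nonlinear forward model plus approximate ell-0."""
    z = saturate(np.convolve(h, x, mode="same"))
    return 0.5 * np.sum((y - z)**2) + lam * np.sum(l0_rational(x, delta))
```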

    A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

    Stochastic approximation techniques play an important role in solving many problems encountered in machine learning or adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori or their direct computation is too intensive, and they thus have to be estimated online from the observed signals. For batch optimization of an objective function that is the sum of a data fidelity term and a penalization (e.g. a sparsity promoting function), Majorize-Minimize (MM) methods have recently attracted much interest since they are fast, highly flexible, and effective in ensuring convergence. The goal of this paper is to show how these methods can be successfully extended to the case when the data fidelity term corresponds to a least squares criterion and the cost function is replaced by a sequence of stochastic approximations of it. In this context, we propose an online version of an MM subspace algorithm and study its convergence using suitable probabilistic tools. Simulation results illustrate the good practical performance of the proposed algorithm associated with a memory gradient subspace, when applied to both non-adaptive and adaptive filter identification problems.
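    A hedged sketch of one way such an online MM step with a memory-gradient subspace can look for penalized least squares. The streaming interface, the quadratic penalty (chosen so the majorant is exact), and the subspace [-gradient, previous step] are assumptions made for illustration, not the paper's exact algorithm.

```python
import numpy as np

def online_mm_memory_gradient(stream, dim, lam, n_iter=500):
    """Online penalized least squares via an MM memory-gradient subspace step.

    Each element of `stream` is a pair (x, y): regressor vector and scalar
    observation. The quadratic penalty lam * ||theta||^2 stands in for the
    smooth penalties covered by the paper; it keeps the majorant exact.
    """
    R = np.zeros((dim, dim))   # running estimate of E[x x^T]
    r = np.zeros(dim)          # running estimate of E[y x]
    theta = np.zeros(dim)
    theta_prev = theta.copy()
    for n, (x, y) in enumerate(stream, start=1):
        # Stochastic approximation of the sufficient statistics.
        R += (np.outer(x, x) - R) / n
        r += (y * x - r) / n
        g = R @ theta - r + lam * theta                 # gradient of current SA cost
        D = np.column_stack([-g, theta - theta_prev])   # memory-gradient subspace
        A = R + lam * np.eye(dim)                       # majorant curvature
        # Minimize the quadratic majorant restricted to span(D).
        u = np.linalg.lstsq(D.T @ A @ D, -D.T @ g, rcond=None)[0]
        theta_prev, theta = theta, theta + D @ u
        if n >= n_iter:
            break
    return theta
```

    Restricting the majorant to a two-dimensional subspace keeps each update to a tiny linear solve, which is what makes the subspace variant attractive for online use.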

    A Variational Approach to Image Restoration Problems

    PhD dissertation, Department of Mathematical Sciences, Seoul National University, February 2013 (advisor: Myungjoo Kang). Image restoration has been an active research area in image processing and computer vision during the past several decades. We explore variational partial differential equation (PDE) models for the image restoration problem. We start our discussion by reviewing classical models, by which the works of this dissertation are highly motivated. The content of the dissertation is divided into two main subjects. The first topic is image denoising, where we propose a non-convex hybrid total variation model and then apply an iteratively reweighted algorithm to solve it. The second topic is image decomposition, in which we separate an image into a structural component and an oscillatory component using a local gradient constraint.
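    The iteratively reweighted idea can be sketched in 1-D as follows: each pass majorizes a non-convex potential |t|^p (p < 1) on first- and second-order differences by a quadratic at the current iterate, then solves the resulting weighted linear system. The 1-D setting, the potential, and the parameters are illustrative; the dissertation's 2-D hybrid model may differ in its exact potentials and weighting.

```python
import numpy as np

def irls_hybrid_tv_denoise(f, lam1, lam2, p=0.5, n_iter=30, eps=1e-6):
    """1-D illustration of iterative reweighting for a non-convex hybrid
    TV-type prior combining first- and second-order differences."""
    n = len(f)
    D1 = np.diff(np.eye(n), axis=0)       # first-order difference operator
    D2 = np.diff(np.eye(n), 2, axis=0)    # second-order difference operator
    u = f.copy()
    for _ in range(n_iter):
        # IRLS weights: quadratic surrogate of |t|^p at the current iterate.
        w1 = p / (np.abs(D1 @ u) ** (2 - p) + eps)
        w2 = p / (np.abs(D2 @ u) ** (2 - p) + eps)
        A = (np.eye(n)
             + lam1 * D1.T @ (w1[:, None] * D1)
             + lam2 * D2.T @ (w2[:, None] * D2))
        u = np.linalg.solve(A, f)         # minimize the quadratic surrogate exactly
    return u
```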

    Statistical inference in radio astronomy

    This thesis unifies several studies, all dedicated to statistical data analysis in radio astronomy and radio astrophysics. Radio astronomy, like astronomy as a whole, has undergone a remarkable development in the past twenty years in introducing new instruments and technologies. New telescopes like the upgraded VLA, LOFAR, or the SKA and its pathfinder missions offer unprecedented sensitivities, previously uncharted frequency domains and unmatched survey capabilities. Many of these have the potential to significantly advance the science of radio astrophysics and cosmology on all scales, from solar and stellar physics, Galactic astrophysics and cosmic magnetic fields, to galaxy cluster astrophysics and signals from the epoch of reionization. In parallel, radio data analysis, calibration and imaging techniques have entered a similar phase of new development to push the boundaries and adapt the field to the new instruments and scientific opportunities. This thesis contributes to these greater developments in two specific subjects, radio interferometric imaging and cosmic magnetic field statistics. Throughout this study, different data analysis techniques are presented and employed in various settings, but all can be summarized under the broad term of statistical inference. This subject encompasses a huge variety of statistical techniques developed to solve problems in which deductions have to be made from incomplete knowledge, data or measurements. This study focuses especially on Bayesian inference methods, which make use of a subjective definition of probabilities, allowing for the expression of probabilities and statistical knowledge prior to an actual measurement. The thesis contains two different sets of applications for such techniques. First, situations where a complicated and generally ill-posed measurement problem can be approached by assuming a statistical signal model prior in order to infer the desired measured variable. Such a problem is very often met when the measurement device takes less data than needed to constrain all degrees of freedom of the problem. The principal case investigated in this thesis is the measurement problem of a radio interferometer, which takes incomplete samples of the Fourier-transformed intensity of the radio emission in the sky, such that it is impossible to exactly recover the signal. The new imaging algorithm RESOLVE is presented, optimal for extended radio sources. A first showcase demonstrates the performance of the new technique on real data. Further, a new Bayesian approach to multi-frequency radio interferometric imaging is presented and integrated into RESOLVE. The second field of application is astrophysical problems in which the inherent stochastic nature of a physical process demands a description where properties of physical quantities can only be statistically estimated. Astrophysical plasmas, for instance, are very often in a turbulent state, and thus governed by statistical hydrodynamical laws. Two studies are presented that show how properties of turbulent plasma magnetic fields can be inferred from radio observations.
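    To make the interferometric measurement model concrete, here is a toy sketch of why a signal prior resolves the missing Fourier coefficients. It uses a Gaussian prior, for which the MAP estimate is a plain Wiener filter; RESOLVE itself relies on a log-normal signal model and infers the power spectrum from the data, so this is only an illustration of the inverse problem, not of the algorithm.

```python
import numpy as np

def map_image_from_visibilities(vis, mask, prior_power, noise_var):
    """Toy MAP (Wiener) estimate for an interferometer-like measurement:
    noisy samples of the image's Fourier transform, available only on a
    sampling mask. `vis` is the gridded visibility array, `mask` a boolean
    array marking measured cells, `prior_power` the assumed signal power
    spectrum (scalar or array), `noise_var` the noise variance."""
    # Wiener filter per Fourier cell; unmeasured cells fall back to the
    # prior mean (zero), which is how the prior fills the unsampled gaps.
    filt = prior_power / (prior_power + noise_var)
    recon_ft = np.where(mask, filt * vis, 0.0)
    return np.fft.ifft2(recon_ft).real
```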

    Phenomenological modeling of image irradiance for non-Lambertian surfaces under natural illumination.

    Various vision tasks are usually confronted with appearance variations due to changes of illumination. For instance, in a recognition system, it has been shown that the variability in human face appearance is owed to changes in lighting conditions rather than to the person's identity. Theoretically, due to the arbitrariness of the lighting function, the space of all possible images of a fixed-pose object under all possible illumination conditions is infinite dimensional. Nonetheless, it has been proven that the set of images of a convex Lambertian surface under distant illumination lies near a low-dimensional linear subspace. This result was also extended to include non-Lambertian objects with non-convex geometry. As such, vision applications concerned with the recovery of illumination, reflectance or surface geometry from images would benefit from a low-dimensional generative model which captures appearance variations w.r.t. illumination conditions and surface reflectance properties. This enables the formulation of such inverse problems as parameter estimation. Typically, subspace construction boils down to performing a dimensionality reduction scheme, e.g. Principal Component Analysis (PCA), on a large set of (real/synthesized) images of the object(s) of interest with fixed pose but different illumination conditions. However, this approach has two major problems. First, the acquired/rendered image ensemble should be statistically significant vis-a-vis capturing the full behavior of the sources of variation that are of interest, in particular illumination and reflectance. Second, the curse of dimensionality hinders numerical methods such as Singular Value Decomposition (SVD), which become intractable especially with a large number of large-sized realizations in the image ensemble. One way to bypass the need for a large image ensemble is to construct appearance subspaces using phenomenological models which capture appearance variations through mathematical abstraction of the reflection process. In particular, the harmonic expansion of the image irradiance equation can be used to derive an analytic subspace to represent images under fixed pose but different illumination conditions, where the image irradiance equation has been formulated in a convolution framework. Due to their low-frequency nature, irradiance signals can be represented using low-order basis functions, where Spherical Harmonics (SH) have been extensively adopted. Typically, an ideal solution to the image irradiance (appearance) modeling problem should be able to incorporate complex illumination, cast shadows as well as realistic surface reflectance properties, while moving away from the simplifying assumptions of Lambertian reflectance and single-source distant illumination. By handling arbitrary complex illumination and non-Lambertian reflectance, the appearance model proposed in this dissertation moves the state of the art closer to the ideal solution. This work primarily addresses the geometrical compliance of the hemispherical basis for representing surface reflectance while presenting a compact, yet accurate representation for arbitrary materials. To maintain the plausibility of the resulting appearance, the proposed basis is constructed in a manner that satisfies the Helmholtz reciprocity property while avoiding high computational complexity.
    It is believed that having the illumination and surface reflectance represented in the spherical and hemispherical domains respectively, while complying with the physical properties of the surface reflectance, provides better approximation accuracy of image irradiance than a representation purely in the spherical domain. Discounting subsurface scattering and surface emittance, this work proposes a surface reflectance basis, based on hemispherical harmonics (HSH), defined on the Cartesian product of the incoming and outgoing local hemispheres (i.e. w.r.t. surface points). This basis obeys the physical properties of surface reflectance involving reciprocity and energy conservation. The basis functions are validated using analytical reflectance models as well as scattered reflectance measurements which might violate the Helmholtz reciprocity property (such violations can be filtered out by projecting the measurements onto the subspace spanned by the proposed basis, where the reciprocity property is preserved in the least-squares sense). The image formation process of isotropic surfaces under arbitrary distant illumination is also formulated in the frequency space, where the orthogonality relation between illumination and reflectance bases is encoded in what is termed irradiance harmonics. Such harmonics decouple the effect of illumination and reflectance from the underlying pose and geometry. Further, a bilinear approach to analytically construct the irradiance subspace is proposed in order to tackle the inherent problems of small sample size and the curse of dimensionality. The process of finding the analytic subspace is posed as establishing a relation between its principal components and those of the irradiance harmonics basis functions. It is also shown how to incorporate prior information about natural illumination and real-world surface reflectance characteristics in order to capture the full behavior of complex illumination and non-Lambertian reflectance. The use of the presented theoretical framework to develop practical algorithms for shape recovery is further presented, where the hitherto assumed Lambertian reflectance is relaxed. With a single image under unknown general illumination, the underlying geometrical structure can be recovered while accounting explicitly for object reflectance characteristics (e.g. human skin types for facial images and teeth reflectance for human jaw reconstruction) as well as complex illumination conditions. Experiments on synthetic and real images illustrate the robustness of the proposed appearance model vis-a-vis illumination variation. Keywords: computer vision, computer graphics, shading, illumination modeling, reflectance representation, image irradiance, frequency space representations, (hemi)spherical harmonics, analytic bilinear PCA, model-based bilinear PCA, 3D shape reconstruction, statistical shape from shading
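    A small sketch of the reciprocity filtering mentioned above, under the assumption that the measured BRDF is sampled on a common grid of incoming and outgoing directions: projecting onto reciprocal functions in the least-squares sense then reduces to symmetrizing the sampled matrix.

```python
import numpy as np

def enforce_reciprocity(brdf):
    """Least-squares projection of a sampled BRDF onto reciprocal functions.

    `brdf[i, o]` holds f(omega_i, omega_o) sampled on the same grid of
    incoming and outgoing hemisphere directions (grid and weighting are
    assumptions). Symmetrizing the matrix is the orthogonal projection onto
    functions with f(i, o) = f(o, i) in the Frobenius/least-squares sense,
    mirroring how non-reciprocal measurement noise is filtered out when
    projecting onto a reciprocity-preserving basis.
    """
    return 0.5 * (brdf + brdf.T)
```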

    Algorithms for Multiclass Classification and Regularized Regression


    Inversion for textured images: unsupervised myopic deconvolution, model selection, deconvolution-segmentation

    This thesis addresses a series of inverse problems of major importance in the field of image processing (image segmentation, model choice, parameter estimation, deconvolution) in the context of textured images. In all of the aforementioned problems the observations are indirect, i.e., the textured images are affected by a blur and by noise. The contributions of this work belong to three main classes: modeling, methodological and algorithmic. From the modeling standpoint, the contribution consists in the development of a new non-Gaussian model for textures. The Fourier coefficients of the textured images are modeled by a Scale Mixture of Gaussians Random Field. The Power Spectral Density of the texture has a parametric form, driven by a set of parameters that encode the texture characteristics. The methodological contribution is threefold and consists in solving three image processing problems that have not been tackled so far in the context of indirect observations of textured images. All the proposed methods are Bayesian and are based on exploiting the information encoded in the a posteriori law. The first method is devoted to the myopic deconvolution of a textured image and the estimation of its parameters. The second method achieves joint model selection and model parameter estimation from an indirect observation of a textured image. Finally, the third method addresses the problem of joint deconvolution and segmentation of an image composed of several textured regions, while estimating at the same time the hyperparameters (signal level and noise level) and the parameters of each constituent texture. Last, but not least, the algorithmic contribution is the development of a new efficient version of the Metropolis-Hastings algorithm, with a directional component of the proposal function based on the "Newton direction" and the Fisher information matrix. This directional component allows for an efficient exploration of the parameter space and, consequently, increases the convergence speed of the algorithm.
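    A generic sketch of a Metropolis-Hastings step with a Newton-type directional proposal: the candidate is drawn around theta + step * F^{-1} grad log pi(theta). A fixed curvature matrix F is assumed here to keep the acceptance ratio simple; the thesis uses the Fisher information matrix, and a position-dependent curvature would require evaluating each proposal density with the curvature at its own starting point.

```python
import numpy as np

def newton_mh_step(theta, log_post, grad, F, step, rng):
    """One Metropolis-Hastings step with a Newton-direction Gaussian proposal:
    mean theta + step * F^{-1} grad(theta), covariance 2 * step * F^{-1}.
    `log_post` and `grad` evaluate the log-posterior and its gradient;
    F is a fixed curvature matrix (e.g. Fisher information evaluated once)."""
    Finv = np.linalg.inv(F)
    L = np.linalg.cholesky(2.0 * step * Finv)

    def log_q(to, frm):
        # Gaussian proposal log-density, up to an additive constant.
        d = to - (frm + step * Finv @ grad(frm))
        return -0.25 / step * d @ F @ d

    cand = theta + step * Finv @ grad(theta) + L @ rng.standard_normal(theta.size)
    log_alpha = (log_post(cand) - log_post(theta)
                 + log_q(theta, cand) - log_q(cand, theta))
    return cand if np.log(rng.uniform()) < log_alpha else theta
```

    The asymmetric-proposal correction log_q(theta, cand) - log_q(cand, theta) is what keeps the chain exact despite the drift toward the Newton direction.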