
    A Time-Evolving 3D Method Dedicated to the Reconstruction of Solar plumes and Results Using Extreme Ultra-Violet Data

    An important issue in the tomographic reconstruction of the solar poles is the relatively rapid evolution of the polar plumes. We demonstrate that it is possible to take this temporal evolution into account in the reconstruction. The difficulty of the problem is that we seek a 4D reconstruction (three spatial dimensions plus time) while we only have 3D data (2D images plus time). To overcome this difficulty, we introduce a model that describes polar plumes as stationary objects whose intensity varies homogeneously with time. This assumption is physically justified if one accepts the stability of the magnetic structure. The model leads to a bilinear inverse problem, and we describe how to extend linear inversion methods to this kind of problem. Simulation studies show the reliability of our method. Results on SOHO/EIT data show that we are able to estimate the temporal evolution of polar plumes and thereby improve the reconstruction of the solar poles from only one point of view. We expect further improvements from STEREO/EUVI data once the two probes are separated by about 60 degrees.
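    The bilinear structure mentioned above lends itself to an alternating scheme: fixing the time gains makes the problem linear in the spatial structure, and vice versa. The following is only an illustrative sketch under assumed notation (random projectors `A[t]`, gain `g`, stationary structure `f`), not the authors' actual reconstruction code.

```python
import numpy as np

# Toy bilinear model: emission e(r, t) = g(t) * f(r), so data y_t = g(t) * A_t f.
# A, f_true, g_true are illustrative stand-ins, not the paper's operators.
rng = np.random.default_rng(0)
n_vox, n_pix, n_times = 50, 30, 8
A = rng.standard_normal((n_times, n_pix, n_vox))  # one projector per time step
f_true = rng.random(n_vox)                        # stationary plume structure
g_true = 1.0 + 0.5 * np.sin(np.arange(n_times))   # homogeneous intensity gain
y = np.stack([g_true[t] * A[t] @ f_true for t in range(n_times)])

f, g, lam = np.ones(n_vox), np.ones(n_times), 1e-3
for _ in range(50):
    # f-step: with g fixed, stack g[t] * A_t into one regularized linear system
    M = np.concatenate([g[t] * A[t] for t in range(n_times)])
    f = np.linalg.solve(M.T @ M + lam * np.eye(n_vox), M.T @ y.ravel())
    # g-step: with f fixed, each g[t] is a scalar least-squares fit
    for t in range(n_times):
        p = A[t] @ f
        g[t] = p @ y[t] / (p @ p + lam)
    # remove the scale ambiguity g * f = (g / c) * (c * f)
    c = np.linalg.norm(g)
    g, f = g / c, f * c
```

    Each half-step is an ordinary regularized linear inversion, which is the sense in which linear inversion methods extend to the bilinear problem.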

    A measure-theoretic variational Bayesian algorithm for large dimensional problems

    In this paper we provide an algorithm that solves the variational Bayesian problem as a functional optimization problem. The main contribution is to transpose a classical iterative optimization algorithm into the metric space of probability densities involved in the Bayesian methodology. The main advantage of this approach is that it allows large-dimensional inverse problems to be addressed by unsupervised algorithms. The interest of our algorithm is enhanced by its application to large-dimensional linear inverse problems involving sparse objects. Finally, we provide simulation results. We first show the good numerical performance of our method by comparing it with classical ones on a small tomographic problem; we then treat a large-dimensional dictionary-learning problem and compare our method with a wavelet-based one.

    A gradient-like variational Bayesian algorithm

    In this paper we provide a new algorithm that solves a variational Bayesian problem cast as a functional optimization problem. The main contribution is to transpose a classical iterative optimization algorithm into the metric space of probability densities involved in the Bayesian methodology. Another important part is the application of our algorithm to a class of linear inverse problems where the estimated quantities are assumed to be sparse. Finally, we compare the performance of our method with classical ones on a tomographic problem. Preliminary results on a small-dimensional example show that our new algorithm is faster than the classical approaches for the same quality of reconstruction.
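    For context, the classical alternating VBA that such gradient-like schemes aim to accelerate can be sketched on the textbook Gaussian model with unknown mean and precision and a factorized posterior q(mu, tau) = q(mu) q(tau). The priors and hyperparameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Data: y_i ~ N(mu, 1/tau); priors mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0).
rng = np.random.default_rng(3)
y = rng.normal(2.0, 0.5, size=200)
n, ybar = y.size, y.mean()
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3   # vague priors (illustrative)

E_tau = 1.0
for _ in range(100):
    # q(mu) = N(m, var_mu) with precision (lam0 + n) * E[tau]
    lam_n = lam0 + n
    m = (lam0 * mu0 + n * ybar) / lam_n
    var_mu = 1.0 / (lam_n * E_tau)
    # q(tau) = Gamma(a_n, b_n), using expectations under q(mu)
    a_n = a0 + (n + 1) / 2
    E_sq = np.sum((y - m) ** 2) + n * var_mu
    b_n = b0 + 0.5 * (E_sq + lam0 * ((m - mu0) ** 2 + var_mu))
    E_tau = a_n / b_n
```

    Each sweep updates one factor with the other held fixed; it is this coordinate-wise character that makes the classical scheme slow in large dimensions.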

    Efficient Variational Bayesian Approximation Method Based on Subspace optimization

    Variational Bayesian approximations have been widely used in fully Bayesian inference for approximating an intractable posterior distribution by a separable one. Nevertheless, the classical variational Bayesian approximation (VBA) method suffers from slow convergence to the approximate solution when tackling large-dimensional problems. To address this problem, we propose an improved VBA method. The variational Bayesian problem can indeed be seen as a convex functional optimization problem. The proposed method is based on adapting subspace optimization methods in Hilbert spaces to the function space involved, in order to solve this optimization problem iteratively. The aim is to determine an optimal direction at each iteration so as to obtain a more efficient method. We highlight the efficiency of our new VBA method and its application to image processing by considering an ill-posed linear inverse problem using a total variation prior. Comparisons with state-of-the-art variational Bayesian methods on a numerical example show a notable improvement in computation time.
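    The flavor of subspace optimization is easy to sketch in finite dimensions: each iteration minimizes the criterion exactly over a small subspace spanned by the current gradient and the previous step (a "memory gradient" subspace). The quadratic criterion and all names below are illustrative assumptions; the paper works in a function space of probability densities rather than R^n.

```python
import numpy as np

# Illustrative sketch: minimize J(x) = 0.5 x'Ax - b'x by exact minimization
# over span{gradient, previous step} at each iteration.
rng = np.random.default_rng(2)
n = 40
Q = rng.standard_normal((n, n))
A = Q.T @ Q + np.eye(n)            # symmetric positive definite Hessian
b = rng.standard_normal(n)

x, prev_step = np.zeros(n), None
for _ in range(60):
    g = A @ x - b                  # gradient of J at x
    if np.linalg.norm(g) < 1e-9 * np.linalg.norm(b):
        break                      # converged
    dirs = [g] if prev_step is None else [g, prev_step]
    D = np.stack(dirs, axis=1)     # n x k basis of the search subspace
    # minimize J(x + D s) over the coefficients s: (D'AD) s = -D'g
    s = np.linalg.lstsq(D.T @ A @ D, -D.T @ g, rcond=None)[0]
    step = D @ s
    x = x + step
    prev_step = step

x_star = np.linalg.solve(A, b)     # direct solution, for reference
```

    On a quadratic criterion this two-direction scheme behaves like conjugate gradient; the point of the subspace view is that the per-iteration problem stays small even when x is very high-dimensional.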

    Bayesian estimation of regularization and point spread function parameters for Wiener-Hunt deconvolution

    This paper tackles the problem of image deconvolution with joint estimation of point spread function (PSF) parameters and hyperparameters. Within a Bayesian framework, the solution is inferred via a global a posteriori law for the unknown parameters and the object. The estimate is chosen as the posterior mean, numerically calculated by means of a Markov chain Monte Carlo algorithm. The estimates are efficiently computed in the Fourier domain, and the effectiveness of the method is shown on simulated examples. Results show precise estimates for the PSF parameters and hyperparameters, as well as precise image estimates including restoration of high frequencies and spatial details, within a global and coherent approach.
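    The Fourier-domain computation at the core of Wiener-Hunt restoration can be sketched as follows. With a circulant blur H and a circulant regularization operator D, the penalized least-squares estimate is computed pointwise in frequency. The function name, the box PSF, and the tiny regularization weight (noiseless data) are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def wiener_hunt(y, h, d, mu):
    """Restore image y given PSF h and regularization kernel d (circulant model):
    x_hat = F^-1[ conj(H) Y / (|H|^2 + mu |D|^2) ]."""
    H = np.fft.fft2(h, s=y.shape)
    D = np.fft.fft2(d, s=y.shape)
    Y = np.fft.fft2(y)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + mu * np.abs(D) ** 2)
    return np.real(np.fft.ifft2(X))

# usage: blur a random test image with a circular 3x3 box PSF, then restore it
rng = np.random.default_rng(1)
x = rng.random((64, 64))
h = np.zeros((64, 64))
h[:3, :3] = 1 / 9                                  # 3x3 box blur kernel
y = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(h)))
d = np.zeros((64, 64))                             # discrete Laplacian kernel
d[0, 0] = -4
d[0, 1] = d[1, 0] = d[0, -1] = d[-1, 0] = 1
x_hat = wiener_hunt(y, h, d, 1e-10)                # mu tiny: noiseless data
```

    In the paper the weight mu and the PSF parameters are themselves estimated within the posterior; here they are simply fixed to show why the per-frequency division makes the computation fast.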

    Data Inversion for Over-Resolved Spectral Imaging in Astronomy

    We present an original method for reconstructing a 3-D object having two spatial dimensions and one spectral dimension from data provided by the infrared slit spectrograph on board the Spitzer Space Telescope. During acquisition, the light flux is deformed by a complex process comprising four main elements (the telescope aperture, the slit, the diffraction grating, and optical distortion) before it reaches the 2-D sensor. The originality of this work lies in the physical modeling, in integral form, of this data-formation process in continuous variables. The inversion is also approached with continuous variables, in a semi-parametric form that decomposes the object into a family of Gaussian functions. The estimate is built in a deterministic regularization framework as the minimizer of a quadratic criterion. These specificities give our method the power to over-resolve. Its performance is illustrated using real and simulated data. We also present a study of the resolution showing a 1.5-fold improvement relative to conventional methods.

    Inversion de données pour l'imagerie spectrale sur-résolue en astronomie

    We address the inversion of infrared data from the IRS spectrometer aboard the Spitzer satellite. The obstacles encountered are the complexity of the instrument and an undersampling phenomenon. Using a faithful and fast instrument model together with the redundancy of the data, we have developed an original method for estimating an over-resolved sky. It relies on inverting the data by minimizing a quadratic criterion with a descent algorithm. The first results show a significant gain in resolution.

    Variational Bayesian approximation with scale mixture prior: a comparison between three algorithms

    Our aim is to solve a linear inverse problem using various methods based on the Variational Bayesian Approximation (VBA). We take sparsity into account via a scale mixture prior, more precisely a Student-t model. The joint posterior of the unknowns and the hidden variables of the mixture is approximated via the VBA. Classically, this approximation is computed with the alternating algorithm, but that method is not the most efficient. Other optimization algorithms have recently been proposed: classical iterative optimization algorithms such as steepest descent and conjugate gradient have been studied in the space of the probability densities involved in the Bayesian methodology. The main purpose of this work is to present these three algorithms and a numerical comparison of their performance.

    Algorithme rapide de reconstruction tomographique basé sur la compression des calculs par ondelettes

    The introduction of new 3D tomography systems based on multi-row or flat-panel detectors increases the amount of data to be processed. Moreover, for some medical applications the reconstruction time must be reduced (3D tomofluoroscopy). We have therefore developed a new 3D reconstruction algorithm based on computation compression. The main idea is to adapt wavelet-based compression techniques to tomographic reconstruction. To do so, we compute an indirect wavelet transform of the image f through the decomposition of its projections (its Radon transform) Rf. The approach is hierarchical: we first reconstruct the wavelet coefficients at coarse scales, and from these we predict the significant coefficients at finer scales. This prediction uses the "zerotree" structure introduced by J. Shapiro for data compression. Our approach therefore backprojects only the coefficients that carry relevant information, and reconstructs a tomographic volume (32 x 512 x 512) two to five times faster than a classical FBP (Filtered Back Projection) approach.
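    The compression the method exploits rests on a simple fact: for piecewise-smooth images, most wavelet detail coefficients are negligible, so only a small fraction needs to be backprojected. The single-level Haar transform and the square phantom below are an assumed toy setup, not the paper's reconstruction chain.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar transform (image sides must be even)."""
    a = (img[0::2] + img[1::2]) / 2   # vertical average
    d = (img[0::2] - img[1::2]) / 2   # vertical detail
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2, (a[:, 0::2] - a[:, 1::2]) / 2
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2, (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

# piecewise-constant phantom: detail coefficients are nonzero only near edges
img = np.zeros((64, 64))
img[15:47, 15:47] = 1.0
ll, lh, hl, hh = haar2d(img)
details = np.concatenate([lh.ravel(), hl.ravel(), hh.ravel()])
significant = np.abs(details) > 1e-3   # coefficients worth backprojecting
```

    Here only a few percent of the detail coefficients are significant; the zerotree structure pushes this further by predicting, from a coarse-scale coefficient, which fine-scale descendants can be skipped entirely.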