
    Time and spectral domain relative entropy: A new approach to multivariate spectral estimation

    The concept of spectral relative entropy rate is introduced for jointly stationary Gaussian processes. Using classical information-theoretic results, we establish a remarkable connection between time- and spectral-domain relative entropy rates. This naturally leads to a new spectral estimation technique in which a multivariate version of the Itakura-Saito distance is employed. It may be viewed as an extension of the approach called THREE, introduced by Byrnes, Georgiou and Lindquist in 2000, which, in turn, followed in the footsteps of the Burg-Jaynes maximum entropy method. Spectral estimation is here recast as a constrained spectrum approximation problem in which the distance is equal to the processes' relative entropy rate. The corresponding solution entails a complexity upper bound that improves on the one so far available in the multichannel framework; indeed, it is equal to the one featured by THREE in the scalar case. The solution is computed via a globally convergent matricial Newton-type algorithm. Simulations suggest the effectiveness of the new technique in tackling multivariate spectral estimation tasks, especially in the case of short data records. Comment: 32 pages, submitted for publication
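    The multivariate Itakura-Saito distance at the heart of this technique compares two spectral density matrices frequency by frequency. A minimal numerical sketch (the function name and the uniform frequency grid are illustrative, not from the paper):

```python
import numpy as np

def is_divergence(phi1, phi2):
    # phi1, phi2: (n_freq, m, m) Hermitian positive-definite spectral
    # density matrices sampled on a uniform frequency grid.
    # Discretization of the integral of
    #   tr(phi1 phi2^{-1}) - log det(phi1 phi2^{-1}) - m
    # over the grid; it is zero iff phi1 == phi2 at every frequency.
    n, m, _ = phi1.shape
    total = 0.0
    for k in range(n):
        ratio = phi1[k] @ np.linalg.inv(phi2[k])
        _, logdet = np.linalg.slogdet(ratio)
        total += np.trace(ratio).real - logdet - m
    return total / n
```

Like the scalar Itakura-Saito distance, this quantity is nonnegative but not symmetric in its arguments.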

    Likelihood Analysis of Power Spectra and Generalized Moment Problems

    We develop an approach to spectral estimation that has been advocated by Ferrante, Masiero and Pavon and, in the context of the scalar-valued covariance extension problem, by Enqvist and Karlsson. The aim is to determine the power spectrum that is consistent with given moments and minimizes the relative entropy between the probability law of the underlying Gaussian stochastic process and that of a prior. The approach is analogous to the framework of earlier work by Byrnes, Georgiou and Lindquist and can also be viewed as a generalization of the classical work by Burg and Jaynes on the maximum entropy method. In the present paper we present a new fast algorithm in the general case (i.e., for general Gaussian priors) and show that for priors with a specific structure the solution can be given in closed form. Comment: 17 pages, 4 figures
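    For a flavor of the classical Burg/Jaynes solution that this work generalizes: with a white-noise prior, the maximum-entropy spectrum matching a finite set of covariance lags is an all-pole (autoregressive) spectrum, computable in closed form by the Levinson-Durbin recursion. A scalar sketch (names illustrative; `c` must be a valid covariance sequence):

```python
import numpy as np

def maxent_spectrum(c, freqs):
    # c: covariance lags c[0..p] of a scalar stationary process.
    # Levinson-Durbin recursion for the AR(p) coefficients whose
    # spectrum matches the given lags exactly -- the closed-form
    # maximum-entropy solution for a white-noise prior.
    p = len(c) - 1
    a = np.zeros(p + 1)
    a[0] = 1.0
    err = c[0]
    for k in range(1, p + 1):
        lam = -np.dot(a[:k], c[k:0:-1]) / err
        a[:k + 1] += lam * a[:k + 1][::-1]
        err *= 1.0 - lam ** 2
    # evaluate S(f) = err / |A(e^{-2 pi i f})|^2 on the requested grid
    A = np.array([np.sum(a * np.exp(-2j * np.pi * f * np.arange(p + 1)))
                  for f in freqs])
    return err / np.abs(A) ** 2
```

For white noise (all lags beyond lag zero vanishing) the recursion returns a flat spectrum, as it should.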

    Component separation methods for the Planck mission

    The Planck satellite will map the full sky at nine frequencies from 30 to 857 GHz. The CMB intensity and polarization that are its prime targets are contaminated by foreground emission. The goal of this paper is to compare proposed methods for separating the CMB from foregrounds based on their different spectral and spatial characteristics, and to separate the foregrounds into components of different physical origin. A component separation challenge has been organized, based on a set of realistically complex simulations of sky emission. Several methods, including those based on internal template subtraction, the maximum entropy method, parametric fitting, spatial and harmonic cross-correlation, and independent component analysis, have been tested. Different methods proved to be effective in cleaning the CMB maps of foreground contamination, in reconstructing maps of diffuse Galactic emission, and in detecting point sources and thermal Sunyaev-Zeldovich signals. The power spectrum of the residuals is, on the largest scales, four orders of magnitude lower than that of the input Galaxy power spectrum at the foreground minimum. The CMB power spectrum was accurately recovered up to the sixth acoustic peak. The point source detection limit reaches 100 mJy, and about 2300 clusters are detected via the thermal SZ effect on two thirds of the sky. We have found that no single method performs best for all scientific objectives. We foresee that the final component separation pipeline for Planck will involve a combination of methods and iterations between processing steps targeted at different objectives such as diffuse component separation, spectral estimation and compact source extraction. Comment: Matches version accepted by A&A. A version with high resolution figures is available at http://people.sissa.it/~leach/compsepcomp.pd
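    As a concrete taste of one of the simpler method families compared in such challenges, an internal linear combination (ILC) chooses channel weights that minimize the variance of the combined map while preserving the CMB, which enters every channel with unit response in thermodynamic units. A minimal sketch (illustrative, not any specific challenge pipeline):

```python
import numpy as np

def ilc_weights(maps):
    # maps: (n_channels, n_pix) sky maps in thermodynamic units, so the
    # CMB enters every channel with unit response.  Minimize the
    # variance of w @ maps subject to w.sum() == 1; the Lagrange
    # solution is C^{-1} e / (e^T C^{-1} e) with C the channel covariance.
    C = np.cov(maps)
    Cinv = np.linalg.inv(C)
    e = np.ones(maps.shape[0])
    return Cinv @ e / (e @ Cinv @ e)
```

Because the weights sum to one, the CMB passes through unchanged while correlated foreground power is suppressed.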

    Estimating the Spectrum in Computed Tomography Via Kullback–Leibler Divergence Constrained Optimization

    Purpose: We study the problem of spectrum estimation from transmission data of a known phantom. The goal is to reconstruct an x-ray spectrum that can accurately model the x-ray transmission curves and reflects a realistic shape of the typical energy spectra of the CT system. Methods: Spectrum estimation is posed as an optimization problem with the x-ray spectrum as the unknown variable, and a Kullback-Leibler (KL) divergence constraint is employed to incorporate prior knowledge of the spectrum and enhance the numerical stability of the estimation process. The formulated constrained optimization problem is convex and can be solved efficiently by use of the exponentiated-gradient (EG) algorithm. We demonstrate the effectiveness of the proposed approach on simulated and experimental data. A comparison to the expectation-maximization (EM) method is also discussed. Results: In simulations, the proposed algorithm is seen to yield x-ray spectra that closely match the ground truth and represent the attenuation process of x-ray photons in materials both included and not included in the estimation process. In experiments, the calculated transmission curve is in good agreement with the measured transmission curve, and the estimated spectra exhibit physically realistic shapes. The results further show comparable performance between the proposed optimization-based approach and EM. Conclusions: Our formulation as a constrained optimization provides an interpretable and flexible framework for spectrum estimation. Moreover, a KL-divergence constraint can include a prior spectrum and appears to capture important features of the x-ray spectrum, allowing accurate and robust estimation of the x-ray spectrum in CT imaging.
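    The exponentiated-gradient step is a multiplicative update that automatically keeps the spectrum nonnegative and normalized. A bare-bones sketch of the EG iteration for a least-squares data term (the paper's KL-divergence constraint on a prior spectrum is omitted; all names and parameter values are illustrative):

```python
import numpy as np

def eg_spectrum(A, t, s0, eta=0.1, iters=3000):
    # A[j, i] = exp(-mu_i * L_j): attenuation of energy bin i through
    # phantom thickness L_j; t: measured transmissions; s0: starting
    # spectrum on the probability simplex.  The multiplicative step
    # keeps s > 0 and the renormalization keeps s.sum() == 1.
    s = s0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ s - t)      # gradient of 0.5 * ||A s - t||^2
        s = s * np.exp(-eta * grad)
        s /= s.sum()
    return s
```

In the paper's full formulation, the update would additionally account for the KL-divergence ball around the prior spectrum.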

    Performance Study of the Robust Bayesian Regularization Technique for Remote Sensing Imaging in Geophysical Applications

    In this paper, a performance study of a methodology for the reconstruction of high-resolution remote sensing imagery is presented. The method is the robust version of the Bayesian regularization (BR) technique, referred to as RBR, which performs image reconstruction as a solution of the ill-conditioned inverse spatial spectrum pattern (SSP) estimation problem with model uncertainties, unifying the Bayesian minimum risk (BMR) estimation strategy with the maximum entropy (ME) randomized a priori image model and other projection-type regularization constraints imposed on the solution. The results of an extended comparative simulation study of a family of image formation/enhancement algorithms that employ the RBR method for high-resolution reconstruction of the SSP are presented. Moreover, the computational complexity of the different methods is analyzed and reported together with the scene imaging protocols. The advantages of remote sensing imaging experiments that employ the RBR-based estimator over more poorly designed experiments (those employing conventional matched spatial filtering or least squares techniques) are verified through the simulation study. Finally, the application of this estimator in geophysical applications of remote sensing imagery is described. Universidad de Guadalajara
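    To see why regularization matters for an ill-conditioned SSP inversion, consider a toy comparison of a plain least-squares solve against a Tikhonov-regularized one (a simple quadratic stand-in for the projection-type constraints discussed above; all names and values are illustrative):

```python
import numpy as np

def regularized_ssp(A, y, alpha):
    # Tikhonov-regularized inversion:
    # minimize ||A x - y||^2 + alpha * ||x||^2 via the normal equations.
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ y)
```

On a nearly singular system, a small measurement perturbation blows up the unregularized solution, while the regularized estimate stays close to the truth.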

    A globally convergent matricial algorithm for multivariate spectral estimation

    In this paper, we first describe a matricial Newton-type algorithm designed to solve the multivariable spectrum approximation problem. We then prove its global convergence. Finally, we apply this approximation procedure to multivariate spectral estimation and test its effectiveness through simulation. Simulations show that, in the case of short observation records, this method may provide a valid alternative to standard multivariable identification techniques such as MATLAB's PEM and N4SID.
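    The standard device for making a Newton-type iteration globally convergent is to damp the step with a backtracking line search. The paper's algorithm applies this idea to a matricial equation, but the pattern is the same as in the generic vector sketch below (illustrative, not the paper's algorithm):

```python
import numpy as np

def damped_newton(grad, hess, x0, tol=1e-10, max_iter=100):
    # Solve grad(x) = 0 by Newton steps, halving the step until the
    # residual norm shows sufficient decrease (a simple Armijo-type
    # backtracking rule); the damping is what buys global convergence,
    # while full steps near the solution preserve quadratic convergence.
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(hess(x), -g)
        t = 1.0
        while np.linalg.norm(grad(x + t * step)) > (1.0 - 0.25 * t) * np.linalg.norm(g):
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * step
    return x
```

For example, applied to the gradient and Hessian of a smooth strictly convex function, the iteration converges from any starting point.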

    MaxEnt power spectrum estimation using the Fourier transform for irregularly sampled data applied to a record of stellar luminosity

    The principle of maximum entropy is applied to the spectral analysis of a data signal with a general variance matrix and containing gaps in the record. The role of the entropic regularizer is to prevent one from overestimating structure in the spectrum when faced with imperfect data. Several arguments are presented suggesting that an arbitrary prefactor should not be introduced to the entropy term; the introduction of such a factor is not required when a continuous Poisson distribution is used for the amplitude coefficients. We compare the formalism for the case in which the variance of the data is known explicitly to that in which the variance is known only to lie in some finite range. The result of including the entropic measure factor is a spectrum, consistent with the variance of the data, that has less structure than that given by the forward transform. An application of the methodology to example data is demonstrated. Comment: 15 pages, 13 figures, 1 table, major revision, final version, accepted for publication in Astrophysics & Space Science
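    The "forward transform" that the entropic prior is meant to smooth is just the direct Fourier sum evaluated at the irregular sample times. A minimal version with uniform weights (illustrative names; no regularization):

```python
import numpy as np

def dft_power(t, y, freqs):
    # Direct Fourier power of an irregularly sampled, mean-subtracted
    # record; t, y are sample times and values, freqs are trial
    # frequencies in cycles per unit time.  With gaps in the record
    # this raw estimate shows leakage that the entropic prior suppresses.
    y = y - y.mean()
    F = np.exp(-2j * np.pi * np.outer(freqs, t))
    return np.abs(F @ y) ** 2 / len(t)
```

Even on an irregular time grid, a clean sinusoid produces a clear peak at its true frequency.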