
    Extracting HI cosmological signal with Generalized Needlet Internal Linear Combination

    HI intensity mapping is a new observational technique to map fluctuations in the large-scale structure of matter using the 21 cm emission line of atomic hydrogen (HI). Sensitive radio surveys have the potential to detect Baryon Acoustic Oscillations (BAO) at low redshifts (z < 1) in order to constrain the properties of dark energy. Observations of the HI signal will be contaminated by instrumental noise and, more significantly, by astrophysical foregrounds, such as Galactic synchrotron emission, which is at least four orders of magnitude brighter than the HI signal. Foreground cleaning is recognised as one of the key challenges for future radio astronomy surveys. We study the ability of the Generalized Needlet Internal Linear Combination (GNILC) method to subtract radio foregrounds and to recover the cosmological HI signal for a general HI intensity mapping experiment. The GNILC method is a new technique that uses both frequency and spatial information to separate the components of the observed data. Our results show that the method is robust to the complexity of the foregrounds. For simulated radio observations including HI emission, Galactic synchrotron, Galactic free-free, radio sources and 0.05 mK thermal noise, we find that we can reconstruct the HI power spectrum for multipoles 30 < l < 150 with 6% accuracy on 50% of the sky for a redshift z ~ 0.25. Comment: 20 pages, 13 figures. Updated to match version accepted by MNRAS.
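    For background, the core of any ILC-style method (including GNILC, which applies it per needlet scale and locally on the sky) is a set of frequency weights that minimise the variance of the combined map while preserving the targeted component. A minimal numerical sketch, assuming maps in brightness-temperature units so the HI response is identical in every channel; the array shapes and data below are hypothetical:

```python
import numpy as np

def ilc_weights(maps, mixing_vector):
    """Standard ILC weights for a stack of frequency maps.

    maps          : array of shape (n_freq, n_pix)
    mixing_vector : response of the wanted component in each channel
                    (all ones for a signal expressed in brightness temperature)

    Minimises the variance of w @ maps subject to w @ mixing_vector = 1.
    """
    a = np.asarray(mixing_vector, dtype=float)
    cov = np.cov(maps)                    # (n_freq, n_freq) empirical covariance
    cov_inv_a = np.linalg.solve(cov, a)   # C^{-1} a without forming the explicit inverse
    return cov_inv_a / (a @ cov_inv_a)

# Hypothetical usage on simulated channel maps.
rng = np.random.default_rng(0)
maps = rng.standard_normal((8, 4096))     # 8 frequency channels, 4096 pixels
w = ilc_weights(maps, np.ones(8))
hi_estimate = w @ maps                    # variance-minimising combination
```

    Solving for C^{-1}a with np.linalg.solve avoids inverting the channel covariance explicitly, which is the numerically safer choice when channels are strongly correlated.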

    Foreground component separation with generalised ILC

    The 'Internal Linear Combination' (ILC) component separation method has been extensively used to extract a single component, the CMB, from the WMAP multifrequency data. We generalise the ILC approach for separating other millimetre astrophysical emissions. We construct in particular a multidimensional ILC filter, which can be used, for instance, to estimate the diffuse emission of a complex component originating from multiple correlated emissions, such as the total emission of the Galactic interstellar medium. The performance of such generalised ILC methods, implemented on a needlet frame, is tested on simulations of Planck mission observations, for which we successfully reconstruct a low noise estimate of emission from astrophysical foregrounds with vanishing CMB and SZ contamination. Comment: 11 pages, 6 figures (2 figures added), 1 reference added, introduction expanded, V2: version accepted by MNRAS.
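    For reference, the multidimensional generalisation described here replaces the single mixing vector of the standard ILC by a mixing matrix A spanning the targeted emission subspace. A sketch of the usual weighted-least-squares form, with x the vector of observed maps and C their frequency-frequency covariance:

```latex
\hat{\mathbf{s}} \;=\; \mathbf{W}\,\mathbf{x},
\qquad
\mathbf{W} \;=\; \mathbf{A}\,\bigl(\mathbf{A}^{\mathsf{T}}\mathbf{C}^{-1}\mathbf{A}\bigr)^{-1}\mathbf{A}^{\mathsf{T}}\mathbf{C}^{-1}
```

    The filter satisfies W A = A, so the targeted multidimensional component passes through unchanged while the variance of everything else is minimised; the one-dimensional ILC is recovered when A reduces to a single column.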

    Construction of Hilbert Transform Pairs of Wavelet Bases and Gabor-like Transforms

    We propose a novel method for constructing Hilbert transform (HT) pairs of wavelet bases based on a fundamental approximation-theoretic characterization of scaling functions--the B-spline factorization theorem. In particular, starting from well-localized scaling functions, we construct HT pairs of biorthogonal wavelet bases of L^2(R) by relating the corresponding wavelet filters via a discrete form of the continuous HT filter. As a concrete application of this methodology, we identify HT pairs of spline wavelets of a specific flavor, which are then combined to realize a family of complex wavelets that resemble the optimally-localized Gabor function for sufficiently large orders. Analytic wavelets, derived from the complexification of HT wavelet pairs, exhibit a one-sided spectrum. Based on the tensor-product of such analytic wavelets, and, in effect, by appropriately combining four separable biorthogonal wavelet bases of L^2(R^2), we then discuss a methodology for constructing 2D directional-selective complex wavelets. In particular, analogous to the HT correspondence between the components of the 1D counterpart, we relate the real and imaginary components of these complex wavelets using a multi-dimensional extension of the HT--the directional HT. Next, we construct a family of complex spline wavelets that resemble the directional Gabor functions proposed by Daugman. Finally, we present an efficient FFT-based filterbank algorithm for implementing the associated complex wavelet transform. Comment: 36 pages, 8 figures.
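    The one-sided-spectrum property mentioned above is easy to check numerically: complexifying a real function with its Hilbert transform suppresses the negative-frequency half of its spectrum. A small sketch using a Gabor-like stand-in rather than the paper's spline wavelets:

```python
import numpy as np
from scipy.signal import hilbert

# Stand-in real "wavelet": a Gaussian-windowed cosine, not the paper's splines.
t = np.linspace(-8, 8, 1024)
psi = np.exp(-t**2 / 2) * np.cos(5 * t)

# Complexification psi + i*H{psi}; scipy's hilbert() returns exactly this analytic signal.
psi_analytic = hilbert(psi)

spectrum = np.fft.fft(psi_analytic)
positive = np.abs(spectrum[1:512])        # positive-frequency bins
negative = np.abs(spectrum[513:])         # negative-frequency bins
print(negative.max() / positive.max())    # close to zero: one-sided spectrum
```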

    Data-driven time-frequency analysis of multivariate data

    Empirical Mode Decomposition (EMD) is a data-driven method for the decomposition and time-frequency analysis of real world nonstationary signals. Its main advantages over other time-frequency methods are its locality, data-driven nature, multiresolution-based decomposition, higher time-frequency resolution and its ability to capture oscillation of any type (nonharmonic signals). These properties have made EMD a viable tool for real world nonstationary data analysis. Recent advances in sensor and data acquisition technologies have brought to light new classes of signals containing typically several data channels. Currently, such signals are almost invariably processed channel-wise, which is suboptimal. It is, therefore, imperative to design multivariate extensions of the existing nonlinear and nonstationary analysis algorithms as they are expected to give more insight into the dynamics and the interdependence between multiple channels of such signals. To this end, this thesis presents multivariate extensions of the empirical mode decomposition algorithm and illustrates their advantages with regards to multivariate nonstationary data analysis. Some important properties of such extensions are also explored, including their ability to exhibit wavelet-like dyadic filter bank structures for white Gaussian noise (WGN), and their capacity to align similar oscillatory modes from multiple data channels. Owing to the generality of the proposed methods, an improved multivariate EMD-based algorithm is introduced which solves some inherent problems in the original EMD algorithm. Finally, to demonstrate the potential of the proposed methods, simulations on the fusion of multiple real world signals (wind, images and inertial body motion data) support the analysis.
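    As a point of reference for the discussion above, the univariate EMD extracts intrinsic mode functions by repeatedly "sifting": interpolating the local maxima and minima with splines and subtracting the mean envelope. The following is a deliberately bare-bones sketch of that core loop (fixed sifting count, no stopping criterion); the multivariate extensions studied in the thesis instead project the signal along multiple directions on a hypersphere and average the resulting envelopes:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(x, t):
    """One sifting pass: subtract the mean of the upper and lower envelopes."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 3 or len(minima) < 3:
        return None                                  # too few extrema: x is the residue
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - (upper + lower) / 2.0

def emd(x, t, n_sift=10, max_imfs=6):
    """Decompose x into intrinsic mode functions (IMFs) plus a residue."""
    imfs, residue = [], x.astype(float).copy()
    for _ in range(max_imfs):
        candidate = residue.copy()
        for _ in range(n_sift):                      # fixed sifting count for simplicity
            refined = sift_once(candidate, t)
            if refined is None:
                return imfs, residue
            candidate = refined
        imfs.append(candidate)
        residue = residue - candidate
    return imfs, residue

# Toy two-tone signal: the first IMF should capture the faster oscillation.
t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
imfs, residue = emd(x, t)
```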

    Wavelet/shearlet hybridized neural networks for biomedical image restoration

    Recently, new programming paradigms have emerged that combine parallelism and numerical computations with algorithmic differentiation. This approach allows for the hybridization of neural network techniques for inverse imaging problems with more traditional methods such as wavelet-based sparsity modelling techniques. The benefits are twofold: on the one hand, traditional methods with well-known properties can be integrated into neural networks, either as separate layers or tightly integrated in the network; on the other hand, parameters in traditional methods can be trained end-to-end from datasets in a neural network "fashion" (e.g., using Adagrad or Adam optimizers). In this paper, we explore these hybrid neural networks in the context of shearlet-based regularization for the purpose of biomedical image restoration. Due to the reduced number of parameters, this approach seems a promising strategy, especially when dealing with small training data sets.
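    A minimal illustration of the hybridization idea, assuming a PyTorch-style framework: a classical sparsity operation (plain soft-thresholding here, standing in for the paper's shearlet-based regularization) is wrapped as a differentiable layer so its thresholds become trainable parameters. The class name, band layout and loss below are hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableSoftThreshold(nn.Module):
    """Soft-thresholding with one trainable threshold per coefficient band."""

    def __init__(self, n_bands: int, init: float = 0.1):
        super().__init__()
        # Raw parameter; softplus keeps the effective threshold positive.
        self.raw_threshold = nn.Parameter(torch.full((n_bands, 1), init))

    def forward(self, coeffs: torch.Tensor) -> torch.Tensor:
        # coeffs: (n_bands, n_coeffs) transform coefficients of one image
        thr = F.softplus(self.raw_threshold)
        return torch.sign(coeffs) * torch.clamp(coeffs.abs() - thr, min=0.0)

# Hypothetical training step: the thresholds are updated like any other weight.
layer = LearnableSoftThreshold(n_bands=8)
coeffs = torch.randn(8, 1024)
optimizer = torch.optim.Adam(layer.parameters(), lr=1e-2)
loss = torch.mean((layer(coeffs) - coeffs) ** 2)   # placeholder reconstruction loss
loss.backward()
optimizer.step()
```

    Because such a layer carries only one scalar per band, it adds very few trainable parameters, which is consistent with the small-training-set argument above.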

    The Data Big Bang and the Expanding Digital Universe: High-Dimensional, Complex and Massive Data Sets in an Inflationary Epoch

    Recent and forthcoming advances in instrumentation, and giant new surveys, are creating astronomical data sets that are not amenable to the methods of analysis familiar to astronomers. Traditional methods are often inadequate not merely because of the size in bytes of the data sets, but also because of the complexity of modern data sets. Mathematical limitations of familiar algorithms and techniques in dealing with such data sets create a critical need for new paradigms for the representation, analysis and scientific visualization (as opposed to illustrative visualization) of heterogeneous, multiresolution data across application domains. Some of the problems presented by the new data sets have been addressed by other disciplines such as applied mathematics, statistics and machine learning and have been utilized by other sciences such as space-based geosciences. Unfortunately, valuable results pertaining to these problems are mostly to be found only in publications outside of astronomy. Here we offer brief overviews of a number of concepts, techniques and developments, some "old" and some new. These are generally unknown to most of the astronomical community, but are vital to the analysis and visualization of complex data sets and images. In order for astronomers to take advantage of the richness and complexity of the new era of data, and to be able to identify, adopt, and apply new solutions, the astronomical community needs a certain degree of awareness and understanding of the new concepts. One of the goals of this paper is to help bridge the gap between applied mathematics, artificial intelligence and computer science on the one side and astronomy on the other. Comment: 24 pages, 8 figures, 1 table. Accepted for publication in Advances in Astronomy, special issue "Robotic Astronomy".

    2D-1D Wavelet Reconstruction As A Tool For Source Finding In Spectroscopic Imaging Surveys

    Today, image denoising by thresholding of wavelet coefficients is a commonly used tool for 2D image enhancement. Since the data product of spectroscopic imaging surveys has two spatial and one spectral dimension, the techniques for denoising have to be adapted to this change in dimensionality. In this paper we will review the basic method of denoising data by thresholding wavelet coefficients and implement a 2D-1D wavelet decomposition to obtain an efficient way of denoising spectroscopic data cubes. We conduct different simulations to evaluate the usefulness of the algorithm as part of a source finding pipeline. Comment: 8 pages, 7 figures, 1 table, accepted for publication in PASA Special Issue on Source Finding and Visualization.
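    A rough sketch of the 2D-1D idea using PyWavelets, assuming a cube ordered as (channel, y, x): a 2D transform of the spatial planes, a 1D transform of every coefficient band along the spectral axis, hard thresholding of the detail coefficients, and inversion of both transforms. Wavelet choices, decomposition levels and the k-sigma threshold are placeholders, not the paper's settings:

```python
import numpy as np
import pywt

def denoise_2d1d(cube, spatial_wavelet="db2", spectral_wavelet="haar",
                 k=3.0, sigma=1.0):
    """Denoise a (n_chan, ny, nx) cube with a simple 2D-1D wavelet scheme."""
    n_chan = cube.shape[0]

    def spectral_threshold(band):
        # 1D decomposition along the spectral axis, hard threshold, invert.
        c = pywt.wavedec(band, spectral_wavelet, axis=0)
        c = [c[0]] + [pywt.threshold(d, k * sigma, mode="hard") for d in c[1:]]
        rec = pywt.waverec(c, spectral_wavelet, axis=0)
        return rec[:n_chan]                           # trim possible padding

    # 2D decomposition of every spatial plane (acts on the last two axes).
    coeffs = pywt.wavedec2(cube, spatial_wavelet, level=2, axes=(-2, -1))
    cleaned = [spectral_threshold(coeffs[0])]
    for details in coeffs[1:]:
        cleaned.append(tuple(spectral_threshold(d) for d in details))
    return pywt.waverec2(cleaned, spatial_wavelet, axes=(-2, -1))

# Usage on a hypothetical noisy cube with 64 channels of 128x128 pixels.
rng = np.random.default_rng(1)
noisy = rng.normal(scale=1.0, size=(64, 128, 128))
clean = denoise_2d1d(noisy)
```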

    Bounded PCA based Multi Sensor Image Fusion Employing Curvelet Transform Coefficients

    The fusion of thermal and visible images is an important tool for target detection. Wavelet-based image fusion improves the spectral content of the fused image; however, compared with PCA-based fusion, most wavelet-based methods yield results with lower spatial resolution. Combining the two approaches improves the outcome, but it can still be refined. Compared with wavelets, the curvelet transform represents image edges more accurately, and since edges are crucial for interpreting images, enhancing them is an effective way to improve spatial resolution. A curvelet-based fusion technique can therefore provide additional information in both the spectral and spatial domains simultaneously. In this paper, we employ a combination of the Curvelet Transform and a Bounded PCA (CTBPCA) method to fuse thermal and visible images. To demonstrate the improved performance of our proposed technique, we use multiple evaluation metrics and comparisons with existing image fusion methods. Our approach outperforms the others in both qualitative and quantitative analysis, except for runtime. Future work will use the fused image for target recognition and will focus on further improving and optimizing the method for real-time video processing.
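    The PCA side of such a fusion scheme is simple to sketch: the fusion weights are taken from the leading eigenvector of the 2x2 covariance between the two inputs. In the method above these weights would be applied to curvelet coefficient bands of the thermal and visible images; the stand-in data and function name below are hypothetical:

```python
import numpy as np

def pca_fusion_weights(band_a, band_b):
    """PCA weights for fusing two equally sized coefficient bands (or images)."""
    stacked = np.vstack([band_a.ravel(), band_b.ravel()])
    cov = np.cov(stacked)                        # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    leading = np.abs(eigvecs[:, -1])             # principal eigenvector
    w = leading / leading.sum()                  # normalise so the weights sum to 1
    return w[0], w[1]

# Hypothetical usage with stand-in "thermal" and "visible" images.
rng = np.random.default_rng(2)
thermal = rng.normal(size=(256, 256))
visible = 0.7 * thermal + 0.3 * rng.normal(size=(256, 256))
w_t, w_v = pca_fusion_weights(thermal, visible)
fused = w_t * thermal + w_v * visible
```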