
    Blind source separation using maximum entropy PDF estimation based on fractional moments

    Abstract. Recovering a set of independent sources that have been linearly mixed is the main task of blind source separation. Methods such as the infomax principle, mutual information, and maximum likelihood lead to simple iterative procedures such as natural gradient algorithms. These algorithms depend on a nonlinear function (known as the score or activation function) of the source distributions. Since there is no prior knowledge of the source distributions, the optimality of the algorithms rests on the choice of a suitable parametric density model. In this paper, we propose an adaptive optimal score function based on the fractional moments of the sources. To obtain a parametric model for the source distributions, we use a few sampled fractional moments to construct a maximum entropy probability density function (PDF) estimate. By applying an optimization method, we obtain the optimal fractional moments that best fit the source distributions. Using fractional moments (FM) instead of integer moments makes the maximum entropy PDF estimate converge to the true PDF much faster. The simulation results show that, unlike most previously proposed models for the nonlinear score function, which are limited to certain source families such as sub-Gaussian and super-Gaussian, or to particular source distribution models such as the generalized Gaussian distribution, our new model achieves better results for every source signal without any prior assumption about its randomness behavior.
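The natural gradient update the abstract refers to can be sketched as follows. This is a minimal toy setup, not the paper's fractional-moment score: two Laplacian (super-Gaussian) sources are mixed with a hypothetical mixing matrix and recovered with a fixed tanh score function, where the paper would instead adapt the score to a maximum-entropy density fitted from fractional moments.

```python
import numpy as np

# Natural-gradient ICA sketch:  W <- W + eta * (I - phi(y) y^T / n) W
# The score phi is fixed to tanh here; the paper adapts it via a
# fractional-moment maximum entropy PDF model of the sources.
rng = np.random.default_rng(0)
n = 20000
s = rng.laplace(size=(2, n))                   # independent super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])         # hypothetical mixing matrix
x = A @ s
x -= x.mean(axis=1, keepdims=True)

W = np.eye(2)
eta = 0.1
for _ in range(300):
    y = W @ x
    phi = np.tanh(y)                           # generic score function
    W += eta * (np.eye(2) - phi @ y.T / n) @ W

y = W @ x                                      # recovered up to scale/permutation
```

Since ICA recovers sources only up to scaling and permutation, success is checked by correlating each recovered component against the true sources.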

    Blind image separation based on exponentiated transmuted Weibull distribution

    In recent years, blind image separation has been widely investigated, and a number of feature extraction algorithms for direct application to such image structures have been developed. For example, mixed fingerprints found at a crime scene, where a mixture of two or more fingerprints may be obtained, must be separated before identification. In this paper, we propose a new technique for separating multiple mixed images based on the exponentiated transmuted Weibull distribution. To adaptively estimate the parameters of the corresponding score functions, an efficient method based on maximum likelihood and a genetic algorithm is used. We also evaluate the accuracy of the proposed distribution and compare the algorithmic performance of this approach with previous generalized distributions. The numerical results show that the proposed distribution is flexible and efficient. Comment: 14 pages, 12 figures, 4 tables. International Journal of Computer Science and Information Security (IJCSIS), Vol. 14, No. 3, March 2016 (pp. 423-433).
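The density the abstract builds on follows the standard exponentiated transmuted construction: starting from the Weibull CDF G, the transmuted CDF is F_T = (1 + λ)G − λG², and the exponentiated CDF is F = F_T^a. A sketch of the resulting PDF, with illustrative parameter values rather than the paper's fitted ones:

```python
import numpy as np

def etw_pdf(x, k=2.0, beta=1.0, lam=0.5, a=1.5):
    """Exponentiated transmuted Weibull PDF (illustrative parameters).

    Weibull shape k, scale beta; transmutation lam in [-1, 1];
    exponentiation power a > 0.  The PDF is d/dx of F = F_T**a.
    """
    G = 1.0 - np.exp(-(x / beta) ** k)                    # Weibull CDF
    g = (k / beta) * (x / beta) ** (k - 1) * np.exp(-(x / beta) ** k)
    FT = (1.0 + lam) * G - lam * G ** 2                   # transmuted CDF
    return a * FT ** (a - 1.0) * g * (1.0 + lam - 2.0 * lam * G)

# sanity check: the density should be non-negative and integrate to ~1
xs = np.linspace(1e-9, 12.0, 200001)
fx = etw_pdf(xs)
area = np.sum(0.5 * (fx[1:] + fx[:-1]) * np.diff(xs))     # trapezoid rule
```

Note that F(∞) = ((1 + λ) − λ)^a = 1 for any valid λ and a, so the construction always yields a proper distribution.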

    BMICA-independent component analysis based on B-spline mutual information estimator

    The information-theoretic concept of mutual information provides a general framework for evaluating dependencies between variables. Its estimation using B-splines, however, has not previously been used as the basis of an independent component analysis approach. In this paper, we present a B-spline estimator of mutual information for finding the independent components in mixed signals. Tested on electroencephalography (EEG) signals, the resulting BMICA (B-Spline Mutual Information Independent Component Analysis) exhibits better performance than the standard independent component analysis algorithms FastICA, JADE, SOBI, and EFICA in similar simulations. BMICA was also found to be more reliable than the renowned FastICA.
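The idea behind B-spline mutual information estimation can be sketched with the lowest-order smooth case: instead of hard histogram bins, each sample contributes fractional membership to adjacent bins through B-spline basis functions, which reduces binning artifacts. This simplified order-2 (linear B-spline) version is an illustration, not the paper's BMICA implementation:

```python
import numpy as np

def bspline_weights(x, n_bins):
    """Order-2 (linear) B-spline bin memberships: each sample is split
    fractionally between its two nearest bin centres."""
    x = (x - x.min()) / (x.max() - x.min() + 1e-12)   # rescale to [0, 1]
    pos = x * (n_bins - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n_bins - 1)
    frac = pos - lo
    w = np.zeros((len(x), n_bins))
    w[np.arange(len(x)), lo] += 1.0 - frac
    w[np.arange(len(x)), hi] += frac
    return w

def bspline_mi(x, y, n_bins=10):
    """Mutual information in nats from soft (B-spline) histograms."""
    wx, wy = bspline_weights(x, n_bins), bspline_weights(y, n_bins)
    px, py = wx.mean(axis=0), wy.mean(axis=0)         # soft marginals
    pxy = wx.T @ wy / len(x)                          # soft joint histogram
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask]))
```

For independent variables the estimate should be near zero (up to a small positive finite-sample bias), while a variable paired with itself yields a large value.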

    Simulations for single-dish intensity mapping experiments

    HI intensity mapping is an emerging tool to probe dark energy. Observations of the redshifted HI signal will be contaminated by instrumental noise and by atmospheric and Galactic foregrounds. The latter is expected to be four orders of magnitude brighter than the HI emission we wish to detect. We present a simulation of single-dish observations including an instrumental noise model with 1/f and white noise, and sky emission with a diffuse Galactic foreground and HI emission. We consider two foreground cleaning methods: spectral parametric fitting and principal component analysis. For a smooth frequency spectrum of the foreground and instrumental effects, we find that the parametric fitting method leaves residuals still contaminated by foreground and 1/f noise, whereas principal component analysis can remove this contamination down to the thermal noise level. This method is robust for a range of different foreground and noise models, and so constitutes a promising way to recover the HI signal from the data. However, it induces a leakage of the cosmological signal into the subtracted foreground of around 5%. The efficiency of the component separation methods depends heavily on the smoothness of the frequency spectrum of the foreground and the 1/f noise. We find that, as long as the spectral variations over the band are slow compared to the channel width, the foreground cleaning method still works. Comment: 14 pages, 12 figures. Submitted to MNRAS.
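The principal component cleaning the abstract describes exploits the fact that a spectrally smooth, bright foreground occupies only a few leading frequency eigenmodes. A toy sketch (illustrative numbers, not the paper's simulation): the data cube is flattened to a (frequency × pixel) matrix, the frequency-frequency covariance is diagonalised, and the top modes are projected out.

```python
import numpy as np

rng = np.random.default_rng(1)
n_freq, n_pix = 64, 2000
nu = np.linspace(0.95, 1.05, n_freq)[:, None]          # normalised frequency

# smooth, bright power-law foreground vs. spectrally unsmooth toy signal
foreground = 1e4 * nu ** -2.7 * rng.lognormal(0.0, 1.0, n_pix)
signal = rng.standard_normal((n_freq, n_pix))
data = foreground + signal

# remove the N_fg leading eigenmodes of the frequency-frequency covariance
C = data @ data.T / n_pix
_, vecs = np.linalg.eigh(C)                            # ascending eigenvalues
modes = vecs[:, -2:]                                   # top N_fg = 2 modes
cleaned = data - modes @ (modes.T @ data)              # project modes out
```

The projection also removes whatever fraction of the signal lies in the discarded modes, which is the signal leakage the abstract quantifies at around 5%.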