
    Sparsity and morphological diversity for multivalued data analysis

    The recent development of multi-channel sensors has motivated interest in devising new methods for the coherent processing of multivariate data. Extensive work has already been dedicated to multivariate data processing, ranging from blind source separation (BSS) to multi/hyper-spectral data restoration. Previous work has emphasized the fundamental role played by sparsity and morphological diversity in enhancing multichannel signal processing. GMCA is a recent algorithm for multichannel data analysis which has been used successfully in a variety of applications, including multichannel sparse decomposition, blind source separation (BSS), color image restoration and inpainting. Inspired by GMCA, a recently introduced algorithm coined HypGMCA is described for BSS applications in hyperspectral data processing. It assumes the collected data are a linear instantaneous mixture of components exhibiting sparse spectral signatures as well as sparse spatial morphologies, each in specified dictionaries of spectral and spatial waveforms. We report on numerical experiments with synthetic data and an application to real observations which demonstrate the validity of the proposed method.
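    The alternating structure behind GMCA-type algorithms can be sketched in a few lines: iterate between soft-thresholding the source coefficients in a sparsifying dictionary and refitting the mixing matrix by least squares. The following is a hedged toy illustration, not the authors' implementation; the function name `gmca_sketch`, the fixed threshold, and the assumption that the data are already expressed in the sparsifying dictionary are all simplifications.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry towards zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def gmca_sketch(X, n_sources, n_iter=50, thresh=0.05, seed=0):
    """Toy GMCA-style alternating minimization for X ~ A @ S.

    X is (n_channels, n_samples), assumed already expressed in a
    dictionary where the sources have sparse coefficients.
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], n_sources))
    A /= np.linalg.norm(A, axis=0)
    for _ in range(n_iter):
        # Sparse source update: least squares, then soft-thresholding.
        S = soft_threshold(np.linalg.pinv(A) @ X, thresh)
        # Mixing-matrix update: least squares, columns renormalized
        # to remove the usual scale indeterminacy.
        A = X @ np.linalg.pinv(S)
        A /= np.linalg.norm(A, axis=0) + 1e-12
    # Final source estimate against the last (normalized) mixing matrix.
    S = soft_threshold(np.linalg.pinv(A) @ X, thresh)
    return A, S
```

    In the actual GMCA literature the threshold decreases over iterations, which improves separation; a fixed threshold keeps the sketch short.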

    Blind Source Separation: the Sparsity Revolution

    Over the last few years, the development of multi-channel sensors has motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed, as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have emerged as a novel and effective source of diversity for BSS. We give here some essential insights into the use of sparsity in source separation and outline the essential role of morphological diversity as a source of diversity or contrast between the sources. This paper overviews a sparsity-based BSS method coined Generalized Morphological Component Analysis (GMCA) that takes advantage of both morphological diversity and sparsity, using recent sparse overcomplete or redundant signal representations. GMCA is a fast and efficient blind source separation method. In remote sensing applications, the specificity of hyperspectral data should be accounted for. We extend the proposed GMCA framework to deal with hyperspectral data. In a general framework, GMCA provides a basis for multivariate data analysis for a wide range of classical multivariate data restoration problems. Numerical results are given in color image denoising and inpainting. Finally, GMCA is applied to the simulated ESA/Planck data and is shown to give effective astrophysical component separation.

    Source Separation in Chemical Analysis: Recent Achievements and Perspectives

    Source separation is one of the most relevant estimation problems found in chemistry. Indeed, dealing with mixtures is paramount in different kinds of chemical analysis. For instance, there are some cases where the analyte is a chemical mixture of different components, e.g., in the analysis of rocks and heterogeneous materials through spectroscopy. Moreover, a mixing process can also take place even when the components are not chemically mixed. For instance, in ionic analysis of liquid samples, the ions are not chemically connected, but, due to the lack of selectivity of the chemical sensors, the acquired responses may be influenced by ions other than the desired ones. Finally, there are some situations where the pure components cannot be isolated chemically, since they appear only in the presence of other components. In this case, BSS may provide these components, which cannot be retrieved otherwise. In this paper, our aim is to shed some light on the use of BSS in chemical analysis. In this context, we first provide a brief overview of source separation (Section II), with particular attention to the classes of linear and nonlinear mixing models (Sections III and IV, respectively). Then, in Section V, we give some conclusions and focus on challenging aspects found in chemical analysis. Although it deals with a relatively new field of application, this article is not an exhaustive survey of source separation methods and algorithms, since there are solutions originating in closely related domains (e.g., remote sensing and hyperspectral imaging) that are well suited to several problems found in chemical analysis. Moreover, we do not discuss supervised source separation methods, which are basically the multivariate regression techniques one can find in chemometrics.
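    For the linear mixing model discussed above, the data can be written X = A S with nonnegative spectra and concentrations, which makes nonnegative matrix factorization a natural baseline for spectroscopic mixtures. The sketch below uses the classic Lee-Seung multiplicative updates; the function name and parameters are illustrative and not taken from the paper.

```python
import numpy as np

def nmf_sketch(X, rank, n_iter=300, seed=0):
    """Lee-Seung multiplicative updates for X ~ W @ H, all nonnegative.

    W collects the pure-component spectra in its columns; H holds the
    corresponding concentrations. Nonnegativity matches the physics:
    neither spectra nor concentrations can be negative.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + 0.1
    H = rng.random((rank, n)) + 0.1
    eps = 1e-9  # avoids division by zero in the updates
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

    Multiplicative updates monotonically decrease the Frobenius reconstruction error, which is why they remain a common first try before more specialized chemometric decompositions.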

    Watermarking security: theory and practice

    This article proposes a theory of watermarking security based on a cryptanalysis point of view. The main idea is that information about the secret key leaks from the observations, for instance watermarked pieces of content, available to the opponent. Tools from information theory (Shannon's mutual information and Fisher's information matrix) can measure this leakage of information. The security level is then defined as the number of observations the attacker needs to successfully estimate the secret key. This theory is applied to two common watermarking methods: the substitutive scheme and spread-spectrum-based techniques. Their security levels are calculated against three kinds of attack. The experimental work illustrates how Blind Source Separation (especially Independent Component Analysis) algorithms help the opponent exploit this information leakage to disclose the secret carriers in the spread spectrum case. Simulations assess the security levels derived in the theoretical part of the article.
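    The information leakage described above can be made concrete with a toy spread-spectrum experiment: each watermarked content is host noise plus a scaled secret carrier, so the sample covariance of many observations concentrates energy along the carrier direction. The sketch below uses a plain eigen-decomposition, a second-order stand-in for the ICA attack discussed in the article; all sizes and the embedding strength are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, gamma = 64, 2000, 1.0       # carrier length, observations, strength

w = rng.standard_normal(n)
w /= np.linalg.norm(w)            # secret unit-norm carrier

# Each observation: host content (white noise) plus the modulated watermark.
bits = rng.choice([-1.0, 1.0], size=N)
X = rng.standard_normal((N, n)) + gamma * np.outer(bits, w)

# Attacker side: the sample covariance is close to I + gamma^2 w w^T,
# so its leading eigenvector points along the secret carrier.
cov = X.T @ X / N
vals, vecs = np.linalg.eigh(cov)
w_hat = vecs[:, -1]               # eigenvector of the largest eigenvalue

alignment = abs(w_hat @ w)        # close to 1.0 means the carrier is disclosed
```

    The number of observations N needed before `alignment` approaches 1 is exactly the kind of quantity the article's security level formalizes.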

    Improving Monitoring and Diagnosis for Process Control using Independent Component Analysis

    Statistical Process Control (SPC) is the general field concerned with monitoring the operation and performance of systems. SPC consists of a collection of techniques for characterizing the operation of a system using a probability distribution consistent with the system's inputs and outputs. Classical SPC monitors a single variable to characterize the operation of a single machine tool or process step using tools such as Shewhart charts. The traditional approach works well for simple small- to medium-size processes. For more complex processes, a number of multivariate SPC techniques have been developed in recent decades. These advanced methods suffer from several disadvantages compared to univariate techniques: they tend to be statistically less powerful, and they tend to complicate process diagnosis when a disturbance is detected. This research introduces a general method for simplifying multivariate process monitoring in such a manner as to allow the use of traditional SPC tools while facilitating process diagnosis. Latent variable representations of complex processes are developed which directly relate disturbances with process steps or segments. The method models disturbances in the process rather than the process itself. The basic tool used is Independent Component Analysis (ICA). The methodology is illustrated on the problem of monitoring Electrical Test (E-Test) data from a semiconductor manufacturing process. Development and production data from a working semiconductor plant are used to estimate a factor model that is then used to develop univariate control charts for particular types of process disturbances. Detection and false alarm rates for data with known disturbances are given. The charts correctly detect and classify all the disturbance cases with a very low false alarm rate. A secondary contribution is the introduction of a method for performing an ICA-like analysis using possibilistic data instead of probabilistic data. This technique extends the general ICA framework to apply to a broader range of uncertainty types. Further development of this technique could lead to the capability to use extremely sparse data to estimate ICA process models.
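    Once a disturbance-related latent component has been extracted (by ICA in this work), monitoring reduces to a univariate Shewhart-style individuals chart. The helpers below are a minimal sketch of that last step, assuming the component scores are approximately in control during the reference period; the function names are illustrative.

```python
import numpy as np

def shewhart_limits(in_control):
    """Individuals-chart control limits (mean +/- 3 sigma) from
    a reference window of in-control component scores."""
    mu = in_control.mean()
    sigma = in_control.std(ddof=1)
    return mu - 3.0 * sigma, mu + 3.0 * sigma

def out_of_control(scores, lcl, ucl):
    """Indices of monitored points breaching the control limits."""
    return np.where((scores < lcl) | (scores > ucl))[0]
```

    The point of the dissertation's construction is that each chart tracks one disturbance mode, so a breach immediately suggests which process step to inspect.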

    Robust Wireless Localization in Harsh Mixed Line-of-Sight/Non-Line-of-Sight Environments

    This PhD thesis considers the problem of locating target nodes in different wireless infrastructures, such as wireless cellular radio networks and wireless sensor networks. To be as realistic as possible, a mixed line-of-sight and non-line-of-sight (LOS/NLOS) localization environment is introduced. Both conventional non-cooperative localization and the newly emerging cooperative localization have been studied thoroughly. Owing to the random nature of the measurements, probabilistic methods are more advanced than the old-fashioned geometric methods. The gist of the probabilistic methods is to infer the unknown positions of the target nodes in an estimation process, given a set of noisy position-related measurements, a probabilistic measurement model, and a few known reference positions. In contrast to the majority of existing methods, harsh but practical constraints are taken into account: neither offline calibration nor non-line-of-sight state identification is equipped in the desired localization system. This leads to incomplete knowledge about the measurement error statistics, making the inference task extremely challenging. Two new classes of localization algorithms have been proposed to jointly estimate the positions and the measurement error statistics. All unknown parameters are assumed to be deterministic, and the maximum likelihood estimator is sought throughout this thesis. The first class of algorithms assumes no knowledge about the measurement error distribution and adopts a nonparametric model. The idea is to alternate between a pdf estimation step, which approximates the exact measurement error pdf via adaptive kernel density estimation, and a parameter estimation step, which resolves a position estimate numerically from an approximated log-likelihood function. The computational complexity of this class of algorithms scales quadratically in the number of measurements. Hence, the first class of algorithms is applicable primarily to non-cooperative localization in wireless cellular radio networks. In order to reduce the computational complexity, a second class of algorithms resorts to approximating the measurement error distribution parametrically as a linear combination of Gaussian distributions. Iterative algorithms that alternate between updating the position(s) and the other parameters have been developed with the aid of the expectation-maximization (EM), expectation conditional maximization (ECM) and joint maximum a posteriori-maximum likelihood (JMAP-ML) criteria. As a consequence, the computational complexity turns out to scale linearly in the number of measurements. Hence, the second class of algorithms is also applicable to cooperative localization in wireless sensor networks. Apart from the algorithm design, systematic analyses in terms of the Cramér-Rao lower bound, computational complexity, and communication energy consumption have also been conducted for comprehensive algorithm evaluation. Simulation and experimental results demonstrate that the proposed algorithms all tend to achieve the fundamental limits of localization accuracy for large data records and outperform their competitors by far when model mismatch problems can be ignored.
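    The parametric idea behind the second class of algorithms can be illustrated on the measurement errors alone: model the LOS/NLOS range-error distribution as a two-component Gaussian mixture and fit it by EM, with each E- and M-step costing time linear in the number of measurements. This is a hedged one-dimensional sketch, not the thesis's joint position/parameter estimator; the function name and initialization are illustrative.

```python
import numpy as np

def em_gmm_1d(x, n_iter=100):
    """EM for a two-component 1-D Gaussian mixture (e.g. LOS vs NLOS
    range errors). Each iteration costs O(len(x)): linear in the
    number of measurements."""
    # Crude but serviceable initialization from the data spread.
    mu = np.array([x.min(), x.max()])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component per sample.
        d = x[:, None] - mu[None, :]
        dens = pi * np.exp(-0.5 * d**2 / var) / np.sqrt(2.0 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        d = x[:, None] - mu[None, :]
        var = np.maximum((r * d**2).sum(axis=0) / nk, 1e-6)
    return pi, mu, var
```

    In the NLOS setting the second component typically picks up the positively biased, higher-variance errors, which is exactly the structure the thesis exploits without an explicit LOS/NLOS identifier.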

    Blind Source Separation for the Processing of Contact-Less Biosignals

    (Spatio-temporal) Blind Source Separation (BSS) provides a large potential to process distorted multichannel biosignal measurements in the context of novel contact-less recording techniques, separating distortions from the cardiac signal of interest. This potential can only be practically utilized (1) if a BSS model is applied that matches the complexity of the measurement, i.e. the signal mixture, and (2) if permutation indeterminacy is solved among the BSS output components, i.e. the component of interest can be practically selected. The present work first designs a framework to assess the efficacy of BSS algorithms in the context of the camera-based photoplethysmogram (cbPPG) and characterizes multiple BSS algorithms accordingly. Algorithm selection recommendations for certain mixture characteristics are derived. Second, the present work develops and evaluates concepts to solve permutation indeterminacy for BSS outputs of contact-less electrocardiogram (ECG) recordings. The novel approach based on sparse coding is shown to outperform the existing concepts of higher-order moments and frequency-domain features.
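    The sparse-coding selection idea can be caricatured in the frequency domain: a quasi-periodic cardiac component concentrates its spectrum in a few bins, whereas broadband distortions spread theirs out, so an l1/l2 sparsity score over FFT magnitudes separates the two. This is a simplified stand-in for the dissertation's sparse-coding approach; the function names and the score are illustrative.

```python
import numpy as np

def spectral_sparsity(component):
    """l1/l2 ratio of the magnitude spectrum; lower means sparser
    (energy concentrated in fewer frequency bins)."""
    mag = np.abs(np.fft.rfft(component - component.mean()))
    return mag.sum() / (np.linalg.norm(mag) + 1e-12)

def select_component(components):
    """Pick the BSS output whose spectrum is the sparsest, as a proxy
    for the quasi-periodic cardiac component of interest."""
    scores = [spectral_sparsity(c) for c in components]
    return int(np.argmin(scores))
```

    Scoring every BSS output and taking the minimum resolves the permutation indeterminacy automatically, with no manual channel inspection.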