60 research outputs found

    Hybrid solutions to instantaneous MIMO blind separation and decoding: narrowband, QAM and square cases

    Future wireless communication systems are expected to support high data rates and high-quality transmission for growing multimedia applications. The push to increase channel throughput has led to multiple-input multiple-output (MIMO) and blind equalization techniques in recent years, and blind MIMO equalization has therefore attracted great interest. Both system performance and computational complexity play important roles in real-time communications; reducing the computational load while providing accurate performance is the main challenge in present systems. This thesis first proposes a hybrid method that offers affordable complexity with good performance for blind equalization in large-constellation MIMO systems. Computational cost is saved in both the signal separation and the signal detection stages. First, based on quadrature amplitude modulation (QAM) signal characteristics, an efficient and simple nonlinear function for Independent Component Analysis is introduced. Second, using the idea of sphere decoding, we restrict the soft channel information to a sphere, overcoming the so-called curse of dimensionality of the Expectation Maximization (EM) algorithm while simultaneously enhancing the final results. Mathematically, we demonstrate that in digital communication settings the EM algorithm exhibits Newton-like convergence. Despite the widespread use of forward error correction (FEC), most MIMO blind channel estimation techniques ignore its presence and instead make the simplifying assumption that the transmitted symbols are uncoded. However, FEC induces code structure in the transmitted sequence that can be exploited to improve blind MIMO channel estimates. In the final part of this work, we exploit iterative channel estimation and decoding for blind MIMO equalization. Experiments show the improvements achievable by exploiting the coding structure: the method can approach the performance of a BCJR equalizer with perfect channel information in a reasonable SNR range. All results are confirmed experimentally for the example of blind equalization in block-fading MIMO systems
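As a rough illustration of the separation stage, the sketch below whitens QAM mixtures and runs a one-unit complex FastICA iteration. The kurtosis nonlinearity used here is a standard textbook choice, not the QAM-tailored nonlinearity proposed in the thesis, and the toy QPSK setup is an assumption for demonstration only:

```python
import numpy as np

def whiten(X):
    # Zero-mean and spatially whiten the mixtures (rows = receive antennas).
    X = X - X.mean(axis=1, keepdims=True)
    R = X @ X.conj().T / X.shape[1]
    d, E = np.linalg.eigh(R)
    return (E * d ** -0.5) @ E.conj().T @ X

def one_unit_fastica(Z, iters=200, seed=0):
    # One-unit complex FastICA with the kurtosis nonlinearity:
    # w <- E{z (w^H z)* |w^H z|^2} - 2 E{|w^H z|^2} w, then normalize.
    rng = np.random.default_rng(seed)
    n = Z.shape[0]
    w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(iters):
        y = w.conj() @ Z
        w = (Z * (np.abs(y) ** 2 * y.conj())).mean(axis=1) \
            - 2 * np.mean(np.abs(y) ** 2) * w
        w /= np.linalg.norm(w)
    return w
```

On a noiseless two-antenna QPSK mixture, the extracted component matches one source up to the usual phase and permutation ambiguities.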

    Studies on Kernel Learning and Independent Component Analysis

    A crucial step in kernel-based learning is the selection of a proper kernel function or kernel matrix. Multiple kernel learning (MKL), in which a set of kernels is assessed during the learning time, was recently proposed to solve the kernel selection problem. The goal is to estimate a suitable kernel matrix by adjusting a linear combination of the given kernels so that the empirical risk is minimized. MKL is usually a memory-demanding optimization problem, which becomes a barrier for large samples. This study proposes an efficient method for kernel learning that uses the low-rank property of large kernel matrices, which is often observed in applications. The proposed method involves selecting a few eigenvectors of kernel bases and taking a sparse combination of them by minimizing the empirical risk. Empirical results show that the computational demands decrease significantly without compromising classification accuracy, when compared with previous MKL methods. Computing an upper bound for the complexity of the hypothesis set generated by the learned kernel is challenging. Here, a novel bound is presented which shows that the Gaussian complexity of such a hypothesis set is controlled by the logarithm of the number of involved eigenvectors and their maximum distance, i.e. the geometry of the basis set. This geometric bound sheds more light on the selection of kernel bases, which could not be obtained from previous results. The rest of this study is a step toward utilizing statistical learning theory to analyze independent component analysis estimators such as FastICA. This thesis provides a sample convergence analysis for the FastICA estimator and shows that the estimates converge in distribution as the number of samples increases. Additionally, similar results for the bootstrap FastICA are established. A direct application of these results is the design of hypothesis tests to study the convergence of the estimates
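The eigenvector-selection idea can be sketched as follows: pool the leading eigenvectors of several candidate kernel matrices and fit a sparse combination by minimizing a squared-loss empirical risk. The RBF kernels, the ISTA solver, and the toy labels are illustrative assumptions, not the thesis's exact formulation:

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian RBF kernel matrix on rows of X.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def top_eigvecs(K, r):
    # Leading r eigenvectors of a kernel matrix: a low-rank basis that
    # exploits the rapidly decaying spectrum of typical kernel matrices.
    d, E = np.linalg.eigh(K)
    return E[:, np.argsort(d)[::-1][:r]]

def ista_lasso(Phi, y, lam=0.05, iters=2000):
    # Proximal gradient (ISTA) for 0.5*||Phi b - y||^2 + lam*||b||_1,
    # giving a sparse combination of the pooled eigenvector features.
    L = np.linalg.norm(Phi, 2) ** 2
    b = np.zeros(Phi.shape[1])
    for _ in range(iters):
        z = b - Phi.T @ (Phi @ b - y) / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return b
```

Only the selected eigenvectors need to be stored, which is the source of the memory savings relative to keeping full kernel matrices.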

    Independent component analysis for non-standard data structures

    Independent component analysis is a classical multivariate tool used for estimating independent sources among collections of mixed signals. However, modern forms of data are typically too complex for the basic theory to handle adequately. In this thesis, extensions of independent component analysis to three cases of non-standard data structures are developed: noisy multivariate data, tensor-valued data and multivariate functional data. In each case we define the corresponding independent component model along with the related assumptions and implications. The proposed estimators are mostly based on the use of kurtosis and its analogues for the considered structures, resulting in functionals of rather unified form regardless of the type of the data. We prove the Fisher consistency of the estimators, and particular weight is given to their limiting distributions, which are also used to compare the methods
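Kurtosis-based estimators of the kind described above have a compact prototype in FOBI (fourth-order blind identification) for ordinary multivariate data; a minimal sketch, offered only as an illustration of the kurtosis-functional idea:

```python
import numpy as np

def fobi(X):
    # FOBI: whiten, then diagonalize the kurtosis-type matrix
    # E{||z||^2 z z^T}; sources are identified when their kurtoses differ.
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    W0 = (E * d ** -0.5) @ E.T                          # whitening matrix
    Z = W0 @ X
    K = (Z * (Z ** 2).sum(axis=0)) @ Z.T / Z.shape[1]   # E{||z||^2 z z^T}
    _, V = np.linalg.eigh(K)
    return V.T @ W0                                     # unmixing estimate
```

Applied to a mixture of a uniform and a centered exponential source (distinct kurtoses), the estimated components recover the sources up to sign and permutation.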

    Blind Source Separation for the Processing of Contact-Less Biosignals

    (Spatio-temporal) Blind Source Separation (BSS) provides great potential for processing distorted multichannel biosignal measurements in the context of novel contact-less recording techniques, separating distortions from the cardiac signal of interest. This potential can only be practically utilized (1) if a BSS model is applied that matches the complexity of the measurement, i.e. the signal mixture, and (2) if permutation indeterminacy is solved among the BSS output components, i.e. the component of interest can be practically selected. The present work first designs a framework to assess the efficacy of BSS algorithms in the context of the camera-based photoplethysmogram (cbPPG) and characterizes multiple BSS algorithms accordingly. Algorithm selection recommendations for certain mixture characteristics are derived. Second, the present work develops and evaluates concepts to solve permutation indeterminacy for BSS outputs of contact-less electrocardiogram (ECG) recordings. The novel approach based on sparse coding is shown to outperform the existing concepts based on higher-order moments and frequency-domain features
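The component-selection problem can be illustrated with a very simple sparsity proxy: spiky, ECG-like components have far higher excess kurtosis than sinusoidal or Gaussian distortions. This is only an illustration; the sparse-coding selector developed in the thesis learns a dictionary rather than using a moment score:

```python
import numpy as np

def select_cardiac_component(components):
    # Rank BSS outputs by excess kurtosis, a crude sparsity score, and
    # return the index of the most ECG-like (spikiest) component.
    scores = []
    for y in components:
        z = (y - y.mean()) / y.std()
        scores.append(float(np.mean(z ** 4) - 3.0))
    return int(np.argmax(scores)), scores
```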

    Fast Kernel Smoothing in R with Applications to Projection Pursuit

    This paper introduces the R package FKSUM, which offers fast and exact evaluation of univariate kernel smoothers. The main kernel computations are implemented in C++ and wrapped in simple, intuitive and versatile R functions. The fast kernel computations are based on recursive expressions involving the order statistics, which allow for exact evaluation of kernel smoothers at all sample points in log-linear time. In addition to general-purpose kernel smoothing functions, the package offers purpose-built and ready-to-use implementations of popular kernel-type estimators. Beyond these basic smoothing problems, the paper focuses on projection pursuit problems in which the projection index is based on kernel-type estimators of functionals of the projected density
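The recursion behind such exact log-linear kernel sums can be shown for the Laplace kernel: after one sort, a forward and a backward sweep accumulate the exponentially weighted sums. This Python sketch mirrors the idea, not the package's actual C++ implementation (FKSUM also supports polynomial-times-exponential kernels):

```python
import numpy as np

def laplace_kernel_sums(x, w, h):
    # Exact S_i = sum_j w_j * exp(-|x_i - x_j| / h) at all sample points.
    # Sorting costs O(n log n); the two recursive sweeps are O(n).
    order = np.argsort(x)
    xs, ws = x[order], w[order]
    n = len(xs)
    fwd = np.empty(n)              # contributions from points at or below x_i
    bwd = np.empty(n)              # contributions from points above x_i
    fwd[0] = ws[0]
    for i in range(1, n):
        fwd[i] = ws[i] + np.exp(-(xs[i] - xs[i - 1]) / h) * fwd[i - 1]
    bwd[-1] = 0.0
    for i in range(n - 2, -1, -1):
        bwd[i] = np.exp(-(xs[i + 1] - xs[i]) / h) * (ws[i + 1] + bwd[i + 1])
    out = np.empty(n)
    out[order] = fwd + bwd
    return out
```

The result agrees with the brute-force O(n^2) evaluation to machine precision, which is what makes the recursion "exact" rather than an approximation.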

    Adaptive methods for score function modeling in blind source separation

    In signal processing and related fields, multichannel measurements are often encountered. Depending on the application, for instance, multiple antennas, multiple microphones or multiple biomedical sensors are used for the data acquisition. Such systems can be described using Multiple-Input Multiple-Output (MIMO) system models. In many cases, several source signals are present at the same time and there is only limited knowledge of their properties and how they contribute to each sensor output. If the source signals and the physical system are unknown and only the sensor outputs are observed, the processing methods developed for recovering the original signals are called blind. In Blind Source Separation (BSS) the goal is to recover the source signals from the observed mixed signals (mixtures). Blindness means that neither the sources nor the mixing system is known. Separation can be based on the theoretically limiting but practically feasible assumption that the sources are statistically independent. This assumption connects BSS and Independent Component Analysis (ICA). The usage of mutual information as a measure of independence leads to iterative estimation of the score functions of the mixtures. The purpose of this thesis is to develop BSS methods that can adapt to different source distributions. Adaptation makes it possible to separate sources without knowing the source distributions or even the characteristics of source distributions. Special attention is paid to methods that allow also asymmetric source distributions. Asymmetric distributions occur in important applications such as communications and biomedical signal processing. Adaptive techniques are proposed for the modeling of score functions or estimating functions. Three approaches based on the Pearson system, the Extended Generalized Lambda Distribution (EGLD) and adaptively combined fixed estimating functions are proposed. 
    The Pearson system and the EGLD are parametric families of distributions used to model the distributions of the mixtures. The strength of these parametric families is that they contain a wide class of distributions, including asymmetric distributions with positive and negative kurtosis, while the estimation of the parameters remains a relatively simple procedure. The methods may be implemented using existing ICA algorithms. The reliable performance of the proposed methods is demonstrated in extensive simulations. In addition to symmetric source distributions, asymmetric distributions such as the Rayleigh and lognormal distributions are used in the simulations. The score-adaptive methods outperform commonly used methods due to their ability to adapt to asymmetric distributions
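The Pearson-system approach to score modeling can be sketched with the classical method-of-moments fit: the first four central moments determine the coefficients of the score function psi = -f'/f. The sketch below uses the textbook Pearson-system formulas (for a Gaussian sample it reduces to psi(x) ≈ x); the thesis's estimators may differ in detail:

```python
import numpy as np

def pearson_score(x):
    # Method-of-moments Pearson-system score psi(x) = -f'(x)/f(x)
    #   = (x + a) / (b0 + a*x + b2*x^2),
    # with coefficients from the central moments mu2, mu3, mu4.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    mu2, mu3, mu4 = (np.mean(x ** k) for k in (2, 3, 4))
    C = 10 * mu4 * mu2 - 12 * mu3 ** 2 - 18 * mu2 ** 3
    a = mu3 * (mu4 + 3 * mu2 ** 2) / C
    b0 = mu2 * (4 * mu2 * mu4 - 3 * mu3 ** 2) / C
    b2 = (2 * mu2 * mu4 - 3 * mu3 ** 2 - 6 * mu2 ** 3) / C
    return (x + a) / (b0 + a * x + b2 * x ** 2)
```

Because the coefficients depend on the third moment, the fitted score adapts to skewed sources such as the Rayleigh and lognormal cases mentioned above, which fixed symmetric nonlinearities cannot model.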

    Independent EEG Sources Are Dipolar

    Independent component analysis (ICA) and blind source separation (BSS) methods are increasingly used to separate individual brain and non-brain source signals mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings. We compared results of decomposing thirteen 71-channel human scalp EEG datasets by 22 ICA and BSS algorithms, assessing the pairwise mutual information (PMI) in scalp channel pairs, the remaining PMI in component pairs, the overall mutual information reduction (MIR) effected by each decomposition, and decomposition ‘dipolarity’ defined as the number of component scalp maps matching the projection of a single equivalent dipole with less than a given residual variance. The least well-performing algorithm was principal component analysis (PCA); best performing were AMICA and other likelihood/mutual information based ICA methods. Though these and other commonly-used decomposition methods returned many similar components, across 18 ICA/BSS algorithms mean dipolarity varied linearly with both MIR and with PMI remaining between the resulting component time courses, a result compatible with an interpretation of many maximally independent EEG components as being volume-conducted projections of partially-synchronous local cortical field activity within single compact cortical domains. To encourage further method comparisons, the data and software used to prepare the results have been made available (http://sccn.ucsd.edu/wiki/BSSComparison)
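Pairwise mutual information between channel or component pairs, as used in the comparison above, can be estimated with a simple histogram plug-in; the paper's exact estimator may differ, so this is only a minimal sketch:

```python
import numpy as np

def pairwise_mi(x, y, bins=32):
    # Histogram estimate of the mutual information I(X;Y) in bits.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of X
    py = pxy.sum(axis=0, keepdims=True)     # marginal of Y
    nz = pxy > 0                            # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

Summing the reduction of such pairwise dependence from scalp channels to components gives a scalar of the same flavor as the mutual information reduction (MIR) used to rank the decompositions.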

    Inferential Modeling and Independent Component Analysis for Redundant Sensor Validation

    The calibration of redundant safety-critical sensors in nuclear power plants is a manual task that consumes valuable time and resources. Automated, data-driven techniques to monitor the calibration of redundant sensors have been developed over the last two decades, but have not been fully implemented. Parity space methods, such as the Instrumentation and Calibration Monitoring Program (ICMP) method developed by the Electric Power Research Institute, and other empirical inferential modeling techniques have been developed but have not become viable options. Existing solutions to the redundant sensor validation problem have several major flaws that restrict their application. Parity space methods, such as ICMP, are not robust under low-redundancy conditions, and their operation becomes invalid when there are only two redundant sensors. Empirical inferential modeling is only valid when the intrinsic correlations between predictor and response variables remain static during the model training and testing phases; it also commonly produces high-variance results and is not the optimal solution to the problem. This dissertation develops and implements independent component analysis (ICA) for redundant sensor validation. The ICA algorithm produces parameter estimates with sufficiently low residual variance when compared to simple averaging, ICMP, and principal component regression (PCR) techniques. For stationary signals, it can detect and isolate sensor drifts for as few as two redundant sensors. It is fast and can be embedded into a real-time system, as demonstrated on a water level control system. Additionally, ICA has been merged with inferential modeling techniques such as PCR to reduce the prediction error and spillover effects from data anomalies. ICA is easy to use, with only the window size needing specification. The effectiveness and robustness of the ICA technique are shown through the use of actual nuclear power plant data. A bootstrap technique is used to estimate the prediction uncertainties and validate its usefulness. Bootstrap uncertainty estimates incorporate uncertainties from both the data and the model; thus the uncertainty estimation is robust and varies from data set to data set. The ICA-based system is proven to be accurate and robust; however, classical ICA algorithms commonly fail when distributions are multi-modal, which most likely occurs during highly non-stationary transients. This research also developed a unity check technique that indicates such failures and applies other, more robust techniques during transients. For linearly trending signals, a rotation transform is found useful, while standard averaging techniques are used during general transients
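The drift-isolation idea with only two redundant sensors can be sketched as follows: both sensors see the same process signal, one additionally carries a slow drift, and ICA separates the shared signal from the drift component. The symmetric tanh FastICA below is a standard textbook variant and the simulated sensors are assumptions, not the dissertation's actual plant data or algorithm:

```python
import numpy as np

def fastica_sym(X, iters=200, seed=0):
    # Symmetric real FastICA with the tanh nonlinearity; returns the
    # estimated independent components (rows).
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    W0 = (E * d ** -0.5) @ E.T
    Z = W0 @ X                                   # whitened sensor signals
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((Z.shape[0], Z.shape[0]))
    for _ in range(iters):
        Y = W @ Z
        G = np.tanh(Y)
        W = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                               # symmetric decorrelation
    return W @ Z
```

One of the two recovered components then tracks the drift term, flagging the drifting sensor even though neither the process signal nor the drift was known in advance.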