
    Sequential Logistic Principal Component Analysis (SLPCA): Dimensional Reduction in Streaming Multivariate Binary-State System

    Sequential or online dimensional reduction is of interest due to the explosion of streaming-data-based applications and the requirement of adaptive statistical modeling in many emerging fields, such as the modeling of energy end-use profiles. Principal Component Analysis (PCA) is the classical way of performing dimensional reduction. However, traditional Singular Value Decomposition (SVD) based PCA fails to model data that deviates substantially from the Gaussian distribution. The Bregman divergence was recently introduced to achieve a generalized PCA framework. If the random variable under dimensional reduction follows a Bernoulli distribution, which occurs in many emerging fields, the generalized PCA is called Logistic PCA (LPCA). In this paper, we extend batch LPCA to a sequential version (SLPCA) based on sequential convex optimization theory. The convergence property of this algorithm is discussed in comparison with the batch version (BLPCA), as is its performance in reducing the dimension of multivariate binary-state systems. Its application to building energy end-use profile modeling is also investigated. Comment: 6 pages, 4 figures, conference submission
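
    As a minimal sketch of how a sequential logistic-PCA update could look (a stochastic-gradient formulation, not necessarily the authors' exact SLPCA updates; the rank, learning rates and toy data stream below are illustrative assumptions): each incoming binary vector is scored against the current loading matrix by a few gradient steps on the Bernoulli log-likelihood, and the loadings then take one stochastic gradient step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sequential_lpca_step(x, W, lr_score=0.1, lr_load=0.01, n_inner=25):
    """Process one binary observation x (shape d,) against loadings W (d x k):
    fit the latent scores by gradient ascent on the Bernoulli log-likelihood
    with W fixed, then take one stochastic gradient step on W."""
    d, k = W.shape
    s = np.zeros(k)
    for _ in range(n_inner):
        theta = W @ s                                 # natural parameter (logits) for this sample
        s += lr_score * (W.T @ (x - sigmoid(theta)))  # score gradient step
    theta = W @ s
    W = W + lr_load * np.outer(x - sigmoid(theta), s) # loading gradient step
    return W, s

# toy stream: 10-dimensional binary observations from a rank-2 logit model
rng = np.random.default_rng(0)
true_W = rng.standard_normal((10, 2))
W = 0.01 * rng.standard_normal((10, 2))
for _ in range(2000):
    x = (rng.random(10) < sigmoid(true_W @ rng.standard_normal(2))).astype(float)
    W, s = sequential_lpca_step(x, W)
```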

    Sparse logistic principal components analysis for binary data

    We develop a new principal components analysis (PCA) type dimension reduction method for binary data. Unlike standard PCA, which is defined on the observed data, the proposed PCA is defined on the logit transform of the success probabilities of the binary observations. Sparsity is introduced to the principal component (PC) loading vectors for enhanced interpretability and more stable extraction of the principal components. Our sparse PCA is formulated as an optimization problem whose criterion function is motivated by a penalized Bernoulli likelihood. A Majorization-Minimization algorithm is developed to solve the optimization problem efficiently. The effectiveness of the proposed sparse logistic PCA method is illustrated by application to a single nucleotide polymorphism data set and by a simulation study. Comment: Published at http://dx.doi.org/10.1214/10-AOAS327 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
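
    A rough sketch of the majorization idea, under simplifying assumptions rather than the paper's exact algorithm: the Bernoulli deviance is majorized by a quadratic (curvature bound 1/4), giving a working data matrix; the scores are then updated by an orthonormalised least-squares step and the loadings by soft-thresholding, which produces the sparsity. The rank, penalty level and toy data are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_threshold(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_logistic_pca(X, k=2, lam=0.1, n_iter=200, seed=0):
    """MM-style sketch: build a working matrix Z from the quadratic
    majorization of the Bernoulli deviance, then alternate an orthonormal
    update of the scores A with a soft-thresholded update of the loadings B."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, k)) * 0.01   # PC scores
    B = rng.standard_normal((d, k)) * 0.01   # sparse PC loadings
    for _ in range(n_iter):
        Theta = A @ B.T                          # logits of the success probabilities
        Z = Theta + 4.0 * (X - sigmoid(Theta))   # working data from the majorization
        A, _ = np.linalg.qr(Z @ B)               # orthonormal score update
        B = soft_threshold(Z.T @ A, lam)         # soft-thresholded (lasso-style) loading update
    return A, B

# toy binary data with a sparse rank-2 structure
rng = np.random.default_rng(1)
B_true = np.zeros((30, 2)); B_true[:5, 0] = 3.0; B_true[10:15, 1] = -3.0
X = (rng.random((100, 30)) < sigmoid(rng.standard_normal((100, 2)) @ B_true.T)).astype(float)
A_hat, B_hat = sparse_logistic_pca(X)
```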

    Dynamic gesture recognition using PCA with multi-scale theory and HMM

    In this paper, a dynamic gesture recognition system is presented which requires no special hardware other than a webcam. The system is based on a novel method combining Principal Component Analysis (PCA) with hierarchical multi-scale theory and Discrete Hidden Markov Models (DHMM). We use a hierarchical decision tree based on multi-scale theory. Firstly, we convolve all members of the training data with a Gaussian kernel, which blurs differences between images and reduces their separation in feature space. This reduces the number of eigenvectors needed to describe the data. A principal component space is computed from the convolved data. We divide the data in this space into two clusters using the k-means algorithm. Then the level of blurring is reduced and PCA is applied to each of the clusters separately. A new principal component space is formed from each cluster. Each of these spaces is then divided into two and the process is repeated. We thus produce a binary tree of principal component spaces where each level of the tree represents a different degree of blurring. The search time is then proportional to the depth of the tree, which makes it possible to search hundreds of gestures in real time. The output of the decision tree is then input into the DHMM to recognize temporal information.
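
    A compact sketch of how such a blur-then-split tree could be built, assuming Gaussian blurring from SciPy and PCA/k-means from scikit-learn; the sigma schedule, component count and leaf size are illustrative, and this is not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def build_pca_tree(images, sigmas, n_components=10, min_leaf=8):
    """Build a binary tree of principal component spaces: blur with the
    current Gaussian sigma, fit PCA, split into two clusters with k-means,
    then recurse on each cluster with the next (smaller) sigma."""
    sigma = sigmas[0]
    blurred = np.stack([gaussian_filter(im, sigma) for im in images])
    flat = blurred.reshape(len(images), -1)
    pca = PCA(n_components=min(n_components, len(images) - 1)).fit(flat)
    node = {"sigma": sigma, "pca": pca, "children": None}
    if len(sigmas) == 1:
        return node                                   # sharpest level reached: leaf
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pca.transform(flat))
    if min((labels == 0).sum(), (labels == 1).sum()) < min_leaf:
        return node                                   # split too unbalanced: stop here
    node["children"] = [
        build_pca_tree(images[labels == c], sigmas[1:], n_components, min_leaf)
        for c in (0, 1)]
    return node

# usage with an (n, h, w) array of training frames and an illustrative sigma schedule:
# tree = build_pca_tree(train_frames, sigmas=[8.0, 4.0, 2.0, 1.0])
```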

    Unsupervised spectral decomposition of X-ray binaries with application to GX 339-4

    In this paper we explore unsupervised spectral decomposition methods for distinguishing the effects of different spectral components in a set of consecutive spectra from an X-ray binary. We use well-established linear methods for the decomposition, namely principal component analysis, independent component analysis and non-negative matrix factorisation (NMF). Applying these methods to a simulated dataset consisting of a variable multicolour disc black body and a cutoff power law, we find that NMF outperforms the other two methods in distinguishing the spectral components. In addition, due to the non-negative nature of NMF, the resulting components may be fitted separately, revealing the evolution of individual parameters. To test the NMF method on a real source, we analyse data from the low-mass X-ray binary GX 339-4 and find the results to match those of previous studies. In addition, we find the inner radius of the accretion disc to be located at the innermost stable circular orbit in the intermediate state right after the outburst peak. This study shows that unsupervised spectral decomposition methods can detect the separate component fluxes down to low flux levels. These methods also provide an alternative way of detecting the spectral components without performing actual spectral fitting, which may prove practical when dealing with large datasets. Comment: 12 pages, 13 figures
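
    A brief sketch of the NMF step on a stand-in dataset (the disc-like and power-law-like shapes and all sizes below are invented for illustration; this is not the paper's data or fitting pipeline): the factor W tracks how strongly each spectral shape contributes to each consecutive spectrum, while H holds the non-negative shapes themselves.

```python
import numpy as np
from sklearn.decomposition import NMF

# stand-in for an (n_spectra x n_channels) matrix of non-negative count spectra
rng = np.random.default_rng(1)
channels = np.linspace(1.0, 10.0, 64)
disc = np.exp(-channels)                       # soft, disc-black-body-like shape
powerlaw = channels ** -1.5                    # hard, cutoff-power-law-like shape
weights = rng.random((200, 2)) * [50.0, 20.0]  # time-varying normalisations
X = weights @ np.vstack([disc, powerlaw]) + 0.01 * rng.random((200, 64))

nmf = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
W = nmf.fit_transform(X)   # per-spectrum component normalisations ("component fluxes")
H = nmf.components_        # recovered non-negative spectral shapes
```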

    Matching pursuit-based compressive sensing in a wearable biomedical accelerometer fall diagnosis device

    There is a significant high-fall-risk population whose members are susceptible to frequent falls and serious injury, and for whom quick medical response and fall information are critical to providing efficient aid. This article presents an evaluation of compressive sensing techniques in an accelerometer-based intelligent fall detection system modelled on a wearable Shimmer biomedical embedded computing device with Matlab. The presented fall detection system utilises a database of fall and activities-of-daily-living signals, evaluated with discrete wavelet transforms and principal component analysis, to obtain binary tree classifiers for fall evaluation. Fourteen test subjects undertook various fall and activities-of-daily-living experiments with a Shimmer device to generate data for the principal component analysis-based fall classifiers and to evaluate the proposed fall analysis system. The presented system obtains highly accurate fall detection results, demonstrating significant advantages over the thresholding method presented, and additionally offers advantageous fall diagnostic information. Furthermore, data transmission accounts for over 80% of the battery current usage of the Shimmer device, so it is critical that the acceleration data be reduced to increase transmission efficiency and, in turn, improve battery performance. Various matching pursuit-based compressive sensing techniques have therefore been utilised to significantly reduce the acceleration information required for transmission.
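
    For illustration only (the window length, random sensing matrix, DCT dictionary and sparsity level are assumptions, not the device's actual pipeline), the sketch below compresses one acceleration window with random projections and reconstructs it with orthogonal matching pursuit.

```python
import numpy as np
from scipy.fft import idct
from sklearn.linear_model import OrthogonalMatchingPursuit

n, m, k = 256, 64, 12                 # window length, measurements kept, assumed sparsity
rng = np.random.default_rng(2)

# stand-in acceleration window: smooth oscillations plus sensor noise
t = np.arange(n)
x = np.sin(2 * np.pi * t / 64) + 0.5 * np.sin(2 * np.pi * t / 16) + 0.05 * rng.standard_normal(n)

Psi = idct(np.eye(n), axis=0, norm="ortho")      # DCT synthesis dictionary: x = Psi @ s
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix applied before transmission
y = Phi @ x                                      # m compressed measurements to transmit

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi @ Psi, y)                            # recover the sparse DCT coefficients
x_hat = Psi @ omp.coef_                          # reconstructed acceleration window
```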