
    Convolutional Dictionary Learning through Tensor Factorization

    Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher-order input moments. However, in many domains, additional invariances such as shift invariance exist, enforced via models such as convolutional dictionary learning. In this paper, we develop novel tensor decomposition algorithms for parameter estimation of convolutional models. Our algorithm is based on the popular alternating least squares method, but with efficient projections onto the space of stacked circulant matrices. Our method is embarrassingly parallel and consists of simple operations such as fast Fourier transforms and matrix multiplications. Our algorithm converges to the dictionary much faster and more accurately than alternating minimization over filters and activation maps.
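
    The core ingredient named in this abstract is the projection onto circulant matrices inside the ALS iterations. As a rough illustration only, the sketch below shows the Frobenius-norm projection of a single square matrix onto the set of circulant matrices, once by averaging circulant diagonals and once through the DFT basis (the route that makes FFT-based implementations possible). The function names are mine, and the paper's stacked, parallel variant is not reproduced here.

```python
import numpy as np

def project_to_circulant(A):
    """Frobenius-norm projection of a square matrix onto the circulant
    matrices: average the entries on each circulant diagonal, i.e. over
    all (i, j) with the same value of (i - j) mod n."""
    n = A.shape[0]
    idx = (np.arange(n)[:, None] - np.arange(n)[None, :]) % n
    first_col = np.array([A[idx == k].mean() for k in range(n)])
    return first_col[idx]          # C[i, j] = first_col[(i - j) mod n]

def project_to_circulant_fft(A):
    """Same projection expressed in the Fourier basis: circulant matrices
    are exactly those diagonalized by the DFT, so keep only the diagonal
    of the matrix after a unitary DFT change of basis."""
    n = A.shape[0]
    U = np.fft.fft(np.eye(n)) / np.sqrt(n)     # unitary DFT matrix
    d = np.diag(U @ A @ U.conj().T)            # diagonal part in that basis
    C = U.conj().T @ (d[:, None] * U)          # back to the original basis
    return C.real if np.isrealobj(A) else C
```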

    Frontmatter

    Independent component analysis (ICA) is a well-known technique for blind source separation (BSS) based on higher-order statistics. Assuming the sources are independent, the fact that they can be uniquely recovered is a consequence of the uniqueness of the canonical polyadic decomposition (CPD), a decomposition of a tensor in rank-one terms. Contrary to matrix decompositions, tensor decompositions can be unique without imposing additional constraints such as orthogonality or nonnegativity. Block component analysis (BCA) is an entirely new technique for BSS and factor analysis which also exploits the uniqueness of tensor decompositions, but does not assume that the sources are statistically independent. In this poster, we explain how BCA works and how to compute the associated block term decompositions (BTD). The latter is a generalization of the CPD and the multilinear SVD (MLSVD), which are in turn the two most prominent generalizations of the matrix SVD to tensors. Block term decompositions can be formulated as nonlinear optimization problems in which the objective function is not analytic in its complex variables. We generalized several well-known unconstrained optimization methods, such as (L-)BFGS and NCG, and nonlinear least squares methods, such as Levenberg-Marquardt, to this class of problems and applied them to compute a certain type of BTD. In the nonlinear least squares methods, we exploited the problem's structure, which resulted in some of the most efficient algorithms for BTDs to date.
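
    The BTD discussed above generalizes the CPD. As background only, here is a minimal NumPy sketch of the plain rank-R CPD computed by alternating least squares; it is not the authors' BTD algorithm (no block terms, no generalized L-BFGS/NCG/Levenberg-Marquardt machinery), and all names and shapes are assumptions for illustration.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R) -> (J*K x R)."""
    J, R = B.shape
    K = C.shape[0]
    return (B[:, None, :] * C[None, :, :]).reshape(J * K, R)

def cpd_als(T, R, n_iter=200, seed=0):
    """Rank-R canonical polyadic decomposition of a 3rd-order tensor by ALS.
    Returns factor matrices A, B, C with T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A, B, C = (rng.standard_normal((d, R)) for d in (I, J, K))
    # mode-n unfoldings, ordered to match the Khatri-Rao products below
    T1 = T.reshape(I, J * K)                        # mode-1
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)     # mode-2
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)     # mode-3
    for _ in range(n_iter):
        A = T1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```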

    Process monitoring and fault detection on a hot-melt extrusion process using in-line Raman spectroscopy and a hybrid soft sensor

    We propose a real-time process monitoring and fault detection scheme for a pharmaceutical hot-melt extrusion process producing Paracetamol-Affinisol extrudate. The scheme involves prediction of Paracetamol concentration from two independent sources: a hybrid soft sensor and a Raman-based Partial Least Squares (PLS) calibration model. Both predictions are used by the developed PCA (Principal Component Analysis) and SPC (Statistical Process Control) monitors to detect process faults and raise alarms. Through real-time extrusion results, it is shown that this two-sensor approach enables the detection of various common process faults which would otherwise remain undetected with a single-sensor monitoring scheme.
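
    The PCA/SPC monitoring idea mentioned above can be illustrated generically: fit PCA on fault-free training data, then score each new sample with Hotelling's T² and the squared prediction error (SPE/Q), raising an alarm when a control limit is exceeded. The sketch below is a textbook-style illustration, not the authors' monitor; the choice of variables, number of components, and control limits (e.g. an F-distribution-based limit for T²) are assumptions left to the reader.

```python
import numpy as np

def fit_pca_monitor(X_nominal, n_components=2):
    """Fit a PCA-based SPC monitor on nominal (fault-free) training data
    X_nominal of shape (samples, variables)."""
    mu = X_nominal.mean(axis=0)
    sigma = X_nominal.std(axis=0, ddof=1)
    Xc = (X_nominal - mu) / sigma
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                              # loadings (variables x components)
    lam = (s[:n_components] ** 2) / (Xc.shape[0] - 1)    # score variances
    return {"mu": mu, "sigma": sigma, "P": P, "lam": lam}

def monitor_statistics(model, x_new):
    """Hotelling's T^2 and squared prediction error (SPE/Q) for one new
    sample; an alarm is raised when either exceeds its control limit."""
    z = (x_new - model["mu"]) / model["sigma"]
    t = model["P"].T @ z                      # scores in the PCA subspace
    t2 = np.sum(t ** 2 / model["lam"])        # Hotelling's T^2
    residual = z - model["P"] @ t
    spe = residual @ residual                 # SPE / Q statistic
    return t2, spe
```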