
    Decorrelation of Neutral Vector Variables: Theory and Applications

    In this paper, we propose novel strategies for neutral vector variable decorrelation. Two fundamental invertible transformations, namely the serial nonlinear transformation and the parallel nonlinear transformation, are proposed to carry out the decorrelation. For a neutral vector variable, which is not multivariate Gaussian distributed, conventional principal component analysis (PCA) cannot yield mutually independent scalar variables. With the two proposed transformations, a highly negatively correlated neutral vector can be transformed into a set of mutually independent scalar variables with the same degrees of freedom. We also evaluate the decorrelation performance for vectors generated from a single Dirichlet distribution and from a mixture of Dirichlet distributions. Mutual independence is verified with the distance correlation measure. The advantages of the proposed decorrelation strategies are studied extensively and demonstrated on synthesized data and in practical application evaluations.
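    The serial transformation can be made concrete for the Dirichlet case, where neutrality gives an explicit construction: the stacking u_k = x_k / (1 − Σ_{j&lt;k} x_j) maps a Dirichlet vector to independent Beta variables. The following sketch is ours (a known property of the Dirichlet distribution, not code from the paper) and simply contrasts the correlation of raw versus transformed components:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 3.0, 4.0, 5.0])
x = rng.dirichlet(alpha, size=100_000)    # each row sums to 1

# Serial nonlinear transformation: u_k = x_k / (1 - sum_{j<k} x_j).
# For a Dirichlet vector this yields independent Beta-distributed scalars.
K = x.shape[1] - 1                        # last component is redundant
u = np.empty((x.shape[0], K))
remaining = np.ones(x.shape[0])
for k in range(K):
    u[:, k] = x[:, k] / remaining
    remaining -= x[:, k]

# Raw Dirichlet components are negatively correlated ...
raw_corr = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
# ... while the transformed scalars are (near) uncorrelated.
new_corr = np.corrcoef(u[:, 0], u[:, 1])[0, 1]
print(raw_corr, new_corr)
```

    Pearson correlation only checks linear dependence; the paper uses distance correlation precisely because it also detects nonlinear dependence.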

    Separation of multiple signals in hearing aids by output decorrelation and time-delay estimation


    Understanding Slow Feature Analysis: A Mathematical Framework

    Slow feature analysis is an algorithm for unsupervised learning of invariant representations from data with temporal correlations. Here, we present a mathematical analysis of slow feature analysis for the case where the input-output functions are not restricted in complexity. We show that the optimal functions obey a partial differential eigenvalue problem of a type that is common in theoretical physics. This analogy allows the transfer of mathematical techniques and intuitions from physics to concrete applications of slow feature analysis, thereby providing the means for analytical predictions and a better understanding of simulation results. We put particular emphasis on the situation where the input data are generated from a set of statistically independent sources. The dependence of the optimal functions on the sources is calculated analytically for the cases where the sources have Gaussian or uniform distributions.
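    The paper analyses the unrestricted-complexity case; the classical linear special case is enough to show the mechanics. A minimal sketch (the mixture and sources are invented for illustration): whiten the inputs, then take the direction whose temporal derivative has the smallest variance.

```python
import numpy as np

t = np.linspace(0, 2 * np.pi, 4000)
slow = np.sin(t)            # slowly varying source
fast = np.sin(30 * t)       # fast source
# Observed signals: an invertible linear mixture of the two sources.
X = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])

# 1. Center and whiten the inputs (decorrelated, unit variance).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / len(Xc)
evals, evecs = np.linalg.eigh(cov)
W = evecs / np.sqrt(evals)        # whitening matrix (column-scaled)
Z = Xc @ W

# 2. Minimise the variance of the temporal derivative: the slowest
#    feature is the eigenvector of cov(dZ) with the smallest eigenvalue.
dZ = np.diff(Z, axis=0)
dcov = dZ.T @ dZ / len(dZ)
devals, devecs = np.linalg.eigh(dcov)
y = Z @ devecs[:, 0]              # slowest extracted feature

# The extracted feature should match the slow source up to sign/scale.
corr = abs(np.corrcoef(y, slow)[0, 1])
print(corr)
```

    In the unrestricted setting the two-step eigenproblem above becomes the partial differential eigenvalue problem the abstract refers to.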

    On the rate-distortion performance and computational efficiency of the Karhunen-Loeve transform for lossy data compression

    We examine the rate-distortion performance and computational complexity of linear transforms for lossy data compression. The goal is to better understand the performance/complexity tradeoffs associated with using the Karhunen-Loeve transform (KLT) and its fast approximations. Since the optimal transform for transform coding is unknown in general, we investigate the performance penalties associated with using the KLT by examining cases where the KLT fails, developing a new transform that corrects the KLT's failures in those examples, and then empirically testing the performance difference between this new transform and the KLT. Experiments demonstrate that while the worst KLT can yield transform coding performance at least 3 dB worse than that of alternative block transforms, the performance penalty associated with using the KLT on real data sets seems to be significantly smaller, giving at most a 0.5 dB difference in our experiments. The KLT and its fast variations studied here range in complexity requirements from O(n^2) to O(n log n) in coding vectors of dimension n. We empirically investigate the rate-distortion performance tradeoffs associated with traversing this range of options. For example, an algorithm with complexity O(n^3/2) and memory O(n) gives a 0.4 dB performance loss relative to the full KLT in our image compression experiment.
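    The KLT itself is just the eigenbasis of the source covariance. A sketch on a Gauss-Markov (AR(1)) test source, a standard model for transform-coding experiments; the source and its parameters are our choice, not necessarily the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim = 50_000, 8
# Correlated AR(1) (Gauss-Markov) source: cov[i, j] = rho^|i-j|.
rho = 0.95
cov = rho ** np.abs(np.subtract.outer(np.arange(dim), np.arange(dim)))
x = rng.multivariate_normal(np.zeros(dim), cov, size=n)

# The KLT is the eigenbasis of the (sample) covariance; applying it
# costs O(n^2) per length-n vector once the basis is computed.
sample_cov = x.T @ x / n
evals, evecs = np.linalg.eigh(sample_cov)
y = x @ evecs                     # KLT coefficients

# Coefficients are decorrelated, and energy is compacted into a few
# of them -- the property that drives rate-distortion performance.
ycov = y.T @ y / n
off_diag = ycov - np.diag(np.diag(ycov))
energy = np.sort(np.diag(ycov))[::-1]
print(np.max(np.abs(off_diag)), energy[:2].sum() / energy.sum())
```

    The fast O(n log n) approximations traded against the KLT in the paper replace this exact eigenbasis with structured transforms that approximate it.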

    Experimental study of Taylor's hypothesis in a turbulent soap film

    An experimental study of Taylor's hypothesis in a quasi-two-dimensional turbulent soap film is presented. A two-probe laser Doppler velocimeter enables non-intrusive, simultaneous measurement of the velocity at spatially separated points. The breakdown of Taylor's hypothesis is quantified using the cross-correlation between two points displaced in both space and time; the correlation is better than 90% for scales smaller than the integral scale. A quantitative study of the decorrelation beyond the integral scale is presented, including an analysis of the failure of Taylor's hypothesis using techniques from predictability studies of turbulent flows. Our results are compared with similar studies of 3D turbulence.
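    The two-probe cross-correlation idea can be illustrated with a toy signal under a perfectly frozen (Taylor) flow, where the downstream probe sees an exact time-shifted copy; the speed, spacing, and signal here are invented for the sketch, and real turbulence decorrelates rather than shifting exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
# Frozen-turbulence toy: a random signal advected past two probes at
# mean speed U, so probe 2 sees probe 1's signal delayed by dx / U.
fs = 1000.0                      # sampling rate (Hz)
U, dx = 2.0, 0.1                 # advection speed (m/s), probe spacing (m)
delay = int(round(dx / U * fs))  # expected lag in samples

noise = rng.standard_normal(10_000)
kernel = np.ones(25) / 25        # smooth to mimic a finite eddy size
u1 = np.convolve(noise, kernel, mode="same")
u2 = np.roll(u1, delay)          # perfectly frozen: a pure time shift

# Cross-correlate and locate the peak; under Taylor's hypothesis the
# peak sits at lag dx / U with correlation near 1.
lags = np.arange(-200, 201)
xcorr = [np.corrcoef(u1, np.roll(u2, -l))[0, 1] for l in lags]
best = lags[int(np.argmax(xcorr))]
print(best, delay)
```

    The paper's measurement is the departure from this idealisation: how far the peak correlation drops below 1 as the spatial separation grows beyond the integral scale.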

    On the Comparisons of Decorrelation Approaches for Non-Gaussian Neutral Vector Variables


    Running bumps from stealth bosons

    For "stealth bosons" S, light boosted particles with a decay S → AA → qq̄qq̄ into four quarks, reconstructed as a single fat jet, the groomed jet mass is strongly correlated with groomed jet substructure variables. Consequently, the jet mass distribution is strongly affected by jet substructure selection cuts when they are applied to the groomed jet. We illustrate this fact by recasting a CMS search for low-mass dijet resonances and showing a few representative examples. The mass distributions exhibit narrow and wide bumps at several locations in the 100-300 GeV range, between the masses of the daughter particle A and the parent particle S, depending on the jet substructure selection. This striking observation introduces several caveats for interpreting and comparing experimental results in the case of non-standard signatures. The possibility that a single boosted particle decaying hadronically produces multiple bumps, at quite different jet masses and depending on the event selection, takes the game of anomaly chasing to the next level.