
    Improvement of speech recognition by nonlinear noise reduction

    Full text link
    The success of nonlinear noise reduction applied to a single-channel recording of human voice is measured in terms of the recognition rate of a commercial speech recognition program, in comparison to the optimal linear filter. The overall performance of the nonlinear method is shown to be superior. We hence demonstrate that an algorithm rooted in the theory of nonlinear deterministic dynamics has considerable potential in a realistic application.
    Comment: see urbanowicz.org.p
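
    A minimal sketch of the family of methods the abstract draws on: Schreiber-style "simple" nonlinear noise reduction, where the signal is embedded in delay coordinates and the central coordinate of each delay vector is replaced by its average over embedding-space neighbours. This is not necessarily the authors' exact algorithm, and the parameters m and radius are illustrative.

        import numpy as np

        def nonlinear_noise_reduction(x, m=7, radius=0.1):
            """Denoise a 1-D signal x via delay embedding of dimension m."""
            emb = np.lib.stride_tricks.sliding_window_view(x, m)  # (n, m) delay vectors
            mid = m // 2
            y = x.astype(float).copy()
            for i in range(len(emb)):
                # neighbours: delay vectors within `radius` in the max-norm
                d = np.max(np.abs(emb - emb[i]), axis=1)
                idx = np.where(d < radius)[0]
                # replace the central coordinate by the neighbourhood average
                y[i + mid] = emb[idx, mid].mean()
            return y

    Unlike a linear benchmark (a Wiener or moving-average filter), which smooths in the time domain, this averages over dynamically similar states, which is why it can preserve deterministic structure in the voice signal.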

    Investment strategy due to the minimization of portfolio noise level by observations of coarse-grained entropy

    Full text link
    Using a recently developed method of noise-level estimation that exploits properties of the coarse-grained entropy, we have analyzed the noise level for the Dow Jones index and a few stocks from the New York Stock Exchange. We have found that the noise level ranges from 40 to 80 percent of the signal variance. The condition of a minimal noise level has been applied to construct optimal portfolios from selected shares. We show that implementing a corresponding threshold investment strategy leads to positive returns for historical data.
    Comment: 6 pages, 1 figure, 1 table, Proceedings of the conference APFA4. See http://www.chaosandnoise.or
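
    The selection rule can be illustrated with a deliberately simplified sketch: given per-share noise-level estimates (noise variance as a fraction of signal variance, e.g. the 40-80 percent range reported above), keep only shares below a threshold and weight them inversely to their noise level. The coarse-grained-entropy estimation itself is not reproduced here, and the tickers, numbers, and weighting rule are all hypothetical.

        def threshold_portfolio(noise_levels, threshold=0.6):
            """noise_levels: dict of ticker -> estimated noise fraction in [0, 1]."""
            kept = {t: nl for t, nl in noise_levels.items() if nl < threshold}
            if not kept:
                return {}
            inv = {t: 1.0 / nl for t, nl in kept.items()}
            total = sum(inv.values())
            return {t: w / total for t, w in inv.items()}  # weights sum to 1

        weights = threshold_portfolio({"AAA": 0.45, "BBB": 0.72, "CCC": 0.55})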

    Anti-deterministic behavior of discrete systems that are less predictable than noise

    Full text link
    We present a new type of deterministic dynamical behavior that is less predictable than white noise. We call it anti-deterministic (AD) because time series corresponding to the dynamics of such systems do not generate deterministic lines in recurrence plots for small thresholds. We show that although the dynamics is chaotic, in the sense of exponential divergence of nearby initial conditions, and although some properties of AD data resemble white noise, AD dynamics is in fact less predictable than noise and hence differs from pseudo-random number generators.
    Comment: 6 pages, 5 figures. See http://www.chaosandnoise.or
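
    The recurrence-plot criterion is easy to make concrete. In the sketch below, written for a scalar series (delay embedding omitted for brevity, threshold eps illustrative), a deterministic signal produces long diagonal lines of recurrence points, while AD data, by the authors' definition, does not.

        import numpy as np

        def recurrence_matrix(x, eps=0.1):
            """Binary recurrence matrix: R[i, j] = 1 iff |x[i] - x[j]| < eps."""
            d = np.abs(x[:, None] - x[None, :])
            return (d < eps).astype(int)

        def longest_diagonal(R):
            """Length of the longest diagonal line off the (trivial) main diagonal."""
            best = 0
            for k in range(1, len(R)):            # k-th diagonal above the main one
                run = 0
                for v in np.diagonal(R, k):
                    run = run + 1 if v else 0
                    best = max(best, run)
            return best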

    Noise reduction in chaotic time series by a local projection with nonlinear constraints

    Full text link
    On the basis of a local-projective (LP) approach, we develop a method of noise reduction in time series that makes use of nonlinear constraints arising from the deterministic character of the underlying dynamical system. The Delaunay triangulation approach is used to find the optimal nearest neighboring points in the time series. The efficiency of our method is comparable to that of standard LP methods, but our method is more robust to the choice of input parameters. The approach has been successfully applied to separating signal from noise in the chaotic Hénon and Lorenz models, as well as to noisy experimental data obtained from an electronic Chua circuit. The method works properly for a mixture of additive and dynamical noise and can also be used for noise-level detection.
    Comment: 11 pages, 12 figures. See http://www.chaosandnoise.or
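
    For orientation, here is a plain local-projective step, with k-nearest neighbours standing in for the paper's Delaunay-triangulation neighbour search and without the nonlinear constraints the authors add; q is the assumed local manifold dimension, and all parameter values are illustrative.

        import numpy as np

        def local_projection(x, m=10, k=30, q=2):
            """One LP denoising pass over a 1-D signal x."""
            emb = np.lib.stride_tricks.sliding_window_view(x, m)  # (n, m) delay vectors
            cleaned = emb.astype(float)
            for i in range(len(emb)):
                d = np.linalg.norm(emb - emb[i], axis=1)
                nbrs = emb[np.argsort(d)[:k]]           # k nearest delay vectors
                c = nbrs.mean(axis=0)
                # principal directions of the neighbourhood via SVD
                _, _, vt = np.linalg.svd(nbrs - c, full_matrices=False)
                basis = vt[:q]                          # local manifold directions
                cleaned[i] = c + basis.T @ (basis @ (emb[i] - c))
            # average the overlapping, corrected windows back into a signal
            y = np.zeros(len(x)); cnt = np.zeros(len(x))
            for i in range(len(emb)):
                y[i:i + m] += cleaned[i]; cnt[i:i + m] += 1
            return y / cnt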

    Combustion Process in a Spark Ignition Engine: Dynamics and Noise Level Estimation

    Full text link
    We analyse the experimental time series of internal pressure in a four-cylinder spark ignition engine. In our experiment, performed for different spark advance angles, we observed, apart from the usual cyclic changes of engine pressure, oscillations with longer time scales, ranging from one to several hundred engine cycles depending on engine working conditions. Based on the time dependence of the pressure, we have calculated the heat released per cycle. Using the time series of heat release to calculate the correlation coarse-grained entropy, we estimated the noise level for the internal combustion process. Our results show that for a smaller spark advance angle the system is more deterministic.
    Comment: 15 pages, 7 figures, submitted to CHAO
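
    For readers unfamiliar with the heat-release step, the standard single-zone, first-law form with a constant ratio of specific heats gamma is sketched below; the paper's exact procedure may differ. Here p and V are cylinder pressure and volume sampled over crank angle theta (in radians).

        import numpy as np

        def net_heat_release(theta, p, V, gamma=1.35):
            """Cumulative net heat released over one cycle, in joules."""
            dV = np.gradient(V, theta)   # dV/dtheta
            dp = np.gradient(p, theta)   # dp/dtheta
            dQ = gamma / (gamma - 1.0) * p * dV + 1.0 / (gamma - 1.0) * V * dp
            return np.trapz(dQ, theta)   # integrate dQ/dtheta over the cycle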

    Evaluation of a Tree-based Pipeline Optimization Tool for Automating Data Science

    Full text link
    As the field of data science continues to grow, there will be an ever-increasing demand for tools that make machine learning accessible to non-experts. In this paper, we introduce the concept of tree-based pipeline optimization for automating one of the most tedious parts of machine learning: pipeline design. We implement an open-source Tree-based Pipeline Optimization Tool (TPOT) in Python and demonstrate its effectiveness on a series of simulated and real-world benchmark data sets. In particular, we show that TPOT can design machine learning pipelines that provide a significant improvement over a basic machine learning analysis while requiring little to no input or prior knowledge from the user. We also address the tendency of TPOT to design overly complex pipelines by integrating Pareto optimization, which produces compact pipelines without sacrificing classification accuracy. As such, this work represents an important step toward fully automating machine learning pipeline design.
    Comment: 8 pages, 5 figures, preprint to appear in GECCO 2016, edits not yet made from reviewer comment
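
    TPOT is an open-source Python package, and its basic use matches the paper's description: evolve a scikit-learn pipeline on training data, score it on held-out data, and export the winning pipeline as a standalone script. The data set and parameter values below are illustrative.

        from sklearn.datasets import load_digits
        from sklearn.model_selection import train_test_split
        from tpot import TPOTClassifier

        X, y = load_digits(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

        tpot = TPOTClassifier(generations=5, population_size=20,
                              verbosity=2, random_state=42)
        tpot.fit(X_train, y_train)                 # evolve pipelines via genetic programming
        print(tpot.score(X_test, y_test))          # accuracy of the best pipeline found
        tpot.export('best_pipeline.py')            # write the winning pipeline as Python code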