2,294,785 research outputs found

    Investigating computational models of perceptual attack time

    The perceptual attack time (PAT) is the compensation for the differing attack components of sounds when seeking a perceptually isochronous presentation. It has applications in scheduling and is related to, but not necessarily the same as, the moment of perceptual onset. This paper describes a computational investigation of PAT over a set of 25 synthesised stimuli, and a larger database of 100 sounds divided equally between synthesised and ecological sounds. Ground-truth PATs for modelling were obtained by the alternating presentation paradigm, in which subjects adjusted the relative start time of a reference click and the sound to be judged. Whilst fitting experimental data from the 25-sound set was plausible, difficulties with existing models were found on the larger test set. A pragmatic solution was obtained using a neural net architecture. In general, learnt schemata of sound classification may be implicated in resolving the multiple detection cues evoked by complex sounds.
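A crude signal-level proxy for attack time (not the paper's PAT model, which was fitted to listener adjustments) is the time an amplitude envelope takes to rise between two fractions of its peak. The function name, thresholds, and synthetic ramp below are all illustrative assumptions:

```python
def attack_time(samples, sr, lo=0.1, hi=0.9):
    """Estimate attack time as the interval between the envelope first
    reaching lo*peak and first reaching hi*peak. A crude physical proxy,
    not a perceptual attack-time model."""
    env = [abs(s) for s in samples]
    peak = max(env)
    t_lo = next(i for i, v in enumerate(env) if v >= lo * peak)
    t_hi = next(i for i, v in enumerate(env) if v >= hi * peak)
    return (t_hi - t_lo) / sr

# Synthetic amplitude envelope: 50 ms linear attack ramp, then sustain.
sr = 44100
sig = [min(1.0, i / (0.05 * sr)) for i in range(sr // 4)]
print(round(attack_time(sig, sr), 3))  # prints 0.04 (10%-90% rise of a 50 ms ramp)
```

Perceptual attack time generally differs from such physical measures, which is why the paper resorts to listener-derived ground truth and learned models.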

    Computational-time reduction of Fourier-based analytical models


    Computational Topology Techniques for Characterizing Time-Series Data

    Topological data analysis (TDA), while abstract, allows a characterization of time-series data obtained from nonlinear and complex dynamical systems. Though it is surprising that such an abstract measure of structure - counting pieces and holes - could be useful for real-world data, TDA lets us compare different systems, and even do membership testing or change-point detection. However, TDA is computationally expensive and involves a number of free parameters. This complexity can be obviated by coarse-graining, using a construct called the witness complex. The parametric dependence gives rise to the concept of persistent homology: how shape changes with scale. Its results allow us to distinguish time-series data from different systems - e.g., the same note played on different musical instruments. Comment: 12 pages, 6 figures, 1 table; The Sixteenth International Symposium on Intelligent Data Analysis (IDA 2017).
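The "counting pieces" part of persistent homology (dimension 0) can be sketched with nothing more than union-find over edges sorted by length, a simplified Vietoris-Rips-style filtration; the paper itself uses the witness complex for coarse-graining and also tracks higher-dimensional features. A minimal, self-contained sketch under those simplifying assumptions:

```python
import itertools

def h0_persistence(points):
    """0-dimensional persistence of a point cloud: every point is a
    component born at scale 0; a component dies at the edge length that
    first merges it into another (edges processed shortest-first)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    edges = sorted((dist(points[i], points[j]), i, j)
                   for i, j in itertools.combinations(range(len(points)), 2))
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # one component dies when this edge appears
    return deaths  # n-1 finite deaths; the last component persists forever

# Two well-separated clusters on a line: three short-lived components
# (deaths near 0.1) and one long-lived merge (death 4.8) reveal 2 "pieces".
pts = [(0.0,), (0.1,), (0.2,), (5.0,), (5.1,)]
print(h0_persistence(pts))
```

The large gap between the short death scales and the final one is the persistence signature of two clusters, which is the sense in which "shape changes with scale" separates systems.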

    Computational complexity of the landscape II - Cosmological considerations

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity." Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes. (v2: version submitted for publication: clarified section 5.3; added references) (v3: added discussion of marginally hospitable vacua. Version to appear in Annals of Physics) Comment: 50 pages, 6 figures.