
    Brussels-Austin Nonequilibrium Statistical Mechanics in the Early Years: Similarity Transformations between Deterministic and Probabilistic Descriptions

    The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., the decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon and to develop a mathematical formalism that describes this direction as arising from the dynamics of physical systems. In part I of this essay, I review and assess an approach that received much of the group's attention from the early 1970s to the mid 1980s. In part II, I will discuss their more recent approach using rigged Hilbert spaces. Comment: 22 pages, Part I of two parts; updated institutional affiliation

    DROP: Dimensionality Reduction Optimization for Time Series

    Dimensionality reduction is a critical step in scaling machine learning pipelines. Principal component analysis (PCA) is a standard tool for dimensionality reduction, but performing PCA over a full dataset can be prohibitively expensive. As a result, theoretical work has studied the effectiveness of iterative, stochastic PCA methods that operate over data samples. However, termination conditions for stochastic PCA either run for a predetermined number of iterations or run until the solution converges, frequently sampling too many or too few datapoints to yield end-to-end runtime improvements. We show how accounting for downstream analytics operations during DR via PCA allows stochastic methods to terminate efficiently after operating over small (e.g., 1%) subsamples of input data, reducing whole-workload runtime. Leveraging this, we propose DROP, a DR optimizer that enables speedups of up to 5x over Singular-Value-Decomposition-based PCA techniques and exceeds conventional approaches like FFT and PAA by up to 16x in end-to-end workloads.
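    The core idea the abstract describes — running stochastic PCA over small subsamples and stopping early once the estimate stabilises, rather than decomposing the full matrix — can be illustrated with a minimal sketch. This is not DROP's actual implementation; the function name, batch size, and stopping tolerance below are illustrative assumptions, and convergence here is measured by subspace movement rather than a downstream analytics objective.

    ```python
    import numpy as np

    def stochastic_pca_subsample(X, k, batch=64, tol=1e-3, max_frac=0.05, seed=0):
        """Estimate the top-k principal subspace of X from mini-batch
        subsamples, terminating when the subspace stops moving or when
        a small fraction (max_frac) of the data has been seen."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # random orthonormal starting basis for the k-dimensional subspace
        V = np.linalg.qr(rng.standard_normal((d, k)))[0]
        prev = V.copy()
        seen = 0
        while seen < max_frac * n:
            idx = rng.integers(0, n, size=batch)      # draw a small subsample
            B = X[idx] - X[idx].mean(axis=0)          # center the mini-batch
            V, _ = np.linalg.qr(B.T @ (B @ V))        # one power-iteration step
            seen += batch
            # stop early once the estimated subspace has stabilised
            if np.linalg.norm(V @ (V.T @ prev) - prev) < tol:
                break
            prev = V.copy()
        return V, seen / n  # basis and fraction of data actually touched
    ```

    On data with a strong low-rank structure, this kind of sampled iteration typically touches only a few percent of the rows before the subspace estimate stabilises, which is the runtime effect the abstract exploits.
    
    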