1,343 research outputs found

    Data mining: a tool for detecting cyclical disturbances in supply networks.

    Get PDF
    Disturbances in supply chains may be either exogenous or endogenous. The ability to automatically detect, diagnose, and distinguish between the causes of disturbances is of prime importance to decision makers seeking to reduce uncertainty. The spectral principal component analysis (SPCA) technique has previously been used to distinguish between real and rogue disturbances in a steel supply network. The data set was collected from four different business units in the network and consists of 43 variables, each described by 72 data points. The present paper uses the same data set to test an alternative to SPCA for detecting the disturbances. The new approach applies statistical data pre-processing, clustering, and classification learning techniques to the supply network data. In particular, the incremental k-means clustering and RULES-6 classification rule-learning algorithms, developed by the present authors’ team, are applied to identify important patterns in the data set. Results show that the proposed approach can automatically detect and characterize network-wide cyclical disturbances and generate hypotheses about their root causes.
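    A rough sketch of this kind of pipeline, with generic stand-ins: standard k-means and a decision tree take the place of the authors' incremental k-means and RULES-6 algorithms, and random numbers replace the 72-point, 43-variable supply-network data.

```python
# Illustrative sketch only: standard k-means and a decision tree stand in for the
# authors' incremental k-means and RULES-6 rule-learning algorithms, and random
# numbers stand in for the 72 x 43 supply-network data set.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(72, 43))          # placeholder for the 72 observations of 43 variables

# 1. Statistical pre-processing: standardise each variable.
Xs = StandardScaler().fit_transform(X)

# 2. Clustering: group time points with similar disturbance signatures.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xs)

# 3. Rule learning: describe each cluster with interpretable rules
#    (a decision tree used here in place of RULES-6).
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Xs, labels)
print(export_text(tree))
```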

    Extracting quantum dynamics from genetic learning algorithms through principal control analysis

    Full text link
    Genetic learning algorithms are widely used to control ultrafast optical pulse shapes for photo-induced quantum control of atoms and molecules. An unresolved issue is how to use the solutions found by these algorithms to learn about the system's quantum dynamics. We propose a simple method based on covariance analysis of the control space, which can reveal the degrees of freedom in the effective control Hamiltonian. We have applied this technique to stimulated Raman scattering in liquid methanol. A simple model of two-mode stimulated Raman scattering is consistent with the results. Comment: 4 pages, 5 figures. Presented at coherent control Ringberg conference 200
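    A minimal sketch of the covariance-analysis idea, assuming the genetic algorithm's high-fitness solutions are available as rows of an array; the `solutions` array below is a hypothetical placeholder for real pulse-shaper parameters.

```python
# Minimal sketch of covariance analysis over a set of control solutions found by a
# genetic algorithm. `solutions` is a hypothetical (n_solutions x n_parameters) array
# of pulse-shaper parameters; real data would come from the learning loop.
import numpy as np

rng = np.random.default_rng(1)
solutions = rng.normal(size=(200, 64))     # placeholder population of good solutions

C = np.cov(solutions, rowvar=False)        # covariance of the control space
eigvals, eigvecs = np.linalg.eigh(C)       # principal control directions
order = np.argsort(eigvals)[::-1]

# The few directions carrying most of the variance suggest the effective
# degrees of freedom in the control Hamiltonian.
print("leading variances:", eigvals[order][:5])
leading_modes = eigvecs[:, order[:5]]
```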

    Space-Time Clustering and Correlations of Major Earthquakes

    Get PDF
    Earthquake occurrence in nature is thought to result from correlated elastic stresses, leading to clustering in space and time. We show that occurrence of major earthquakes in California correlates with time intervals when fluctuations in small earthquakes are suppressed relative to the long term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Comment: 5 pages, 3 figures. Submitted to PR
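    As a rough illustration only, not the authors' procedure: a generic Monte Carlo shuffle test of how often such a coincidence would arise if major-event times were random.

```python
# Generic Monte Carlo shuffle test (an illustration, not the authors' procedure):
# estimate how often the observed number of major events would fall inside
# "low-fluctuation" intervals if the major-event times were random.
import numpy as np

rng = np.random.default_rng(2)
n_days = 10_000
low_fluct = rng.random(n_days) < 0.2                     # placeholder mask of low-fluctuation days
major_days = rng.choice(n_days, size=8, replace=False)   # placeholder major-earthquake days

observed = low_fluct[major_days].sum()

n_trials = 10_000
hits = 0
for _ in range(n_trials):
    shuffled = rng.choice(n_days, size=major_days.size, replace=False)
    if low_fluct[shuffled].sum() >= observed:
        hits += 1
print("chance probability ~", hits / n_trials)
```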

    Ultrafast ring opening in CHD investigated by simplex-based spectral unmixing

    Full text link
    We use spectral unmixing to determine the number of transient photoproducts and to track their evolution following the photoexcitation of 1,3-cyclohexadiene (CHD) to form 1,3,5-hexatriene (HT) in the gas phase. The ring opening is initiated with a 266 nm ultraviolet laser pulse and probed via fragmentation with a delayed intense infrared 800 nm laser pulse. The ion time-of-flight (TOF) spectra are analyzed with a simplex-based spectral unmixing technique. We find that at least three independent spectra are needed to model the transient TOF spectra. Guided by mathematical and physical constraints, we decompose the transient TOF spectra into three spectra associated with the presence of CHD, CHD+, and HT, and show how these three products appear at different times during the ring opening.
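    An illustrative stand-in for the unmixing step: non-negative matrix factorisation, rather than the paper's simplex-based algorithm, is used below to split synthetic delay-dependent TOF spectra into a few component spectra with time-dependent weights.

```python
# Illustrative stand-in for the unmixing step: non-negative matrix factorisation
# (not the paper's simplex algorithm) splits delay-dependent TOF spectra into a small
# number of component spectra with non-negative, time-dependent weights.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
n_delays, n_tof_bins = 40, 300
spectra = rng.random((n_delays, n_tof_bins))      # placeholder pump-probe TOF spectra

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
weights = model.fit_transform(spectra)            # (n_delays x 3) time-dependent populations
components = model.components_                    # (3 x n_tof_bins) component spectra

# The number of components can be chosen by checking how the reconstruction
# error drops as components are added.
print("reconstruction error:", model.reconstruction_err_)
```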

    Data-adaptive harmonic spectra and multilayer Stuart-Landau models

    Full text link
    We consider harmonic decompositions of multivariate time series, adopting an integral operator approach with periodic semigroup kernels. Spectral decomposition theorems are derived that cover the important cases of two-time statistics drawn from a mixing invariant measure. The corresponding eigenvalues can be grouped per Fourier frequency and are given, at each frequency, as the singular values of a data-dependent cross-spectral matrix. These eigenvalues furthermore obey a variational principle that allows us to define a multidimensional power spectrum in a natural way. The eigenmodes, for their part, exhibit a data-adaptive character manifested in their phase, which in turn allows us to define a multidimensional phase spectrum. The resulting data-adaptive harmonic (DAH) modes reduce the data-driven modeling effort to elemental models stacked per frequency, coupled across frequencies only by the same noise realization. In particular, the DAH decomposition extracts time-dependent coefficients stacked by Fourier frequency which, provided the decay of temporal correlations is sufficiently well resolved, can be efficiently modeled within a class of multilayer stochastic models (MSMs) tailored here to stochastic Stuart-Landau oscillators. Applications to the Lorenz 96 model and to a stochastic heat equation driven by space-time white noise are considered. In both cases, the DAH decomposition extracts spatio-temporal modes revealing key features of the dynamics in the embedded phase space. The multilayer Stuart-Landau models (MSLMs) are shown to successfully model the typical patterns of the corresponding time-evolving fields, as well as their statistics of occurrence. Comment: 26 pages, double columns; 15 figures
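    A rough sketch of the central step as described above, under the simplifying assumption that a Welch-type cross-spectral estimate (scipy's csd) can stand in for the paper's integral-operator construction: a cross-spectral matrix is estimated at each Fourier frequency and its singular values give the frequency-grouped eigenvalues.

```python
# Rough sketch: estimate a cross-spectral matrix at each Fourier frequency with a
# Welch-type estimator (scipy.signal.csd), then take its singular values per frequency.
# This is only a stand-in for the paper's integral-operator construction.
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(4)
n_channels, n_samples, fs = 4, 4096, 1.0
x = rng.normal(size=(n_channels, n_samples))      # placeholder multivariate time series

nperseg = 256
freqs, _ = csd(x[0], x[0], fs=fs, nperseg=nperseg)
S = np.zeros((len(freqs), n_channels, n_channels), dtype=complex)
for i in range(n_channels):
    for j in range(n_channels):
        _, S[:, i, j] = csd(x[i], x[j], fs=fs, nperseg=nperseg)

# Singular values of the cross-spectral matrix, grouped per Fourier frequency.
sv_per_freq = np.linalg.svd(S, compute_uv=False)
print(sv_per_freq.shape)   # (n_freqs, n_channels)
```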

    Timing and Dose of Upper Limb Motor Intervention After Stroke: A Systematic Review

    Get PDF
    This systematic review aimed to investigate timing, dose, and efficacy of upper limb intervention during the first 6 months poststroke. Three online databases were searched up to July 2020. Titles/abstracts/full-text were reviewed independently by 2 authors. Randomized and nonrandomized studies that enrolled people within the first 6 months poststroke, aimed to improve upper limb recovery, and completed preintervention and postintervention assessments were included. Risk of bias was assessed using Cochrane reporting tools. Studies were examined by timing (recovery epoch), dose, and intervention type. Two hundred and sixty-one studies were included, representing 228 (n=9704 participants) unique data sets. The number of studies completed increased from one (n=37 participants) between 1980 and 1984 to 91 (n=4417 participants) between 2015 and 2019. Timing of intervention start has not changed (median 38 days, interquartile range [IQR], 22–66) and study sample size remains small (median n=30, IQR 20–48). Most studies were rated high risk of bias (62%). Study participants were enrolled at different recovery epochs: 1 hyperacute (<24 hours), 13 acute (1–7 days), 176 early subacute (8–90 days), 34 late subacute (91–180 days), and 4 were unable to be classified to an epoch. For both the intervention and control groups, the median dose was 45 (IQR, 600–1430) min/session, 1 (IQR, 1–1) session/d, 5 (IQR, 5–5) d/wk for 4 (IQR, 3–5) weeks. The most common interventions tested were electromechanical (n=55 studies), electrical stimulation (n=38 studies), and constraint-induced movement (n=28 studies) therapies. Despite a large and growing body of research, intervention dose and sample size of included studies were often too small to detect clinically important effects. Furthermore, interventions remain focused on subacute stroke recovery with little change in recent decades. A united research agenda that establishes a clear biological understanding of timing, dose, and intervention type is needed to progress stroke recovery research. Prospective Register of Systematic Reviews ID: CRD42018019367/CRD42018111629

    Nonlinear Mode Decomposition: a new noise-robust, adaptive decomposition method

    Get PDF
    We introduce a new adaptive decomposition tool, which we refer to as Nonlinear Mode Decomposition (NMD). It decomposes a given signal into a set of physically meaningful oscillations for any waveform, simultaneously removing the noise. NMD is based on the powerful combination of time-frequency analysis techniques - which together with the adaptive choice of their parameters make it extremely noise-robust - and surrogate data tests, used to identify interdependent oscillations and to distinguish deterministic from random activity. We illustrate the application of NMD to both simulated and real signals, and demonstrate its qualitative and quantitative superiority over the other existing approaches, such as (ensemble) empirical mode decomposition, Karhunen-Loeve expansion, and independent component analysis. We point out that NMD is likely to be applicable and useful in many different areas of research, such as geophysics, finance, and the life sciences. The necessary MATLAB codes for running NMD are freely available at http://www.physics.lancs.ac.uk/research/nbmphysics/diats/nmd/. Comment: 38 pages, 13 figures
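    A cartoon of one ingredient mentioned above, the surrogate test for interdependent oscillations, using assumed synthetic signals (a frequency-modulated fundamental and its second harmonic); the full NMD procedure is the authors' MATLAB toolbox linked in the abstract.

```python
# Cartoon of a surrogate test for interdependence between two extracted oscillations
# (a frequency-modulated fundamental and a candidate second harmonic): their 1:2 phase
# coherence is compared against time-shifted surrogates. Not the full NMD procedure.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(5)
t = np.arange(0, 200, 0.01)
phase = 2 * np.pi * (1.0 * t + 0.3 * np.sin(2 * np.pi * 0.02 * t))   # slowly modulated phase
fundamental = np.sin(phase) + 0.2 * rng.normal(size=t.size)
harmonic = 0.5 * np.sin(2 * phase + 0.4) + 0.2 * rng.normal(size=t.size)

phi1 = np.angle(hilbert(fundamental))
phi2 = np.angle(hilbert(harmonic))

def coherence(p1, p2):
    """1:2 phase-coherence index between a fundamental and a candidate harmonic."""
    return np.abs(np.mean(np.exp(1j * (2 * p1 - p2))))

observed = coherence(phi1, phi2)

# Time-shifting one phase destroys any genuine relation (thanks to the frequency
# modulation) while preserving each signal's own structure.
shifts = rng.integers(1000, t.size - 1000, size=200)
surrogate_vals = [coherence(np.roll(phi1, s), phi2) for s in shifts]
p = np.mean([v >= observed for v in surrogate_vals])
print(f"coherence = {observed:.2f}, surrogate p-value ~ {p:.2f}")
```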

    Representing complex data using localized principal components with application to astronomical data

    Full text link
    The relation between the variables constituting a multivariate data space can often be characterized by one or more of the terms "nonlinear", "branched", "disconnected", "bended", "curved", "heterogeneous", or, more generally, "complex". In these cases, simple principal component analysis (PCA) as a tool for dimension reduction can fail badly. Of the many alternative approaches proposed so far, local approximations of PCA are among the most promising. This paper gives a short review of localized versions of PCA, focusing on local principal curves and local partitioning algorithms. Furthermore, we discuss projections other than the local principal components. When performing local dimension reduction for regression or classification problems, it is important to focus not only on the manifold structure of the covariates but also on the response variable(s). Local principal components achieve only the former, whereas localized regression approaches concentrate on the latter. Local projection directions derived from the partial least squares (PLS) algorithm offer an interesting trade-off between these two objectives. We apply these methods to several real data sets. In particular, we consider simulated astrophysical data from the future Galactic survey mission Gaia. Comment: 25 pages. In "Principal Manifolds for Data Visualization and Dimension Reduction", A. Gorban, B. Kegl, D. Wunsch, and A. Zinovyev (eds), Lecture Notes in Computational Science and Engineering, Springer, 2007, pp. 180–204, http://www.springer.com/dal/home/generic/search/results?SGWID=1-40109-22-173750210-
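    A minimal sketch of the simplest local-partitioning variant discussed above (local principal curves and the PLS-based projections are not shown): partition the data with k-means, then fit a separate PCA inside each partition.

```python
# Simplest local-partitioning variant of the idea: partition the data with k-means,
# then fit a separate PCA inside each partition. The "curved" data here are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# Placeholder curved data: a noisy arc that a single global PCA would summarise badly.
theta = rng.uniform(0, np.pi, 1000)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(1000, 2))

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

local_pcas = {}
for k in np.unique(labels):
    pca = PCA(n_components=1).fit(X[labels == k])
    local_pcas[k] = pca
    print(f"partition {k}: local direction {pca.components_[0].round(2)}, "
          f"explained variance {pca.explained_variance_ratio_[0]:.2f}")
```

    Each partition's leading component follows the local tangent of the arc, whereas a single global principal component would cut across it; this is the sense in which localized PCA handles "complex" structure better.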