    Depinning and dynamics of AC driven vortex lattices in random media

    We study the different dynamical regimes of a vortex lattice driven by AC forces in the presence of random pinning via numerical simulations. The behaviour of the different observables is characterized as a function of the applied force amplitude for different frequencies. We discuss the drawbacks of using the mean velocity to identify the depinning transition and show that, instead, the mean quadratic displacement of the lattice is the relevant quantity for characterizing the different AC regimes. We discuss how the results depend on the initial configuration and identify new hysteretic effects which are absent in DC driven systems. Comment: 6 pages, 4 figures
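
    As a rough illustration of why the mean velocity is a poor indicator here (a toy sketch, not the authors' vortex-lattice code): for a 1D overdamped particle model with random Gaussian pinning wells and a sinusoidal drive, the cycle-averaged velocity stays near zero at every amplitude, while the mean quadratic displacement clearly separates pinned from depinned AC regimes. Particle and pin counts, pinning strength and frequency below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy model: overdamped particles in 1D with random Gaussian pinning wells
        # and an AC drive F(t) = F0 * sin(omega * t).  All parameters are illustrative.
        n_part, n_pins, L = 64, 200, 100.0
        pin_x = rng.uniform(0, L, n_pins)
        pin_strength, pin_width = 0.5, 0.4

        def pin_force(x):
            # Sum of attractive Gaussian wells centred on the pinning sites
            # (minimum-image convention for the periodic box of size L).
            dx = (x[:, None] - pin_x[None, :] + L / 2) % L - L / 2
            return -(pin_strength * dx / pin_width**2 *
                     np.exp(-dx**2 / (2 * pin_width**2))).sum(axis=1)

        def run(F0, omega=0.1, dt=0.05, n_steps=20000):
            x = rng.uniform(0, L, n_part)
            x0 = x.copy()
            v_sum = 0.0
            for i in range(n_steps):
                v = pin_force(x) + F0 * np.sin(omega * i * dt)
                x += v * dt
                v_sum += v.mean()
            mean_velocity = v_sum / n_steps         # averages to ~0 under AC drive
            mean_sq_disp = np.mean((x - x0) ** 2)   # distinguishes the AC regimes
            return mean_velocity, mean_sq_disp

        for F0 in (0.1, 0.5, 1.0, 2.0):
            v, w = run(F0)
            print(f"F0={F0:.1f}  <v>={v:+.3e}  <(x-x0)^2>={w:.3f}")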

    Temporal Correlations and Persistence in the Kinetic Ising Model: the Role of Temperature

    We study the statistical properties of the sum $S_t=\int_{0}^{t}dt'\,\sigma_{t'}$, that is, the difference between the time spent positive and negative by the spin $\sigma_{t}$ located at a given site of a $D$-dimensional Ising model evolving under Glauber dynamics from a random initial configuration. We investigate the distribution of $S_{t}$ and the first-passage statistics (persistence) of this quantity. We discuss successively the three regimes of high temperature ($T>T_{c}$), criticality ($T=T_{c}$), and low temperature ($T<T_{c}$). We discuss in particular the question of the temperature dependence of the persistence exponent $\theta$, as well as that of the spectrum of exponents $\theta(x)$, in the low-temperature phase. The probability that the temporal mean $S_t/t$ was always larger than the equilibrium magnetization is found to decay as $t^{-\theta-\frac{1}{2}}$. This yields a numerical determination of the persistence exponent $\theta$ in the whole low-temperature phase in two dimensions and, above the roughening transition, in the low-temperature phase of the three-dimensional Ising model. Comment: 21 pages, 11 PostScript figures included (1 color figure)
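
    The quantity $S_t$ and its persistence can be estimated directly in a small simulation; the sketch below (an illustration under assumed parameters, not the paper's computation) runs heat-bath Glauber dynamics on a 2D lattice from a random initial configuration, accumulates $S_t$ at every site, and tracks the fraction of sites whose temporal mean $S_t/t$ has never dropped below a threshold M (taken as 0 here; lattice size, temperature and observation times are illustrative).

        import numpy as np

        rng = np.random.default_rng(1)

        # Heat-bath (Glauber) dynamics on a 2D Ising lattice from a random start,
        # using a checkerboard scheme so each half-sweep can be vectorised.
        L, T, M = 64, 3.0, 0.0        # lattice size, temperature (here T > T_c), threshold
        n_sweeps = 2000
        beta = 1.0 / T

        sigma = rng.choice([-1.0, 1.0], size=(L, L))
        S = np.zeros((L, L))                          # running sum, discrete proxy for S_t
        alive = np.ones((L, L), dtype=bool)           # sites where S_t/t never fell below M
        ii, jj = np.indices((L, L))
        masks = [(ii + jj) % 2 == p for p in (0, 1)]  # the two checkerboard sublattices

        for t in range(1, n_sweeps + 1):
            for mask in masks:
                h = (np.roll(sigma, 1, 0) + np.roll(sigma, -1, 0) +
                     np.roll(sigma, 1, 1) + np.roll(sigma, -1, 1))
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
                new_spin = np.where(rng.random((L, L)) < p_up, 1.0, -1.0)
                sigma[mask] = new_spin[mask]
            S += sigma
            alive &= (S / t > M)
            if t in (10, 100, 500, 1000, 2000):
                print(f"t={t:5d}  survival fraction = {alive.mean():.4f}")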

    A method for approximating optimal statistical significances with machine-learned likelihoods

    Machine-learning techniques have become fundamental in high-energy physics and, for new physics searches, it is crucial to know their performance in terms of experimental sensitivity, understood as the statistical significance of the signal-plus-background hypothesis over the background-only one. We present here a simple method that combines the power of current machine-learning techniques to handle high-dimensional data with the likelihood-based inference tests used in traditional analyses, which allows us to estimate the sensitivity for both discovery and exclusion limits through a single parameter of interest, the signal strength. Based on supervised learning techniques, it also performs well with high-dimensional data, where traditional techniques cannot. We first apply the method to a toy model to explore its potential, and then to an LHC study of new physics particles in dijet final states. Taking as the optimal statistical significance the one we would obtain if the true generative functions were known, we show that our method provides a better approximation than the usual naive counting approach. Comment: 24 pages, 8 figures; matches version published in Eur. Phys. J.
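
    The likelihood-based figure of merit the abstract refers to can be illustrated with a short sketch: given histograms of a classifier's output for signal and background, the binned Asimov discovery significance is computed with the signal strength as the single parameter of interest and compared with the single-bin counting case. The classifier scores, binning and event yields below are invented placeholders, not the paper's setup.

        import numpy as np

        rng = np.random.default_rng(2)

        # Stand-in for a trained classifier: signal-like events score high,
        # background-like events score low (in a real analysis these would be the
        # ML model's outputs on signal and background samples).
        sig_scores = rng.beta(5, 2, size=100_000)
        bkg_scores = rng.beta(2, 5, size=1_000_000)

        # Expected yields after selection (illustrative numbers) spread over bins
        # of the classifier output.
        S_tot, B_tot = 50.0, 10_000.0
        bins = np.linspace(0.0, 1.0, 21)
        s = np.histogram(sig_scores, bins=bins)[0] / len(sig_scores) * S_tot
        b = np.histogram(bkg_scores, bins=bins)[0] / len(bkg_scores) * B_tot

        def asimov_significance(mu, s, b, eps=1e-9):
            """Binned Poisson Asimov discovery significance for signal strength mu."""
            si, bi = mu * s, np.maximum(b, eps)
            z2 = 2.0 * ((si + bi) * np.log(1.0 + si / bi) - si)
            return np.sqrt(z2.sum())

        print("naive counting (1 bin):", asimov_significance(1.0, np.array([S_tot]), np.array([B_tot])))
        print("binned in classifier output:", asimov_significance(1.0, s, b))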

    Bluetooth Mesh Analysis, Issues, and Challenges

    BLE is a widely used short-range technology which has gained a relevant position within the Internet-of-Things (IoT) paradigm thanks to its simplicity, low power consumption, low cost and robustness. Recent enhancements to BLE have focused on supporting a mesh network topology. Compared to other mesh networks, BLE mesh has only considered a managed-flooding protocol in its first version. Managed flooding may generally seem inefficient in many contexts, but it is a highly desirable option when data transmission is urgent, the network is small or its configuration changes very dynamically. Given its interest for many application contexts, this paper analyses the impact of tuning several features on the reliability and efficiency of the mesh network. These features are configured and controlled at different layers: message repetition schemes, transmission randomization, the choice between acknowledged and unacknowledged transmission, etc. In order to estimate the real performance of a mesh network deployment, this paper evaluates the effects of the interaction of the chosen parameters, their appropriate adjustment in relation to the characteristics of real implementations, and the true overhead of the whole protocol stack. The paper identifies configuration challenges, proposes network tuning criteria and outlines possible standard improvements. For this purpose, a detailed assessment of the implementation and execution on real devices has been performed, taking into account their chipset limitations.
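
    The reliability/efficiency trade-off of managed flooding can be illustrated with a toy model (an assumption-laden sketch, not the paper's testbed): nodes on a line relay each new message up to a TTL, every transmission is lost with a fixed probability, and the number of per-relay repetitions is varied. Topology, loss rate and TTL are invented; transmission randomisation and acknowledged mode are left out for brevity.

        import random

        random.seed(3)

        # Toy managed-flooding model: nodes on a line graph relay every newly
        # received message up to TTL hops; each (re)transmission is lost
        # independently with probability P_LOSS.  `repeats` is the number of
        # times a node re-advertises a message it relays.
        N_NODES, TTL, P_LOSS = 10, 8, 0.3
        NEIGHBOURS = {i: [j for j in (i - 1, i + 1) if 0 <= j < N_NODES]
                      for i in range(N_NODES)}

        def flood_once(repeats):
            """Return (delivered to the last node?, total transmissions) for one message."""
            seen = {0}                      # node 0 is the source
            frontier = [(0, TTL)]
            transmissions = 0
            while frontier:
                node, ttl = frontier.pop(0)
                if ttl == 0:
                    continue
                for _ in range(repeats):    # network-transmit repetitions
                    transmissions += 1
                    for nb in NEIGHBOURS[node]:
                        if random.random() > P_LOSS and nb not in seen:
                            seen.add(nb)
                            frontier.append((nb, ttl - 1))
            return (N_NODES - 1) in seen, transmissions

        for repeats in (1, 2, 3):
            trials = 2000
            results = [flood_once(repeats) for _ in range(trials)]
            pdr = sum(ok for ok, _ in results) / trials
            load = sum(tx for _, tx in results) / trials
            print(f"repeats={repeats}  delivery ratio={pdr:.3f}  avg transmissions={load:.1f}")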

    Training machine learning models with synthetic data improves the prediction of ventricular origin in outflow tract ventricular arrhythmias

    In order to determine the site of origin (SOO) in outflow tract ventricular arrhythmias (OTVAs) before an ablation procedure, several algorithms based on manual identification of electrocardiogram (ECG) features have been developed. However, the reported accuracy decreases when they are tested on different datasets. Machine learning algorithms can automate the process and improve generalization, but their performance is hampered by the lack of large enough OTVA databases. We propose the use of detailed electrophysiological simulations of OTVAs to train a machine learning classification model to predict the ventricular origin of the SOO of ectopic beats. We generated a synthetic database of 12-lead ECGs (2,496 signals) by running multiple simulations from the most typical OTVA SOO in 16 patient-specific geometries. Two types of input data were considered in the classification: raw and feature ECG signals. From the simulated raw 12-lead ECGs, we analyzed the contribution of each lead to the predictions, keeping the best ones for the training process. For the feature-based analysis, we used entropy-based methods to rank the obtained features. A cross-validation process was included to evaluate the machine learning model. Subsequently, two clinical OTVA databases from different hospitals, including ECGs from 365 patients, were used as test sets to assess the generalization of the proposed approach. The results show that V2 was the best lead for classification. Prediction of the SOO in OTVA, using either raw signals or features for classification, presented high accuracy values (>0.96). Generalization of the network trained on simulated data was good for both patient datasets (accuracy of 0.86 and 0.84, respectively) and better than using exclusively real ECGs for classification (accuracy of 0.84 and 0.76 for each dataset). The use of simulated ECG data for training machine learning-based classification algorithms is critical to obtaining good SOO predictions in OTVA compared to real data alone. The fast implementation and generalization of the proposed methodology may contribute towards its application in clinical routine. Copyright © 2022 Doste, Lozano, Jimenez-Perez, Mont, Berruezo, Penela, Camara and Sebastian
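
    The train-on-simulated, test-on-clinical workflow can be sketched in a few lines of scikit-learn; the arrays below are random placeholders standing in for the simulated and clinical ECG feature sets, and the feature count, label convention (left vs right outflow tract) and choice of classifier are assumptions made only for illustration.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)

        # Placeholder data: in the study these would be features extracted from the
        # 2,496 simulated 12-lead ECGs (training) and from the clinical ECGs (testing).
        n_sim, n_clin, n_feat = 2496, 365, 20
        X_sim = rng.normal(size=(n_sim, n_feat))
        y_sim = rng.integers(0, 2, size=n_sim)    # assumed labels: 0 = RVOT, 1 = LVOT
        X_clin = rng.normal(size=(n_clin, n_feat))
        y_clin = rng.integers(0, 2, size=n_clin)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)

        # Cross-validation on the synthetic training set, as described in the abstract.
        cv_acc = cross_val_score(clf, X_sim, y_sim, cv=5, scoring="accuracy")
        print(f"cross-validated accuracy on simulated data: {cv_acc.mean():.3f}")

        # Fit on all simulated data, then assess generalisation on the clinical set.
        clf.fit(X_sim, y_sim)
        print(f"accuracy on the clinical test set: {clf.score(X_clin, y_clin):.3f}")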

    Navigation of Distinct Euclidean Particles via Hierarchical Clustering

    We present a centralized online (completely reactive) hybrid navigation algorithm for bringing a swarm of n perfectly sensed and actuated point particles in Euclidean d-space (for arbitrary n and d) to an arbitrary goal configuration with the guarantee of no collisions along the way. Our construction entails a discrete abstraction of configurations using cluster hierarchies, and relies upon two recent prior constructions: (i) a family of hierarchy-preserving control policies and (ii) an abstract discrete dynamical system for navigating through the space of cluster hierarchies. Here, we relate the (combinatorial) topology of hierarchical clusters to the (continuous) topology of configurations by constructing “portals”, open sets of configurations supporting two adjacent hierarchies. The resulting online sequential composition, in which hierarchy-invariant swarming is followed by discrete selection of a hierarchy “closer” to that of the destination and its continuous instantiation via an appropriate portal configuration, yields a computationally effective construction of the desired navigation policy.
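
    The discrete abstraction step (a configuration of particles inducing a cluster hierarchy) can be illustrated with standard single-linkage clustering; the sketch below is only that illustration, under the assumption that single linkage is an acceptable stand-in for whatever clustering method the construction uses, and it does not implement the navigation policy itself.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, to_tree

        rng = np.random.default_rng(5)

        # A configuration of n point particles in Euclidean d-space.
        n, d = 6, 2
        config = rng.normal(size=(n, d))

        # Single-linkage clustering yields the cluster hierarchy associated with
        # this configuration (the discrete index used to organise navigation).
        Z = linkage(config, method="single")

        def describe(node, depth=0):
            """Print the nested cluster structure of the hierarchy."""
            if node.is_leaf():
                print("  " * depth + f"particle {node.id}")
            else:
                print("  " * depth + f"cluster (merge height {node.dist:.3f})")
                describe(node.get_left(), depth + 1)
                describe(node.get_right(), depth + 1)

        describe(to_tree(Z))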

    A semantic interoperability approach to support integration of gene expression and clinical data in breast cancer

    [Abstract] Introduction. The introduction of omics data and advances in the technologies involved in clinical treatment have led to a broad range of approaches to representing clinical information. Within this context, patient stratification across health institutions based on omic profiling presents a complex scenario for carrying out multi-center clinical trials. Methods. This paper presents a standards-based approach to ensure the semantic integration required to facilitate the analysis of clinico-genomic clinical trials. To ensure interoperability across different institutions, we have developed a Semantic Interoperability Layer (SIL) to facilitate homogeneous access to clinical and genetic information, based on different well-established biomedical standards and following Integrating the Healthcare Enterprise (IHE) recommendations. Results. The SIL has shown suitability for integrating biomedical knowledge and technologies to match the latest clinical advances in healthcare and the use of genomic information. This genomic data integration in the SIL has been tested with a diagnostic classifier tool that takes advantage of harmonized multi-center clinico-genomic data for training statistical predictive models. Conclusions. The SIL has been adopted in national and international research initiatives, such as the EURECA-EU research project and the CIMED collaborative Spanish project, where the proposed solution has been applied and evaluated by clinical experts focused on clinico-genomic studies. Funding: Instituto de Salud Carlos III, PI13/02020; Instituto de Salud Carlos III, PI13/0028
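
    A deliberately simplified, entirely hypothetical illustration of what such a semantic interoperability layer does at the data level: two institutions encode the same clinical fact with different local field names and value codings, and a thin mapping layer translates both onto one shared vocabulary so the records can be pooled for multi-center analysis. The field names, codes and mapping structure below are invented and do not reflect the actual SIL or the standards it uses.

        # Hypothetical site-specific vocabularies: both encode oestrogen-receptor
        # status, but with different field names and value codes (invented here).
        SITE_A_MAP = {"er_status": {"pos": "ER-positive", "neg": "ER-negative"}}
        SITE_B_MAP = {"estrogen_receptor": {"1": "ER-positive", "0": "ER-negative"}}

        def harmonise(record, site_map):
            """Translate a site-specific record into the shared vocabulary."""
            out = {}
            for local_field, value_map in site_map.items():
                if local_field in record:
                    out["er_status_harmonised"] = value_map[str(record[local_field])]
            return out

        print(harmonise({"er_status": "pos"}, SITE_A_MAP))        # from institution A
        print(harmonise({"estrogen_receptor": 1}, SITE_B_MAP))    # from institution B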

    Managing dose-, damage- and data-rates in multi-frame spectrum-imaging

    As an instrument, the scanning transmission electron microscope is unique in being able to explore simultaneously both local structural and chemical variations in materials at the atomic scale. This is made possible because both types of data are acquired serially, originating simultaneously from sample interactions with a sharply focused electron probe. Unfortunately, such scanned data can be distorted by environmental factors, though fast-scanned multi-frame imaging approaches have recently been shown to mitigate these effects. Here, we demonstrate the same approach optimized for spectroscopic data; we offer some perspectives on the new potential of multi-frame spectrum-imaging (MFSI) and show how dose-sharing approaches can reduce sample damage, improve crystallographic fidelity, increase the signal-to-noise ratio of the data, or maximize the usable field of view. Further, we discuss the potential issue of excessive data rates in MFSI and demonstrate a file-compression approach that significantly reduces data storage and transmission burdens.
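
    The two ideas of dose-sharing and data-rate management can be illustrated with synthetic Poisson data (a sketch under assumed sizes and noise levels, not the acquisition or compression scheme used in the paper): the same total dose is either spent in one slow pass or spread over many fast frames that are summed afterwards, and the mostly-empty individual frames compress well with a generic codec.

        import gzip
        import numpy as np

        rng = np.random.default_rng(6)

        # Synthetic "spectrum image": 32x32 scan positions, 256 energy channels.
        ny, nx, n_e = 32, 32, 256
        true_rate = rng.random((ny, nx, n_e)) * 0.5       # expected counts per channel
        n_frames = 16                                     # fast frames sharing the dose

        single_pass = rng.poisson(true_rate * n_frames)
        frames = rng.poisson(np.broadcast_to(true_rate, (n_frames, ny, nx, n_e)))
        summed = frames.sum(axis=0)

        # The summed multi-frame stack carries a comparable total signal ...
        print("total counts:", single_pass.sum(), summed.sum())

        # ... but each low-dose frame is sparse, so the stack compresses well,
        # easing the storage and transmission burden mentioned above.
        raw = frames.astype(np.uint16).tobytes()
        packed = gzip.compress(raw, compresslevel=6)
        print(f"raw {len(raw)/1e6:.1f} MB -> gzip {len(packed)/1e6:.1f} MB "
              f"({len(packed)/len(raw):.1%} of original)")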