
    Lempel-Ziv Complexity Analysis for the Evaluation of Atrial Fibrillation Organization

    The Lempel-Ziv (LZ) complexity is a non-linear time series analysis metric that reflects the rate at which new patterns arise along a sequence. It thus captures the temporal structure of the series and, quite conveniently, can be computed on short data segments. In the present work, a detailed analysis of LZ complexity is presented within the context of atrial fibrillation (AF) organization estimation. As the analysed time series depend on the original sampling rate (fs), we evaluated the relationship between LZ complexity and fs. Furthermore, different implementations of LZ complexity were tested. Our results show the usefulness of LZ complexity for estimating AF organization and suggest that signals from a terminating paroxysmal AF group are more organized (i.e. less complex) than those from a non-terminating paroxysmal AF group. However, the diagnostic accuracy was not as high as that obtained with sample entropy (SampEn), another non-linear metric, on the same database in a previous study (92% vs. 96%). Nevertheless, LZ complexity analysis of AF organization at sampling frequencies above 2048 Hz, or its combination with SampEn or other non-linear metrics, might improve the prediction of spontaneous AF termination.
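    The "arising rate of new patterns" that LZ complexity counts can be made concrete with a short sketch. Below is a minimal Python version of the classic Lempel-Ziv (1976) phrase-counting procedure on a symbolic sequence; this is an illustrative reference implementation, not any of the specific variants compared in the paper, and in the AF setting the electrogram would first have to be discretized (e.g. binarized around its median):

```python
def lz_complexity(s):
    """Count the number of distinct phrases in sequence s (Lempel-Ziv 1976)."""
    i, c, n = 0, 1, len(s)       # i: search position, c: phrase count
    u, v, vmax = 1, 1, 1         # u: parsed prefix length, v: candidate length
    while u + v <= n:
        if s[i + v - 1] == s[u + v - 1]:
            v += 1               # current candidate still reproducible from prefix
        else:
            vmax = max(v, vmax)
            i += 1
            if i == u:           # exhausted the prefix: a new phrase starts here
                c += 1
                u += vmax
                v, i, vmax = 1, 0, 1
            else:
                v = 1
    if v != 1:
        c += 1                   # account for the final, unfinished phrase
    return c
```

    A constant sequence yields the minimum count of 2 phrases, while alternating or irregular sequences score higher; in practice the raw count is normalized (commonly by n / log2(n)) so that values are comparable across record lengths.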

    Optimized assessment of atrial fibrillation organization through suitable parameters of Sample Entropy

    Sample Entropy (SampEn) is a nonlinear regularity index that requires the a priori selection of three parameters: the length of the sequences to be compared, m; the pattern-similarity tolerance, r; and the number of samples under analysis, N. Appropriate values for m, r and N have been recommended for some signals, such as heart rate or hormonal data, but no general guidelines exist for their selection. Hence, a study of optimal parameters is required before SampEn is applied to biomedical signals not previously analyzed. In this work, a thorough analysis of the optimal SampEn parameter values is presented within two different scenarios of AF organization estimation: the prediction of paroxysmal AF termination and of the electrical cardioversion outcome in persistent AF. Results indicated that (i) the ratio between N and the sampling rate (fs) should exceed one second, with fs ≥ 256 Hz; (ii) overlapping between adjacent N-length windows does not improve organization estimation; and (iii) the values of m and r maximizing classification should be sought within a range wider than that proposed in the literature for heart rate analysis, i.e. m = 1 and m = 2, and r between 0.1 and 0.25 times the standard deviation of the data.
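    As a concrete reference for the roles of m, r and N, here is a minimal Python sketch of SampEn under its usual Chebyshev-distance definition; this is a straightforward O(N²) illustration, not the optimized code used in the study:

```python
import numpy as np

def sampen(x, m=2, r_frac=0.2):
    """Sample Entropy of series x: template length m, tolerance r = r_frac * std(x)."""
    x = np.asarray(x, dtype=float)
    n, r = len(x), r_frac * np.std(x)

    def matches(mm):
        # all overlapping templates of length mm
        t = np.array([x[i:i + mm] for i in range(n - mm)])
        total = 0
        for i in range(len(t)):
            d = np.max(np.abs(t - t[i]), axis=1)  # Chebyshev distance to every template
            total += np.sum(d <= r) - 1           # count matches, excluding self
        return total

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -np.log(a / b)
```

    A highly regular series (e.g. a strict alternation) gives a value near zero, since almost every length-m match extends to length m+1, while irregular series give larger values; the choices of m, r and N studied in the paper control exactly this match-counting behaviour.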

    Cosmic Ray Anomalies from the MSSM?

    The recent positron excess in cosmic rays (CR) observed by the PAMELA satellite may be a signal for dark matter (DM) annihilation. When these measurements are combined with those from FERMI on the total $e^+ + e^-$ flux and from PAMELA itself on the $\bar p/p$ ratio, these and other results are difficult to reconcile with traditional models of DM, including the conventional mSUGRA version of Supersymmetry, even if boosts as large as $10^{3-4}$ are allowed. In this paper, we combine the results of a previously obtained scan over a more general 19-parameter subspace of the MSSM with a corresponding scan over astrophysical parameters that describe the propagation of CR. We then ascertain whether or not a good fit to these CR data can be obtained with relatively small boost factors while simultaneously satisfying the additional constraints arising from gamma ray data. We find that a specific subclass of MSSM models, where the LSP is mostly pure bino and annihilates almost exclusively into $\tau$ pairs, comes very close to satisfying these requirements. The lightest $\tilde \tau$ in this set of models is found to be relatively close in mass to the LSP and is in some cases the nLSP. These models lead to a significant improvement in the overall fit to the data, by an amount $\Delta \chi^2 \sim 1$/dof in comparison to the best fit without Supersymmetry, while employing boosts $\sim 100$. The implications of these models for future experiments are discussed. Comment: 57 pages, 31 figures, references added

    Haiku - a Scala combinator toolkit for semi-automated composition of metaheuristics

    There is an emerging trend towards the automated design of metaheuristics at the software component level. In principle, metaheuristics have a relatively clean decomposition, in which well-known frameworks such as ILS and EA are parametrised by variant components for acceptance, perturbation, etc. Automated generation of these frameworks is not so simple in practice, since the coupling between components may be implementation-specific. Compositionality is the ability to freely express a space of designs ‘bottom up’ in terms of elementary components: previous work in this area has used combinators, a modular and functional approach to componentisation arising from foundational Computer Science. In this article, we describe Haiku, a combinator toolkit written in the Scala language, which builds upon previous work to further automate the process by automatically composing the external dependencies of components. We provide examples of use and give a case study in which a programmatically-generated heuristic is applied to the Travelling Salesman Problem within an Evolutionary Strategies framework.
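    The component-level decomposition described above can be illustrated in spirit with a few lines of Python: a framework (here ILS, on a toy one-dimensional problem) is parametrised by pluggable initialisation, perturbation, local-search and acceptance components. This is only a hypothetical sketch of the decomposition idea; Haiku itself is a Scala combinator library and additionally composes inter-component dependencies automatically:

```python
import random

def iterated_local_search(init, perturb, local_search, accept, cost, iters=200, seed=1):
    """ILS skeleton: the framework is fixed, the components are swappable."""
    random.seed(seed)
    best = current = local_search(init())
    for _ in range(iters):
        candidate = local_search(perturb(current))   # kick, then re-optimise
        current = accept(current, candidate, cost)   # acceptance component decides
        if cost(current) < cost(best):
            best = current
    return best

# Toy problem (hypothetical example): minimise x**2 over the integers.
cost = lambda x: x * x
init = lambda: 50
perturb = lambda x: x + random.choice([-3, 3])
accept = lambda cur, cand, cost: cand if cost(cand) <= cost(cur) else cur

def local_search(x):
    # greedy descent to the nearest local optimum
    while cost(x - 1) < cost(x) or cost(x + 1) < cost(x):
        x = x - 1 if cost(x - 1) < cost(x) else x + 1
    return x

print(iterated_local_search(init, perturb, local_search, accept, cost))
```

    Swapping in a different `accept` (e.g. a simulated-annealing-style probabilistic one) changes the metaheuristic without touching the framework, which is the coupling problem combinator composition is meant to manage.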

    Relative entanglement entropies in 1 + 1-dimensional conformal field theories

    We study the relative entanglement entropies of one interval between excited states of a 1+1 dimensional conformal field theory (CFT). To compute the relative entropy S(ρ1‖ρ0) between two given reduced density matrices ρ1 and ρ0 of a quantum field theory, we employ the replica trick, which relies on the path integral representation of Tr(ρ1 ρ0^{n-1}), and define a set of Rényi relative entropies S_n(ρ1‖ρ0). We compute these quantities for integer values of the parameter n and derive, via the replica limit, the relative entropy between excited states generated by primary fields of a free massless bosonic field. In particular, we provide the relative entanglement entropy of the state described by the primary operator i∂φ, both with respect to the ground state and to the state generated by chiral vertex operators. These predictions are tested against exact numerical calculations in the XX spin chain, finding perfect agreement. © 2017, The Author(s)

    Search for Charged Higgs Bosons in e+e- Collisions at \sqrt{s} = 189 GeV

    A search for pair-produced charged Higgs bosons is performed with the L3 detector at LEP using data collected at a centre-of-mass energy of 188.6 GeV, corresponding to an integrated luminosity of 176.4 pb^-1. Higgs decays into a charm and a strange quark or into a tau lepton and its associated neutrino are considered. The observed events are consistent with the expectations from Standard Model background processes. A lower limit of 65.5 GeV on the charged Higgs mass is derived at 95% confidence level, independent of the decay branching ratio Br(H^{+/-} -> tau nu).

    Search for the standard model Higgs boson at LEP


    Azimuthal anisotropy of charged particles at high transverse momenta in PbPb collisions at sqrt(s[NN]) = 2.76 TeV

    The azimuthal anisotropy of charged particles in PbPb collisions at a nucleon-nucleon center-of-mass energy of 2.76 TeV is measured with the CMS detector at the LHC over an extended transverse momentum (pt) range up to approximately 60 GeV. The data cover both the low-pt region associated with hydrodynamic flow phenomena and the high-pt region where the anisotropies may reflect the path-length dependence of parton energy loss in the created medium. The anisotropy parameter (v2) of the particles is extracted by correlating charged tracks with respect to the event plane reconstructed using the energy deposited in forward-angle calorimeters. For the six bins of collision centrality studied, spanning the range of 0-60% most-central events, the observed v2 values are found to first increase with pt, reaching a maximum around pt = 3 GeV, and then to gradually decrease to almost zero, with the decline persisting up to at least pt = 40 GeV over the full centrality range measured. Comment: Replaced with published version. Added journal reference and DOI

    Search for new physics with same-sign isolated dilepton events with jets and missing transverse energy

    A search for new physics is performed in events with two same-sign isolated leptons, hadronic jets, and missing transverse energy in the final state. The analysis is based on a data sample corresponding to an integrated luminosity of 4.98 inverse femtobarns of pp collisions at a center-of-mass energy of 7 TeV collected by the CMS experiment at the LHC. This constitutes a factor of 140 increase in integrated luminosity over previously published results. The observed yields agree with the standard model predictions and thus no evidence for new physics is found. The observations are used to set upper limits on possible new physics contributions and to constrain supersymmetric models. To facilitate the interpretation of the data in a broader range of new physics scenarios, information on the event selection, detector response, and efficiencies is provided. Comment: Published in Physical Review Letters

    Compressed representation of a partially defined integer function over multiple arguments

    In OLAP (OnLine Analytical Processing), data are analysed in an n-dimensional cube. The cube may be represented as a partially defined function over n arguments. Considering that the function is often not defined everywhere, we ask: is there a known way of representing the function, or the points at which it is defined, in a more compact manner than the trivial one?
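    The "trivial" baseline the question improves upon can be pictured as a dense n-dimensional array with a placeholder in every undefined cell; the simplest sparse alternative stores only the defined points. A minimal Python sketch, with hypothetical example coordinates, of such a sparse cube:

```python
# A partially defined function over n = 3 arguments, stored sparsely:
# only the points where the function is defined are kept, as a mapping
# from argument tuples to values (coordinates below are illustrative).
cube = {
    (2011, "EU", "books"): 120,
    (2011, "US", "books"): 95,
    (2012, "EU", "music"): 40,
}

def lookup(point):
    """Return the value at `point`, or None where the function is undefined."""
    return cube.get(point)
```

    Storage here grows with the number of defined points rather than with the product of the dimension sizes, which is the gap more sophisticated compressed representations aim to tighten further.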