31 research outputs found

    Reflex and Tonic Autonomic Markers for Risk Stratification in Patients With Type 2 Diabetes Surviving Acute Myocardial Infarction

    OBJECTIVE Diabetic postinfarction patients are at increased mortality risk compared with nondiabetic postinfarction patients. In a substantial number of these patients, diabetic cardiac neuropathy already preexists at the time of the infarction. In the current study we investigated whether markers of autonomic dysfunction can further discriminate diabetic postinfarction patients into low- and high-risk groups. RESEARCH DESIGN AND METHODS We prospectively enrolled 481 patients with type 2 diabetes who survived acute myocardial infarction (MI), were aged ≤80 years, and presented in sinus rhythm. The primary end point was total mortality at 5 years of follow-up. Severe autonomic failure (SAF) was defined as the coincidence of abnormal autonomic reflex function (assessed by means of heart rate turbulence) and abnormal autonomic tonic activity (assessed by means of deceleration capacity of heart rate). Multivariable risk analyses considered SAF and standard risk predictors including history of previous MI, arrhythmia on Holter monitoring, insulin treatment, and impaired left ventricular ejection fraction (LVEF) ≤30%. RESULTS During follow-up, 83 of the 481 patients (17.3%) died. Of these, 24 deaths were sudden cardiac deaths and 21 were nonsudden cardiac deaths. SAF identified a high-risk group of 58 patients with a 5-year mortality rate of 64.0% at a sensitivity level of 38.0%. Multivariately, SAF was the strongest predictor of mortality (hazard ratio 4.9 [95% CI 2.4–9.9]), followed by age ≥65 years (3.4 [1.9–5.8]) and LVEF ≤30% (2.6 [1.5–4.4]). CONCLUSIONS Combined abnormalities of autonomic reflex function and autonomic tonic activity identify diabetic postinfarction patients with very poor prognoses.

    H-1 and C-13 NMR study of perdeuterated pyrazoles

    The ^{1}H and ^{13}C chemical shifts as well as the ^{1}H–^{2}H and ^{2}H–^{13}C coupling constants of perdeuterated 3,5-dimethylpyrazole and 3,5-diphenylpyrazole have been measured and the values compared with those of the unlabelled compounds.

    Riparian ecosystems in the 21st century: hotspots for climate change adaptation?

    Riparian ecosystems in the 21st century are likely to play a critical role in determining the vulnerability of natural and human systems to climate change, and in influencing the capacity of these systems to adapt. Some authors have suggested that riparian ecosystems are particularly vulnerable to climate change impacts due to their high levels of exposure and sensitivity to climatic stimuli, and their history of degradation. Others have highlighted the probable resilience of riparian ecosystems to climate change as a result of their evolution under high levels of climatic and environmental variability. We synthesize current knowledge of the vulnerability of riparian ecosystems to climate change by assessing the potential exposure, sensitivity, and adaptive capacity of their key components and processes, as well as ecosystem functions, goods and services, to projected global climatic changes. We review key pathways for ecological and human adaptation for the maintenance, restoration and enhancement of riparian ecosystem functions, goods and services, and present emerging principles for planned adaptation. Our synthesis suggests that, in the absence of adaptation, riparian ecosystems are likely to be highly vulnerable to climate change impacts. However, given the critical role of riparian ecosystem functions in landscapes, as well as the strong links between riparian ecosystems and human well-being, considerable means, motives and opportunities for strategically planned adaptation to climate change also exist. The need for planned adaptation of and for riparian ecosystems is likely to be strengthened as the importance of many riparian ecosystem functions, goods and services grows under a changing climate. Consequently, riparian ecosystems are likely to become adaptation 'hotspots' as the century unfolds.

    Deep Underground Neutrino Experiment (DUNE) Near Detector Conceptual Design Report

    The Deep Underground Neutrino Experiment (DUNE) is an international, world-class experiment aimed at exploring fundamental questions about the universe that are at the forefront of astrophysics and particle physics research. DUNE will study questions pertaining to the preponderance of matter over antimatter in the early universe, the dynamics of supernovae, the subtleties of neutrino interaction physics, and a number of beyond the Standard Model topics accessible in a powerful neutrino beam. A critical component of the DUNE physics program involves the study of changes in a powerful beam of neutrinos, i.e., neutrino oscillations, as the neutrinos propagate a long distance. The experiment consists of a near detector, sited close to the source of the beam, and a far detector, sited along the beam at a large distance. This document, the DUNE Near Detector Conceptual Design Report (CDR), describes the design of the DUNE near detector and the science program that drives the design and technology choices. The goals and requirements underlying the design, along with the projected performance, are given. It serves as a starting point for a more detailed design that will be described in future documents.

    Scintillation light detection in the 6-m drift-length ProtoDUNE Dual Phase liquid argon TPC

    DUNE is a dual-site experiment for long-baseline neutrino oscillation studies, neutrino astrophysics and nucleon decay searches. ProtoDUNE Dual Phase (DP) is a 6 × 6 × 6 m^3 liquid argon time-projection-chamber (LArTPC) that recorded cosmic-muon data at the CERN Neutrino Platform in 2019–2020 as a prototype of the DUNE Far Detector. Charged particles propagating through the LArTPC produce ionization and scintillation light. The scintillation light signal in these detectors can provide the trigger for non-beam events. In addition, it adds precise timing capabilities and improves the calorimetry measurements. In ProtoDUNE-DP, scintillation and electroluminescence light produced by cosmic muons in the LArTPC is collected by photomultiplier tubes placed up to 7 m away from the ionizing track. In this paper, the ProtoDUNE-DP photon detection system performance is evaluated with a particular focus on the different wavelength shifters, such as PEN and TPB, and the use of Xe-doped LAr, considering its future use in giant LArTPCs. The scintillation light production and propagation processes are analyzed and a comparison of simulation to data is performed, improving understanding of the liquid argon properties.

    DUNE Offline Computing Conceptual Design Report

    This document describes the offline software and computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.

    The DUNE Far Detector Vertical Drift Technology, Technical Design Report

    DUNE is an international experiment dedicated to addressing some of the questions at the forefront of particle physics and astrophysics, including the mystifying preponderance of matter over antimatter in the early universe. The dual-site experiment will employ an intense neutrino beam focused on a near and a far detector as it aims to determine the neutrino mass hierarchy and to make high-precision measurements of the PMNS matrix parameters, including the CP-violating phase. It will also stand ready to observe supernova neutrino bursts, and seeks to observe nucleon decay as a signature of a grand unified theory underlying the Standard Model. The DUNE far detector implements liquid argon time-projection chamber (LArTPC) technology, and combines the many tens-of-kiloton fiducial mass necessary for rare event searches with the sub-centimeter spatial resolution required to image those events with high precision. The addition of a photon detection system enhances physics capabilities for all DUNE physics drivers and opens prospects for further physics explorations. Given its size, the far detector will be implemented as a set of modules, with LArTPC designs that differ from one another as newer technologies arise. In the vertical drift LArTPC design, a horizontal cathode bisects the detector, creating two stacked drift volumes in which ionization charges drift towards anodes at either the top or bottom. The anodes are composed of perforated PCB layers with conductive strips, enabling reconstruction in 3D. Light-trap-style photon detection modules are placed both on the cryostat's side walls and on the central cathode, where they are optically powered. This Technical Design Report describes in detail the technical implementations of each subsystem of this LArTPC that, together with the other far detector modules and the near detector, will enable DUNE to achieve its physics goals.

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.

    Separation of track- and shower-like energy deposits in ProtoDUNE-SP using a convolutional neural network

    Liquid argon time projection chamber detector technology provides high spatial and calorimetric resolutions on the charged particles traversing liquid argon. As a result, the technology has been used in a number of recent neutrino experiments, and is the technology of choice for the Deep Underground Neutrino Experiment (DUNE). In order to perform high-precision measurements of neutrinos in the detector, final state particles need to be effectively identified, and their energy accurately reconstructed. This article proposes an algorithm based on a convolutional neural network to perform the classification of energy deposits and reconstructed particles as track-like or arising from electromagnetic cascades. Results from testing the algorithm on experimental data from ProtoDUNE-SP, a prototype of the DUNE far detector, are presented. The network identifies track- and shower-like particles, as well as Michel electrons, with high efficiency. The performance of the algorithm is consistent between experimental data and simulation.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
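    The key point of the abstract above is that the per-pixel current computation is embarrassingly parallel: each (pixel, time-sample) pair is independent, so it maps naturally onto a CUDA thread grid. The following is a minimal serial sketch of that index space, not the experiment's actual code; the pulse model (`induced_current`), the drift-time offsets, and both function names are hypothetical stand-ins for illustration.

    ```python
    import math

    # Hypothetical per-pixel induced-current model: a toy Gaussian pulse
    # centred on the charge arrival time t0. The real simulator computes a
    # detailed detector response; independence across pixels is what matters.
    def induced_current(t, t0, sigma):
        return math.exp(-0.5 * ((t - t0) / sigma) ** 2)

    def simulate_pixels(n_pixels, n_samples, dt=0.1, sigma=0.5):
        """Serial stand-in for a GPU kernel: one loop iteration per
        (pixel, sample) pair, exactly the index space a Numba CUDA grid
        would cover with one thread per pair."""
        waveforms = [[0.0] * n_samples for _ in range(n_pixels)]
        for p in range(n_pixels):        # -> block/thread index on the GPU
            t0 = 0.2 * p                 # toy drift-time offset per pixel
            for s in range(n_samples):
                waveforms[p][s] = induced_current(s * dt, t0, sigma)
        return waveforms

    wf = simulate_pixels(n_pixels=4, n_samples=50)
    peak = max(wf[0])  # pixel 0 has t0 = 0, so its pulse peaks at the first sample
    ```

    Because no iteration reads another iteration's output, decorating such a loop body with Numba's `@cuda.jit` and replacing the outer loops with grid indices parallelizes it without restructuring the algorithm, which is the approach the abstract describes.
    
    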