51 research outputs found

    Culture-negative early-onset neonatal sepsis - at the crossroad between efficient sepsis care and antimicrobial stewardship

    Sepsis is a leading cause of mortality and morbidity in neonates. Presenting clinical symptoms are nonspecific, and the sensitivity and positive predictive value of biomarkers at the onset of symptoms are suboptimal. Clinical suspicion therefore frequently leads to empirical antibiotic therapy in uninfected infants. The incidence of culture-confirmed early-onset sepsis is rather low, around 0.4–0.8 per 1000 term infants in high-income countries, yet 6 to 16 times more infants receive therapy for culture-negative sepsis, i.e. in the absence of a positive blood culture. Thus, culture-negative sepsis contributes substantially to the high antibiotic consumption in neonatal units. Antibiotics may be life-saving for the few infants who are truly infected, but overuse of broad-spectrum antibiotics increases colonization with antibiotic-resistant bacteria. Antibiotic therapy also perturbs the non-resilient early-life microbiota, with a potentially long-lasting negative impact on the individual's own health. Currently there is no uniform consensus definition of neonatal sepsis, which leads to variations in management. Two factors may reduce the number of culture-negative sepsis cases. First, obtaining adequate blood cultures (0.5–1 mL) at symptom onset is mandatory; unless there is a strong clinical or biochemical indication to prolong antibiotics, physicians need to trust the culture results and stop antibiotics for suspected sepsis within 36–48 h. Second, an internationally robust and pragmatic neonatal sepsis definition is urgently needed. Neonatal sepsis is a dynamic condition; rigorous evaluation of clinical symptoms ("organ dysfunction") over 36–48 h, in combination with appropriately selected biomarkers ("dysregulated host response"), may be used to support or refute a sepsis diagnosis.
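The 36–48 h culture-based stopping rule described above can be sketched as a simple decision function. This is an illustrative sketch of the abstract's stewardship logic only, not a validated clinical algorithm; the function name, arguments, and threshold handling are our assumptions:

```python
def continue_antibiotics(hours_since_start: float,
                         culture_positive: bool,
                         strong_clinical_indication: bool) -> bool:
    """Illustrative stewardship rule from the abstract: trust the blood
    culture and stop empirical antibiotics for suspected early-onset
    sepsis within 36-48 h, unless the culture turns positive or there is
    a strong clinical or biochemical indication to prolong therapy."""
    if culture_positive or strong_clinical_indication:
        return True
    # Negative culture, no strong indication: stop at the 36-48 h review.
    return hours_since_start < 36
```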

    Global carbon budget 2013

    Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2 and land cover change (some including nitrogen–carbon interactions). All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. 
For the last decade available (2003–2012), EFF was 8.6 ± 0.4 GtC yr−1, ELUC 0.9 ± 0.5 GtC yr−1, GATM 4.3 ± 0.1 GtC yr−1, SOCEAN 2.5 ± 0.5 GtC yr−1, and SLAND 2.8 ± 0.8 GtC yr−1. For the year 2012 alone, EFF grew to 9.7 ± 0.5 GtC yr−1, 2.2% above 2011, reflecting a continued growing trend in these emissions; GATM was 5.1 ± 0.2 GtC yr−1, SOCEAN was 2.9 ± 0.5 GtC yr−1, and, assuming an ELUC of 1.0 ± 0.5 GtC yr−1 (based on the 2001–2010 average), SLAND was 2.7 ± 0.9 GtC yr−1. GATM was high in 2012 compared to the 2003–2012 average, almost entirely reflecting the high EFF. The global atmospheric CO2 concentration reached 392.52 ± 0.10 ppm averaged over 2012. We estimate that EFF will increase by 2.1% (1.1–3.1%) to 9.9 ± 0.5 GtC in 2013, 61% above emissions in 1990, based on projections of world gross domestic product and recent changes in the carbon intensity of the economy. With this projection, cumulative emissions of CO2 will reach about 535 ± 55 GtC for 1870–2013, about 70% from EFF (390 ± 20 GtC) and 30% from ELUC (145 ± 50 GtC).
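The residual land sink can be reproduced arithmetically from the budget's closure equation, EFF + ELUC = GATM + SOCEAN + SLAND. A minimal sketch, using the 2012 values quoted above (function names are ours; quadrature combination assumes the ±1σ uncertainties are independent):

```python
import math

def residual_land_sink(eff, eluc, gatm, socean):
    """S_LAND closes the budget: total emissions (E_FF + E_LUC) must
    equal the atmospheric growth rate plus the ocean and land sinks."""
    return eff + eluc - gatm - socean

def combined_uncertainty(*sigmas):
    """1-sigma uncertainties combined in quadrature
    (assumes independent errors)."""
    return math.sqrt(sum(s * s for s in sigmas))

# 2012 values from the abstract, in GtC/yr:
sland = residual_land_sink(eff=9.7, eluc=1.0, gatm=5.1, socean=2.9)  # -> 2.7
sigma = combined_uncertainty(0.5, 0.5, 0.2, 0.5)                     # -> ~0.9
```

The same arithmetic applied to the 2003–2012 decadal means (8.6 + 0.9 − 4.3 − 2.5 = 2.7) agrees with the reported 2.8 ± 0.8 GtC yr−1 within rounding.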

    Identification and reconstruction of low-energy electrons in the ProtoDUNE-SP detector

    Measurements of electrons from νe interactions are crucial for the Deep Underground Neutrino Experiment (DUNE) neutrino oscillation program, as well as for searches for physics beyond the standard model, supernova neutrino detection, and solar neutrino measurements. This article describes the selection and reconstruction of low-energy (Michel) electrons in the ProtoDUNE-SP detector. ProtoDUNE-SP is one of the prototypes for the DUNE far detector, built and operated at CERN as a charged-particle test beam experiment. A sample of low-energy electrons produced by the decay of cosmic muons is selected with a purity of 95%. This sample is used to calibrate the low-energy electron energy scale with two techniques. An electron energy calibration based on a cosmic ray muon sample uses calibration constants derived from measured and simulated cosmic ray muon events. Another calibration technique makes use of the theoretically well-understood Michel electron energy spectrum to convert reconstructed charge to electron energy. In addition, the effects of the detector response on the low-energy electron energy scale and its resolution, including readout-electronics threshold effects, are quantified. Finally, the relation between the theoretical and reconstructed low-energy electron energy spectrum is derived and the energy resolution is characterized. The low-energy electron selection presented here accounts for about 75% of the total electron deposited energy. After the addition of missing energy using a Monte Carlo simulation, the energy resolution improves from about 40% to 25% at 50 MeV. These results are used to validate the expected capabilities of the DUNE far detector to reconstruct low-energy electrons.
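The charge-to-energy conversion underlying both calibration techniques can be sketched as follows. The electron lifetime, recombination factor, and the correction structure here are illustrative placeholders, not the actual ProtoDUNE-SP calibration constants; only the liquid-argon ionization work function (~23.6 eV per electron-ion pair) is a standard value:

```python
import math

W_ION_MEV = 23.6e-6  # MeV per ionization electron in liquid argon (standard value)

def reconstructed_energy_mev(n_electrons_detected, drift_time_us,
                             electron_lifetime_us=3000.0,
                             recombination_factor=0.7):
    """Convert reconstructed charge (detected ionization electrons) into
    deposited energy. Two corrections are applied:
      1. attenuation from electron capture during drift (finite lifetime),
      2. charge lost to recombination at the ionization site.
    All parameter values here are illustrative, not detector constants."""
    n_drifted = n_electrons_detected * math.exp(drift_time_us / electron_lifetime_us)
    n_produced = n_drifted / recombination_factor
    return n_produced * W_ION_MEV
```

In the Michel-spectrum technique described in the abstract, an overall scale factor of this kind is effectively fitted so that the reconstructed spectrum matches the theoretical one, rather than taken from first principles.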

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speedup of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
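The per-channel structure that makes a pixelated readout GPU-friendly can be illustrated with a toy NumPy version of the charge-binning step. This is a sketch with invented geometry and pitch, not the simulator's actual algorithm: each drifted charge packet is deposited onto the pixel grid independently, which is exactly the pattern that maps onto one CUDA thread per packet (with atomic adds) or one thread per pixel:

```python
import numpy as np

def bin_charge_to_pixels(x, y, q, pixel_pitch=0.4, n_pixels=32):
    """Toy pixel charge-readout step: deposit drifted charge packets
    (x, y positions in cm, q in electrons) onto a 2D pixel grid.
    Packets are independent, so the loop parallelizes trivially on a GPU.
    The 0.4 cm pitch and 32x32 grid are invented for illustration."""
    ix = np.clip((x / pixel_pitch).astype(int), 0, n_pixels - 1)
    iy = np.clip((y / pixel_pitch).astype(int), 0, n_pixels - 1)
    grid = np.zeros((n_pixels, n_pixels))
    # Unbuffered scatter-add: the serial analogue of CUDA's atomicAdd.
    np.add.at(grid, (ix, iy), q)
    return grid
```

In the Numba workflow described above, a function of this shape would instead be decorated as a CUDA kernel and launched over a grid of threads, one per charge packet.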

    A low-voltage glow discharge tube
