
    Evolution of the use of corticosteroids for the treatment of hospitalised COVID-19 patients in Spain between March and November 2020: SEMI-COVID national registry

    Objectives: Since the results of the RECOVERY trial, WHO recommendations on the use of corticosteroids (CTs) in COVID-19 have changed. The aim of this study was to analyse the evolving use of CTs in Spain during the pandemic and to assess the potential influence of the new recommendations. Material and methods: A retrospective, descriptive, observational study was conducted on adults hospitalised for COVID-19 in Spain who were included in the SEMI-COVID-19 Registry from March to November 2020. Results: CTs were used in 6053 (36.21%) of the included patients. These patients were older (mean (SD): 69.6 (14.6) vs. 66.0 (16.8) years; p < 0.001), with a higher prevalence of hypertension (57.0% vs. 47.7%; p < 0.001), obesity (26.4% vs. 19.3%; p < 0.0001), and multimorbidity (20.6% vs. 16.1%; p < 0.001). They also had higher values (mean (95% CI)) of C-reactive protein (CRP) (86 (32.7-160) vs. 49.3 (16-109) mg/dL; p < 0.001), ferritin (791 (393-1534) vs. 470 (236-996) µg/dL; p < 0.001), and D-dimer (750 (430-1400) vs. 617 (345-1180) µg/dL; p < 0.001), and lower SpO2/FiO2 (266 (91.1) vs. 301 (101); p < 0.001). From June 2020 onwards, there was an increase in the use of CTs (March vs. September; p < 0.001). Overall, 20% of patients did not receive steroids, and 40% received less than 200 mg accumulated prednisone equivalent dose (APED). Severe patients were treated with higher doses. A mortality benefit was observed in patients with oxygen saturation ≤90%. Conclusions: Patients with greater comorbidity, severity, and inflammatory markers were those treated with CTs. In severe patients there was a trend towards higher doses, and a mortality benefit was observed in patients with oxygen saturation ≤90%.
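The accumulated prednisone equivalent dose (APED) used above can be illustrated with a short sketch. The conversion factors below are the commonly cited glucocorticoid equivalences, assumed here for illustration; they are not taken from the SEMI-COVID-19 registry itself.

```python
# Illustrative APED calculation. The equivalence factors are the standard
# textbook values (prednisone 5 mg ~ dexamethasone 0.75 mg ~ hydrocortisone
# 20 mg ~ methylprednisolone 4 mg) -- an assumption for this sketch, not a
# parameter of the registry study.
PREDNISONE_EQUIVALENCE = {
    "prednisone": 1.0,
    "methylprednisolone": 5.0 / 4.0,
    "dexamethasone": 5.0 / 0.75,
    "hydrocortisone": 5.0 / 20.0,
}

def aped_mg(courses):
    """Sum (drug, daily_dose_mg, days) triples into prednisone-equivalent mg."""
    return sum(dose * days * PREDNISONE_EQUIVALENCE[drug]
               for drug, dose, days in courses)

# A RECOVERY-style course of dexamethasone 6 mg/day for 10 days:
total = aped_mg([("dexamethasone", 6.0, 10)])
assert round(total) == 400  # well above the 200 mg APED threshold
```

With this convention, even the modest RECOVERY dexamethasone regimen exceeds the 200 mg APED cut-off discussed in the abstract.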

    Tagging and recapture studies of marine species

    The results obtained from tagging and subsequently recapturing individuals are a very valuable tool for improving our knowledge of the biology and ecology of a species, shedding light on aspects such as growth, movements and migrations, mortality and survival, abundance and distribution, habitat, and the differentiation of populations or stocks. Tagging techniques are currently applied to many species, both terrestrial and marine, across diverse zoological groups: fish, crustaceans, reptiles, molluscs, and mammals. This book reviews examples of tagging of marine species of commercial interest. Not all species can be tagged, because a series of requirements must be met for a tagging experiment to succeed; one section of this guide describes the aspects to take into account in order to obtain good results. The book describes the main tagging projects currently under way at the Instituto Español de Oceanografía (IEO). For each species there is first a brief description of its distribution, growth, reproduction, feeding, and so on, followed by the tagging information: the surveys carried out, the number of individuals tagged, and some of the results obtained to date from the available recaptures. For some species, such as bluefin tuna, tagging programmes have been running for more than 20 years, so the available information is quite extensive; for others, such as hake, the projects are relatively recent, but the results are nonetheless interesting and promising.
    Nowadays many different marine animals are being tagged, and this book summarizes recent tagging programmes carried out by the Spanish Institute of Oceanography (IEO). Although the objectives of these studies depend on the species and on each particular project, the general aim is to better understand the biology and ecology of these animals, the structure and dynamics of their populations, and their capacity to respond to human activities. The book provides an overview of different aspects of this technique: a brief history of tagging; the types of tags currently used, both conventional and electronic; where and how to attach them to marine animals; some recommendations on how to perform a tagging survey; and where to go or what to do if a tagged fish or other marine animal is recovered. It then summarizes the main species tagged by the IEO, giving a short description of their biology followed by some of the results obtained from the tagging studies undertaken so far. Other applications include mapping spatial distribution (spawning or feeding areas) and estimating growth parameters, mortality and survival rates, longevity, population size, and stock identity. Advances in electronics have also opened new fields, such as the possibility of tracking an animal and learning its habitat preferences and behaviour; some of these tags can record such information over long periods and transmit the data over long distances, even without the need to recover the animal. Tagging is thus a very useful tool for improving our knowledge of many species and for contributing to their management and conservation. For that reason this methodology is included in many IEO projects in which other activities, such as monitoring of the fishery (landings, fishing effort, fleet characteristics, fishing areas, biological sampling, etc.), are also carried out. Some projects concern coastal pelagic fisheries, including anchovy, sardine, and mackerel, or oceanic pelagic fisheries such as tuna, billfish, and pelagic sharks; others focus on benthic and demersal species such as hake, blackspot seabream, anglerfish, and flatfish. Nevertheless, not all species can be tagged, as they must survive being caught and handled before release; for this reason tagging techniques cannot easily be applied to some species.
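As a concrete example of how recaptures feed into abundance estimates, the classic Lincoln-Petersen estimator (a standard mark-recapture formula, not necessarily the one used in the IEO projects) turns one tagging event and one later sample into a population-size estimate:

```python
def lincoln_petersen(marked, caught, recaptured):
    """Estimate population size N from one mark-recapture experiment.

    marked:     number of animals tagged and released (M)
    caught:     size of a later sample (C)
    recaptured: tagged animals found in that sample (R)
    Assumes a closed population and equal catchability: N ~ M * C / R.
    """
    if recaptured == 0:
        raise ValueError("no recaptures: estimator is undefined")
    return marked * caught / recaptured

# Hypothetical numbers: 1000 fish tagged, 500 caught later, 25 of them tagged
assert lincoln_petersen(1000, 500, 25) == 20000.0
```

The intuition is that the fraction of tagged fish in the second sample (R/C) estimates the fraction of the whole population that was tagged (M/N).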

    International nosocomial infection control consortium (INICC) report, data summary of 36 countries, for 2004-2009

    The results of a surveillance study conducted by the International Nosocomial Infection Control Consortium (INICC) from January 2004 through December 2009 in 422 intensive care units (ICUs) of 36 countries in Latin America, Asia, Africa, and Europe are reported. During the 6-year study period, using Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN; formerly the National Nosocomial Infection Surveillance system [NNIS]) definitions for device-associated health care-associated infections, we gathered prospective data from 313,008 patients hospitalized in the consortium's ICUs for an aggregate of 2,194,897 ICU bed-days. Although device use in the developing countries' ICUs was remarkably similar to that reported in US ICUs in the CDC's NHSN, rates of device-associated nosocomial infection were significantly higher in the ICUs of the INICC hospitals: the pooled rate of central line-associated bloodstream infection in the INICC ICUs, 6.8 per 1,000 central line-days, was more than 3-fold higher than the 2.0 per 1,000 central line-days reported in comparable US ICUs. The overall rate of ventilator-associated pneumonia was also far higher (15.8 vs. 3.3 per 1,000 ventilator-days), as was the rate of catheter-associated urinary tract infection (6.3 vs. 3.3 per 1,000 catheter-days). Notably, the frequencies of resistance of Pseudomonas aeruginosa isolates to imipenem (47.2% vs. 23.0%), Klebsiella pneumoniae isolates to ceftazidime (76.3% vs. 27.1%), Escherichia coli isolates to ceftazidime (66.7% vs. 8.1%), and Staphylococcus aureus isolates to methicillin (84.4% vs. 56.8%) were also higher in the consortium's ICUs, and the crude unadjusted excess mortality of device-related infections ranged from 7.3% (for catheter-associated urinary tract infection) to 15.2% (for ventilator-associated pneumonia). Copyright © 2012 by the Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
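The rates quoted above are infections per 1,000 device-days, the NHSN-style denominator that normalizes for exposure time. A minimal sketch (hypothetical helper and counts, chosen only to reproduce the pooled figures from the abstract):

```python
def rate_per_1000_device_days(infections, device_days):
    """Device-associated infection rate per 1,000 device-days of exposure."""
    return 1000.0 * infections / device_days

# Hypothetical counts that reproduce the pooled INICC CLABSI rate of
# 6.8 per 1,000 central line-days quoted in the abstract.
assert rate_per_1000_device_days(68, 10000) == 6.8
# Compared with the US benchmark of 2.0, that is more than 3-fold higher:
assert rate_per_1000_device_days(68, 10000) / 2.0 > 3.0
```

Normalizing by device-days rather than by patients is what makes the INICC and US NHSN rates directly comparable despite very different lengths of stay.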

    The forward physics facility at the high-luminosity LHC

    High-energy collisions at the High-Luminosity Large Hadron Collider (LHC) produce a large number of particles along the beam collision axis, outside the acceptance of the existing LHC experiments. The proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe Standard Model (SM) processes and search for physics beyond the Standard Model (BSM). In this report, we review the status of the civil engineering plans and of the experiments designed to explore the diverse physics signals that can be uniquely probed in the forward region. FPF experiments will be sensitive to a broad range of BSM physics through searches for new particle scattering or decay signatures, and through deviations from SM expectations in high-statistics analyses with TeV neutrinos in this low-background environment. High-statistics neutrino detection will also provide valuable data on fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will also allow synergies between forward particle production at the LHC and astroparticle physics to be exploited. We report here on these physics topics, on infrastructure, detector, and simulation studies, and on future directions for realizing the FPF's physics potential.

    Overview of the JET results in support of ITER


    Search for single vector-like B quark production and decay via B → bH(bb̄) in pp collisions at √s = 13 TeV with the ATLAS detector

    A search is presented for single production of a vector-like B quark decaying into a Standard Model b-quark and a Standard Model Higgs boson, which decays into a bb̄ pair. The search is carried out in 139 fb−1 of √s = 13 TeV proton-proton collision data collected by the ATLAS detector at the LHC between 2015 and 2018. No significant deviation from the Standard Model background prediction is observed, and mass-dependent exclusion limits at the 95% confidence level are set on the resonance production cross-section in several theoretical scenarios determined by the couplings cW, cZ, and cH between the B quark and the Standard Model W, Z, and Higgs bosons, respectively. For a vector-like B occurring as an isospin singlet, the search excludes values of cW greater than 0.45 for a B resonance mass (mB) between 1.0 and 1.2 TeV. For 1.2 TeV < mB < 2.0 TeV, cW values larger than 0.50–0.65 are excluded. If the B occurs as part of a (B, Y) doublet, the smallest excluded cZ coupling values range between 0.3 and 0.5 across the investigated resonance mass range 1.0 TeV < mB < 2.0 TeV.

    Search for resonances decaying into photon pairs in 139 fb−1 of pp collisions at √s = 13 TeV with the ATLAS detector

    Searches for new resonances in the diphoton final state, with spin 0 as predicted by theories with an extended Higgs sector and with spin 2 as in a warped extra-dimension benchmark model, are presented using 139 fb−1 of √s = 13 TeV pp collision data collected by the ATLAS experiment at the LHC. No significant deviation from the Standard Model is observed, and upper limits are placed on the production cross-section times branching ratio to two photons as a function of the resonance mass.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suited to the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented as an end-to-end set of GPU-optimized algorithms, written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude over the equivalent CPU version: simulating the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
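The per-pixel structure that makes this simulation GPU-friendly can be sketched as follows. This is a hedged toy model, not the DUNE larnd-sim code: each charge cluster deposits a Gaussian-diffused signal on a 2D pixel grid, and the loop body is the kind of independent per-pixel work that Numba's `@cuda.jit` would map to one CUDA thread per pixel; here it is written with plain NumPy so it runs anywhere.

```python
import numpy as np

# Toy stand-in for a pixelated charge-readout simulation (illustrative
# only). Each drifting charge cluster contributes a Gaussian-spread
# signal to every pixel; on a GPU each (pixel, charge) contribution is
# independent, which is why the chain parallelizes so well.
def induced_charge(pixels_x, pixels_y, charges, sigma=1.0):
    """Sum the Gaussian response of every charge on every pixel.

    charges: array of shape (N, 3) holding (x, y, q) per charge cluster.
    Returns an array of shape (len(pixels_x), len(pixels_y)).
    """
    px, py = np.meshgrid(pixels_x, pixels_y, indexing="ij")
    signal = np.zeros_like(px, dtype=float)
    for x, y, q in charges:  # on GPU: one thread per pixel, looping charges
        r2 = (px - x) ** 2 + (py - y) ** 2
        signal += q * np.exp(-r2 / (2.0 * sigma**2))
    return signal

charges = np.array([[0.0, 0.0, 100.0], [2.0, 2.0, 50.0]])
grid = induced_charge(np.arange(-2, 3), np.arange(-2, 3), charges)
assert grid.shape == (5, 5)
# The pixel at (0, 0) -- grid index (2, 2) -- collects the most charge.
assert grid.argmax() == np.ravel_multi_index((2, 2), grid.shape)
```

In the real chain the response model is far more detailed (drift, diffusion, induced current shapes), but the data-parallel pattern is the same.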

    Beam-induced backgrounds measured in the ATLAS detector during local gas injection into the LHC beam vacuum

    Inelastic beam-gas collisions at the Large Hadron Collider (LHC), within a few hundred metres of the ATLAS experiment, are known to give the dominant contribution to beam backgrounds. These are monitored by ATLAS with a dedicated Beam Conditions Monitor (BCM) and with the rate of fake jets in the calorimeters. The two methods are complementary, since the BCM probes backgrounds just around the beam pipe while fake jets are observed at radii of up to several metres. To quantify the correlation between the residual gas density in the LHC beam vacuum and the experimental backgrounds recorded by ATLAS, several dedicated tests were performed during LHC Run 2. Local pressure bumps, with a gas density several orders of magnitude higher than during normal operation, were introduced at different locations. The changes in beam-related backgrounds seen in ATLAS are correlated with the local pressure variation. In addition, the rates of beam-gas events are estimated from the pressure measurements and from pressure-bump profiles obtained from calculations. Using these rates, the efficiency of the ATLAS beam background monitors to detect beam-gas events is derived as a function of distance from the interaction point. These efficiencies and the characteristic distributions of fake jets from beam backgrounds are found to be in good agreement with the results of beam-gas simulations performed with the FLUKA Monte Carlo programme.

    Search for boosted diphoton resonances in the 10 to 70 GeV mass range using 138 fb−1 of 13 TeV pp collisions with the ATLAS detector

    A search for diphoton resonances in the mass range between 10 and 70 GeV with the ATLAS experiment at the Large Hadron Collider (LHC) is presented. The analysis is based on pp collision data corresponding to an integrated luminosity of 138 fb−1 at a centre-of-mass energy of 13 TeV recorded from 2015 to 2018. Previous searches for diphoton resonances at the LHC have explored masses down to 65 GeV, finding no evidence of new particles. This search exploits the particular kinematics of events with pairs of closely spaced photons reconstructed in the detector, allowing invariant masses down to 10 GeV to be examined. The presented strategy covers a region previously unexplored at hadron colliders because of the experimental challenges of recording low-energy photons and estimating the backgrounds. No significant excess is observed, and the reported limits provide the strongest bounds on promptly decaying axion-like particles coupling to gluons and photons for masses between 10 and 70 GeV.