
    Improved Methods for Acrylic-Free Implants in Non-Human Primates for Neuroscience Research

    Traditionally, head fixation devices and recording cylinders have been implanted in nonhuman primates (NHP) using dental acrylic despite several shortcomings associated with acrylic. The use of more biocompatible materials such as titanium and PEEK is becoming more prevalent in NHP research. We describe a cost-effective set of procedures that maximizes the integration of headposts and recording cylinders with the animal’s tissues while reducing surgery time. Nine rhesus monkeys were implanted with titanium headposts, and one of these was also implanted with a recording chamber. In each case, a three-dimensional printed replica of the skull was created based on computerized tomography scans. The titanium feet of the headposts were shaped, and the skull thickness was measured preoperatively, reducing surgery time by up to 70%. The recording cylinder was manufactured to conform tightly to the skull; it was fastened to the skull with four screws and remained watertight for 8.5 mo. We quantified the amount of regression of the skin edge at the headpost. We found a large degree of variability in the timing and extent of skin regression that could not be explained by any single recorded factor. However, there was not a single case of bone exposure; although skin retracted from the titanium, skin also remained adhered to the skull adjacent to those regions. The headposts remained fully functional and free of complications for the experimental life of each animal, several of which are still participating in experiments more than 4 yr after implantation.

    Measurement of the cosmic ray spectrum above 4×10^18 eV using inclined events detected with the Pierre Auger Observatory

    A measurement of the cosmic-ray spectrum for energies exceeding 4×10^18 eV is presented, which is based on the analysis of showers with zenith angles greater than 60° detected with the Pierre Auger Observatory between 1 January 2004 and 31 December 2013. The measured spectrum confirms a flux suppression at the highest energies. Above 5.3×10^18 eV, the "ankle", the flux can be described by a power law E^-γ with index γ = 2.70 ± 0.02 (stat) ± 0.1 (sys), followed by a smooth suppression region. For the energy E_s at which the spectral flux has fallen to one-half of its extrapolated value in the absence of suppression, we find E_s = (5.12 ± 0.25 (stat) +1.0/-1.2 (sys))×10^19 eV.
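    The definition of the half-flux energy E_s can be made concrete with a toy model. A minimal sketch in Python, assuming a simple smooth-suppression parametrization J(E) = J0·E^-γ / (1 + (E/E_s)^Δγ) — an illustrative form, not the exact function fitted by the Auger collaboration; with this form, the flux at E = E_s is exactly half the extrapolated pure power law, independent of the steepness Δγ:

```python
# Toy spectrum: power law with a smooth suppression.
# The parametrization and DELTA_GAMMA value are illustrative assumptions,
# not the function fitted in the Auger analysis.

GAMMA = 2.70          # spectral index above the ankle (from the abstract)
E_S = 5.12e19         # half-flux energy in eV (from the abstract)
DELTA_GAMMA = 3.0     # steepness of the suppression (hypothetical)

def flux(energy_ev, j0=1.0):
    """Suppressed spectrum J(E) = j0 * E^-gamma / (1 + (E/E_s)^dg)."""
    power_law = j0 * energy_ev ** -GAMMA
    return power_law / (1.0 + (energy_ev / E_S) ** DELTA_GAMMA)

def extrapolated(energy_ev, j0=1.0):
    """Pure power law extrapolated without suppression."""
    return j0 * energy_ev ** -GAMMA

# At E = E_s the ratio is exactly 1/2 by construction of this toy form.
ratio_at_es = flux(E_S) / extrapolated(E_S)
print(ratio_at_es)  # 0.5
```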

    Energy Estimation of Cosmic Rays with the Engineering Radio Array of the Pierre Auger Observatory

    The Auger Engineering Radio Array (AERA) is part of the Pierre Auger Observatory and is used to detect the radio emission of cosmic-ray air showers. These observations are compared to the data of the surface detector stations of the Observatory, which provide well-calibrated information on the cosmic-ray energies and arrival directions. The response of the radio stations in the 30 to 80 MHz regime has been thoroughly calibrated to enable the reconstruction of the incoming electric field. For the latter, the energy deposit per area is determined from the radio pulses at each observer position and is interpolated using a two-dimensional function that takes into account signal asymmetries due to interference between the geomagnetic and charge-excess emission components. The spatial integral over the signal distribution gives a direct measurement of the energy transferred from the primary cosmic ray into radio emission in the AERA frequency range. We measure 15.8 MeV of radiation energy for a 1 EeV air shower arriving perpendicularly to the geomagnetic field. This radiation energy -- corrected for geometrical effects -- is used as a cosmic-ray energy estimator. Performing an absolute energy calibration against the surface-detector information, we observe that this radio-energy estimator scales quadratically with the cosmic-ray energy as expected for coherent emission. We find an energy resolution of the radio reconstruction of 22% for the data set and 17% for a high-quality subset containing only events with at least five radio stations with signal.
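    The quadratic scaling makes inverting the radiation energy into a cosmic-ray energy estimator straightforward. A minimal sketch, taking only the 15.8 MeV at 1 EeV normalization from the abstract and assuming a pure sin(α) geomagnetic geometry correction — a simplification of the corrections actually applied in the AERA analysis:

```python
import math

# Normalization from the abstract: 15.8 MeV of radiation energy for a
# 1 EeV shower arriving perpendicular to the geomagnetic field.
E_RAD_REF_MEV = 15.8
E_CR_REF_EEV = 1.0

def cosmic_ray_energy_eev(e_rad_mev, sin_alpha=1.0):
    """Invert the quadratic scaling E_rad ∝ (sin(alpha) * E_cr)^2.

    The pure sin(alpha) geometry factor is an assumed simplification of
    the geometrical corrections used in the actual AERA analysis.
    """
    corrected = e_rad_mev / sin_alpha ** 2
    return E_CR_REF_EEV * math.sqrt(corrected / E_RAD_REF_MEV)

print(cosmic_ray_energy_eev(15.8))  # 1.0 EeV
print(cosmic_ray_energy_eev(63.2))  # 2.0 EeV (4x the radiation energy)
```

    Because coherent emission doubles the field amplitude when the shower energy doubles, the radiated energy quadruples, which is why a factor-of-four increase in radiation energy corresponds to a factor-of-two increase in cosmic-ray energy here.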

    Measurement of the Radiation Energy in the Radio Signal of Extensive Air Showers as a Universal Estimator of Cosmic-Ray Energy

    We measure the energy emitted by extensive air showers in the form of radio emission in the frequency range from 30 to 80 MHz. Exploiting the accurate energy scale of the Pierre Auger Observatory, we obtain a radiation energy of 15.8 ± 0.7 (stat) ± 6.7 (sys) MeV for cosmic rays with an energy of 1 EeV arriving perpendicularly to a geomagnetic field of 0.24 G, scaling quadratically with the cosmic-ray energy. A comparison with predictions from state-of-the-art first-principle calculations shows agreement with our measurement. The radiation energy provides direct access to the calorimetric energy in the electromagnetic cascade of extensive air showers. Comparison with our result thus allows the direct calibration of any cosmic-ray radio detector against the well-established energy scale of the Pierre Auger Observatory.

    Measurement-Induced State Transitions in a Superconducting Qubit: Within the Rotating Wave Approximation

    Superconducting qubits typically use a dispersive readout scheme, where a resonator is coupled to a qubit such that its frequency is qubit-state dependent. Measurement is performed by driving the resonator, where the transmitted resonator field yields information about the resonator frequency and thus the qubit state. Ideally, we could use arbitrarily strong resonator drives to achieve a target signal-to-noise ratio in the shortest possible time. However, experiments have shown that when the average resonator photon number exceeds a certain threshold, the qubit is excited out of its computational subspace, which we refer to as a measurement-induced state transition. These transitions degrade readout fidelity, and constitute leakage which precludes further operation of the qubit in, for example, error correction. Here we study these transitions using a transmon qubit by experimentally measuring their dependence on qubit frequency, average photon number, and qubit state, in the regime where the resonator frequency is lower than the qubit frequency. We observe signatures of resonant transitions between levels in the coupled qubit-resonator system that exhibit noisy behavior when measured repeatedly in time. We provide a semi-classical model of these transitions based on the rotating wave approximation and use it to predict the onset of state transitions in our experiments. Our results suggest the transmon is excited to levels near the top of its cosine potential following a state transition, where the charge dispersion of higher transmon levels explains the observed noisy behavior of state transitions. Moreover, occupation in these higher energy levels poses a major challenge for fast qubit reset.
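    The qubit-state dependence of the resonator frequency can be sketched with the textbook dispersive-readout picture: the resonator sits at ω_r ± χ depending on the qubit state, so a probe at the bare resonator frequency acquires a state-dependent phase. The χ and κ values below are hypothetical, chosen only to illustrate the mechanism; this simple model deliberately ignores the strong-drive physics that produces the state transitions studied in the paper.

```python
import math

# Hypothetical dispersive-readout parameters (illustrative only).
CHI_MHZ = 1.0    # dispersive shift chi / 2*pi
KAPPA_MHZ = 4.0  # resonator linewidth kappa / 2*pi

def resonator_phase(detuning_mhz):
    """Phase of a driven resonator at a given detuning: phi = arctan(2*delta/kappa)."""
    return math.atan(2.0 * detuning_mhz / KAPPA_MHZ)

# Probing at the bare resonator frequency, the effective detuning is +chi
# or -chi depending on the qubit state, separating the two states in phase.
phase_ground = resonator_phase(+CHI_MHZ)
phase_excited = resonator_phase(-CHI_MHZ)
contrast = phase_ground - phase_excited
print(math.degrees(contrast))  # about 53 degrees for these parameters
```

    Stronger drives increase the photon number and thus the signal-to-noise ratio per unit time, which is exactly why the photon-number threshold for state transitions described above matters in practice.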

    Overcoming leakage in scalable quantum error correction

    Leakage of quantum information out of computational states into higher energy states represents a major challenge in the pursuit of quantum error correction (QEC). In a QEC circuit, leakage builds over time and spreads through multi-qubit interactions. This leads to correlated errors that degrade the exponential suppression of logical error with scale, challenging the feasibility of QEC as a path towards fault-tolerant quantum computation. Here, we demonstrate the execution of a distance-3 surface code and distance-21 bit-flip code on a Sycamore quantum processor where leakage is removed from all qubits in each cycle. This shortens the lifetime of leakage and curtails its ability to spread and induce correlated errors. We report a ten-fold reduction in steady-state leakage population on the data qubits encoding the logical state and an average leakage population of less than 1×10^-3 throughout the entire device. The leakage removal process itself efficiently returns leakage population back to the computational basis, and adding it to a code circuit prevents leakage from inducing correlated error across cycles, restoring a fundamental assumption of QEC. With this demonstration that leakage can be contained, we resolve a key challenge for practical QEC at scale.
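    The effect of per-cycle leakage removal can be captured with a simple rate model: leakage is injected with some probability each cycle and removed with some efficiency, giving a steady-state population of roughly inject/(remove + inject). The rates below are hypothetical and this sketch is not the model used in the paper; it only illustrates why shortening the leakage lifetime bounds the steady-state population.

```python
# Hypothetical per-cycle rates (illustrative, not fitted to the experiment).
INJECT = 1e-3   # probability that a computational qubit leaks per cycle
REMOVE = 0.5    # probability that existing leakage is removed per cycle

def steady_state_leakage(inject, remove, cycles=10_000):
    """Iterate p' = p*(1-remove) + (1-p)*inject until it converges."""
    p = 0.0
    for _ in range(cycles):
        p = p * (1.0 - remove) + (1.0 - p) * inject
    return p

with_removal = steady_state_leakage(INJECT, REMOVE)
# With only slow natural decay instead of active removal, the same
# injection rate builds up to a much higher steady state.
without_removal = steady_state_leakage(INJECT, 0.05)

print(with_removal)                    # ~2e-3, close to INJECT/(REMOVE+INJECT)
print(without_removal / with_removal)  # roughly a ten-fold difference here
```

    Setting p' = p in the update rule gives the closed form p_ss = inject/(remove + inject), so the steady state scales inversely with the removal rate once remove ≫ inject.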

    Measuring performance on the Healthcare Access and Quality Index for 195 countries and territories and selected subnational locations: a systematic analysis from the Global Burden of Disease Study 2016

    Background: A key component of achieving universal health coverage is ensuring that all populations have access to quality health care. Examining where gains have occurred or progress has faltered across and within countries is crucial to guiding decisions and strategies for future improvement. We used the Global Burden of Diseases, Injuries, and Risk Factors Study 2016 (GBD 2016) to assess personal health-care access and quality with the Healthcare Access and Quality (HAQ) Index for 195 countries and territories, as well as subnational locations in seven countries, from 1990 to 2016.
    Methods: Drawing from established methods and updated estimates from GBD 2016, we used 32 causes from which death should not occur in the presence of effective care to approximate personal health-care access and quality by location and over time. To better isolate potential effects of personal health-care access and quality from underlying risk factor patterns, we risk-standardised cause-specific deaths due to non-cancers by location-year, replacing the local joint exposure of environmental and behavioural risks with the global level of exposure. Supported by the expansion of cancer registry data in GBD 2016, we used mortality-to-incidence ratios for cancers instead of risk-standardised death rates to provide a stronger signal of the effects of personal health care and access on cancer survival. We transformed each cause to a scale of 0–100, with 0 as the first percentile (worst) observed between 1990 and 2016, and 100 as the 99th percentile (best); we set these thresholds at the country level, and then applied them to subnational locations. We applied a principal components analysis to construct the HAQ Index using all scaled cause values, providing an overall score of 0–100 of personal health-care access and quality by location over time. We then compared HAQ Index levels and trends by quintiles on the Socio-demographic Index (SDI), a summary measure of overall development. As derived from the broader GBD study and other data sources, we examined relationships between national HAQ Index scores and potential correlates of performance, such as total health spending per capita.
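    The cause-scaling step can be sketched in a few lines: each cause's death rate is mapped to 0-100 using the 1st and 99th percentiles as worst/best anchors (lower death rates score higher), and a principal component of the scaled causes gives an overall index. The data below are synthetic and the numpy-SVD principal component is a simplified stand-in for the full GBD construction.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic risk-standardised death rates: 50 locations x 4 causes
# (an illustrative stand-in for the 32 GBD amenable-mortality causes).
rates = rng.gamma(shape=2.0, scale=10.0, size=(50, 4))

def scale_causes(rates):
    """Map each cause to 0-100: 99th percentile (worst) -> 0, 1st (best) -> 100.

    Lower death rates indicate better access and quality, so the scale
    is inverted relative to the raw rates.
    """
    worst = np.percentile(rates, 99, axis=0)  # highest rates = worst care
    best = np.percentile(rates, 1, axis=0)    # lowest rates = best care
    scaled = 100.0 * (worst - rates) / (worst - best)
    return np.clip(scaled, 0.0, 100.0)

scaled = scale_causes(rates)

# First principal component of the scaled causes as an overall index,
# rescaled to 0-100 (simplified relative to the GBD procedure).
centered = scaled - scaled.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
index = scaled @ np.abs(vt[0])  # non-negative weights on the causes
index = 100.0 * (index - index.min()) / (index.max() - index.min())
print(index.shape)  # one HAQ-style score per location
```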

    Association between convalescent plasma treatment and mortality in COVID-19: a collaborative systematic review and meta-analysis of randomized clinical trials.

    Funder: Laura and John Arnold Foundation. BACKGROUND: Convalescent plasma has been widely used to treat COVID-19 and is under investigation in numerous randomized clinical trials, but results are publicly available only for a small number of trials. The objective of this study was to assess the effect of convalescent plasma treatment, compared to placebo or no treatment, on all-cause mortality in patients with COVID-19, using data from all available randomized clinical trials, including unpublished and ongoing trials (Open Science Framework, https://doi.org/10.17605/OSF.IO/GEHFX). METHODS: In this collaborative systematic review and meta-analysis, clinical trial registries (ClinicalTrials.gov, WHO International Clinical Trials Registry Platform), the Cochrane COVID-19 register, the LOVE database, and PubMed were searched until April 8, 2021. Investigators of trials registered by March 1, 2021, without published results were contacted via email. Eligible were ongoing, discontinued and completed randomized clinical trials that compared convalescent plasma with placebo or no treatment in COVID-19 patients, regardless of setting or treatment schedule. Aggregated mortality data were extracted from publications or provided by investigators of unpublished trials and combined using the Hartung-Knapp-Sidik-Jonkman random effects model. We investigated the contribution of unpublished trials to the overall evidence. RESULTS: A total of 16,477 patients were included in 33 trials (20 unpublished with 3190 patients, 13 published with 13,287 patients). 32 trials enrolled only hospitalized patients (including 3 with only intensive care unit patients). Risk of bias was low for 29/33 trials. Of 8495 patients who received convalescent plasma, 1997 died (23%), and of 7982 control patients, 1952 died (24%). The combined risk ratio for all-cause mortality was 0.97 (95% confidence interval: 0.92; 1.02) with between-study heterogeneity not beyond chance (I² = 0%). The RECOVERY trial had 69.8% and the unpublished evidence 25.3% of the weight in the meta-analysis. CONCLUSIONS: Convalescent plasma treatment of patients with COVID-19 did not reduce all-cause mortality. These results provide strong evidence that convalescent plasma treatment for patients with COVID-19 should not be used outside of randomized trials. Evidence synthesis from collaborations among trial investigators can inform both evidence generation and evidence application in patient care.
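    The headline numbers can be sanity-checked from the aggregate counts in the abstract. A minimal sketch computing the crude risk ratio and a log-scale normal-approximation confidence interval; note the published pooled estimate (0.97) weights each trial individually under the Hartung-Knapp-Sidik-Jonkman model, so the crude single-table ratio below differs slightly:

```python
import math

# Aggregate counts from the abstract.
deaths_plasma, n_plasma = 1997, 8495
deaths_control, n_control = 1952, 7982

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Crude risk ratio with a log-scale normal-approximation 95% CI.

    This pools all patients into one 2x2 table; the meta-analysis instead
    combines per-trial estimates with Hartung-Knapp-Sidik-Jonkman random
    effects, so its pooled value differs from this crude ratio.
    """
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(deaths_plasma, n_plasma, deaths_control, n_control)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 0.96 0.91 1.02
```

    The crude interval spans 1, consistent with the paper's conclusion of no mortality benefit.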

    The SuperCam Instrument Suite on the Mars 2020 Rover: Science Objectives and Mast-Unit Description

    On the NASA 2020 rover mission to Jezero crater, the remote determination of the texture, mineralogy and chemistry of rocks is essential to quickly and thoroughly characterize an area and to optimize the selection of samples for return to Earth. As part of the Perseverance payload, SuperCam is a suite of five techniques that provide critical and complementary observations via Laser-Induced Breakdown Spectroscopy (LIBS), Time-Resolved Raman and Luminescence (TRR/L), visible and near-infrared spectroscopy (VISIR), high-resolution color imaging (RMI), and acoustic recording (MIC). SuperCam operates at remote distances, primarily 2-7 m, while providing data at sub-mm to mm scales. We report on SuperCam's science objectives in the context of the Mars 2020 mission goals and ways the different techniques can address these questions. The instrument is made up of three separate subsystems: the Mast Unit is designed and built in France; the Body Unit is provided by the United States; the calibration target holder is contributed by Spain, and the targets themselves by the entire science team. This publication focuses on the design, development, and tests of the Mast Unit; companion papers describe the other units. The goal of this work is to provide an understanding of the technical choices made, the constraints that were imposed, and ultimately the validated performance of the flight model as it leaves Earth, and it will serve as the foundation for Mars operations and future processing of the data. Funding in France was provided by the Centre National d'Etudes Spatiales (CNES). Human resources were provided in part by the Centre National de la Recherche Scientifique (CNRS) and universities. Funding was provided in the US by NASA's Mars Exploration Program. Some funding of data analyses at Los Alamos National Laboratory (LANL) was provided by laboratory-directed research and development funds.