
    Examining oral pre-exposure prophylaxis (PrEP) literacy among participants in an HIV vaccine trial preparedness cohort study

    Background: PrEP literacy is influenced by many factors, including the types of information available and how that information is interpreted. The level of PrEP literacy may influence acceptability and uptake. Methods: We conducted 25 in-depth interviews within an HIV vaccine trial preparedness cohort study, exploring what participants knew about PrEP and their sources of PrEP knowledge. We used the framework approach to generate themes for analysis, guided by the Social Ecological Model (SEM), and examined levels of PrEP literacy using the individual and interpersonal constructs of the SEM. Results: We found that PrEP awareness is strongly influenced by external factors such as social media and by how much is known about HIV treatment and prevention in the local community. However, while participants highlighted the importance of the internet and social media as sources of information about PrEP, they spoke of low PrEP literacy in their communities. Participants indicated that their own knowledge came as a result of joining the HIV vaccine trial preparedness study; however, some expressed doubts about the effectiveness of the drug and worried about side effects. Participants commented that, at the community level, PrEP was associated with being sexually active, because it is used to prevent the sexual transmission of HIV. As a result, some participants felt that one could be judged by health workers for asking for PrEP at health facilities in the community. Conclusion: The information collected in this study provides an understanding of the different layers of influence around individuals that are important to address to improve PrEP acceptability and uptake. Our findings can inform strategies to address barriers to PrEP uptake, particularly at the structural and community levels. Trial registration: https://clinicaltrials.gov/ct2/show/NCT04066881

    A chemical survey of exoplanets with ARIEL

    Thousands of exoplanets have now been discovered with a huge range of masses, sizes and orbits: from rocky Earth-like planets to large gas giants grazing the surface of their host star. However, the essential nature of these exoplanets remains largely mysterious: there is no known, discernible pattern linking the presence, size, or orbital parameters of a planet to the nature of its parent star. We have little idea whether the chemistry of a planet is linked to its formation environment, or whether the type of host star drives the physics and chemistry of the planet’s birth and evolution. ARIEL was conceived to observe a large number (~1000) of transiting planets for statistical understanding, including gas giants, Neptunes, super-Earths and Earth-size planets around a range of host star types, using transit spectroscopy in the 1.25–7.8 μm spectral range and multiple narrow-band photometry in the optical. ARIEL will focus on warm and hot planets to take advantage of their well-mixed atmospheres, which should show minimal condensation and sequestration of high-Z materials compared to their colder Solar System siblings. These warm and hot atmospheres are expected to be more representative of the planetary bulk composition. Observations of these warm/hot exoplanets, and in particular of their elemental composition (especially C, O, N, S, Si), will allow an understanding of the early stages of planetary and atmospheric formation during the nebular phase and the following few million years. ARIEL will thus provide a representative picture of the chemical nature of the exoplanets and relate this directly to the type and chemical environment of the host star. ARIEL is designed as a dedicated survey mission for combined-light spectroscopy, capable of observing a large and well-defined planet sample within its 4-year mission lifetime. Transit, eclipse and phase-curve spectroscopy methods, whereby the signals from the star and planet are differentiated using knowledge of the planetary ephemerides, allow us to measure atmospheric signals from the planet at levels of 10–100 parts per million (ppm) relative to the star and, given the bright nature of the targets, also allow more sophisticated techniques, such as eclipse mapping, to give a deeper insight into the nature of the atmosphere. These types of observations require a stable payload and satellite platform with broad, instantaneous wavelength coverage to detect many molecular species, probe the thermal structure, identify clouds and monitor the stellar activity. The proposed wavelength range covers all the expected major atmospheric gases, from e.g. H2O, CO2, CH4, NH3, HCN and H2S through to more exotic metallic compounds, such as TiO, VO, and condensed species. Simulations of ARIEL performance in conducting exoplanet surveys have been performed, using conservative estimates of mission performance and a full model of all significant noise sources in the measurement, together with a list of potential ARIEL targets that incorporates the latest available exoplanet statistics. The conclusion at the end of the Phase A study is that ARIEL, in line with the stated mission objectives, will be able to observe about 1000 exoplanets depending on the details of the adopted survey strategy, thus confirming the feasibility of the main science objectives.
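    To make the 10–100 ppm figure concrete, the short Python sketch below estimates the transit depth and the approximate amplitude of atmospheric spectral features for a generic hot, hydrogen-dominated planet around a Sun-like star. All parameter values are illustrative assumptions, not ARIEL mission or target parameters; the point is only that the standard scale-height estimate lands at the tens-to-hundreds of ppm level the mission is designed to measure.

        import math

        # Illustrative back-of-envelope estimate of a transit-spectroscopy signal,
        # assuming a generic hot Jupiter around a Sun-like star (all numbers are
        # assumptions for illustration, not ARIEL parameters).
        k_B = 1.380649e-23      # Boltzmann constant, J/K
        m_H = 1.6735575e-27     # hydrogen atom mass, kg

        R_star = 6.957e8        # stellar radius, m (Sun-like)
        R_p    = 7.1492e7       # planet radius, m (Jupiter-like)
        T_eq   = 1500.0         # equilibrium temperature, K (hot planet)
        mu     = 2.3            # mean molecular weight (H2/He atmosphere)
        g      = 25.0           # surface gravity, m/s^2

        # Transit depth: fraction of starlight blocked by the planetary disc.
        depth = (R_p / R_star) ** 2

        # Atmospheric scale height, and the extra absorption from roughly five
        # scale heights of annulus around the planet (a standard order-of-magnitude
        # estimate of the spectral modulation).
        H = k_B * T_eq / (mu * m_H * g)
        signal = 2 * 5 * H * R_p / R_star ** 2

        print(f"transit depth      ~ {depth * 1e2:.2f} %")
        print(f"atmospheric signal ~ {signal * 1e6:.0f} ppm")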

    Effects of antiplatelet therapy on stroke risk by brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases: subgroup analyses of the RESTART randomised, open-label trial

    Background: Findings from the RESTART trial suggest that starting antiplatelet therapy might reduce the risk of recurrent symptomatic intracerebral haemorrhage compared with avoiding antiplatelet therapy. Brain imaging features of intracerebral haemorrhage and cerebral small vessel diseases (such as cerebral microbleeds) are associated with greater risks of recurrent intracerebral haemorrhage. We did subgroup analyses of the RESTART trial to explore whether these brain imaging features modify the effects of antiplatelet therapy.

    SARS-CoV-2 RNA detected in blood products from patients with COVID-19 is not associated with infectious virus

    Background: Laboratory diagnosis of SARS-CoV-2 infection (the cause of COVID-19) uses PCR to detect viral RNA (vRNA) in respiratory samples. SARS-CoV-2 RNA has also been detected in other sample types, but there is limited understanding of the clinical or laboratory significance of its detection in blood. Methods: We undertook a systematic literature review to assimilate the evidence for the frequency of vRNA in blood, and to identify associated clinical characteristics. We performed RT-PCR on serum samples from a UK clinical cohort of acute and convalescent COVID-19 cases (n=212), together with convalescent plasma samples collected by NHS Blood and Transplant (NHSBT) (n=462 additional samples). To determine whether PCR-positive blood samples could pose an infection risk, we attempted virus isolation from a subset of RNA-positive samples. Results: We identified 28 relevant studies, reporting SARS-CoV-2 RNA in 0–76% of blood samples; pooled estimate 10% (95% CI 5–18%). Among serum samples from our clinical cohort, 27/212 (12.7%) had SARS-CoV-2 RNA detected by RT-PCR. RNA detection occurred in samples up to day 20 post symptom onset and was associated with more severe disease (multivariable odds ratio 7.5). Across all samples collected ≥28 days post symptom onset, 0/494 (0%, 95% CI 0–0.7%) had vRNA detected. Among our PCR-positive samples, cycle threshold (Ct) values were high (range 33.5–44.8), suggesting low vRNA copy numbers. PCR-positive sera inoculated into cell culture did not produce any cytopathic effect or yield an increase in detectable SARS-CoV-2 RNA. Conclusions: vRNA was detectable at low viral loads in a minority of serum samples collected in acute infection, but was not associated with infectious SARS-CoV-2 (within the limitations of the assays used). This work helps to inform biosafety precautions for handling blood products from patients with current or previous COVID-19.
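    As an illustration of how a pooled proportion such as the 10% (95% CI 5–18%) estimate above might be obtained, the sketch below applies a standard DerSimonian-Laird random-effects model to logit-transformed study proportions. The study counts are hypothetical placeholders, since the review's actual study-level data and pooling method are not given in the abstract.

        import numpy as np

        # Hypothetical example data: (RNA-positive, total) per study. Illustrative
        # counts only; not the study-level data from the review.
        studies = [(2, 50), (10, 80), (0, 40), (15, 60), (5, 120), (30, 90)]

        x = np.array([s[0] for s in studies], dtype=float)
        n = np.array([s[1] for s in studies], dtype=float)

        # Continuity correction for zero (or full) cells before the logit transform.
        x_adj = np.where((x == 0) | (x == n), x + 0.5, x)
        n_adj = np.where((x == 0) | (x == n), n + 1.0, n)

        p = x_adj / n_adj
        y = np.log(p / (1 - p))                # logit-transformed proportions
        v = 1 / x_adj + 1 / (n_adj - x_adj)    # approximate within-study variances

        # DerSimonian-Laird estimate of between-study variance (tau^2).
        w = 1 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(y) - 1)) / c)

        # Random-effects pooling and 95% confidence interval, back-transformed.
        w_star = 1 / (v + tau2)
        y_pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        expit = lambda z: 1 / (1 + np.exp(-z))
        lo, hi = expit(y_pooled - 1.96 * se), expit(y_pooled + 1.96 * se)
        print(f"pooled proportion ~ {expit(y_pooled):.1%} (95% CI {lo:.1%}-{hi:.1%})")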

    Large-Scale Gene-Centric Meta-Analysis across 39 Studies Identifies Type 2 Diabetes Loci

    Get PDF
    To identify genetic factors contributing to type 2 diabetes (T2D), we performed large-scale meta-analyses by using a custom ~50,000 SNP genotyping array (the ITMAT-Broad-CARe array) with ~2,000 candidate genes in 39 multiethnic population-based studies, case-control studies, and clinical trials totaling 17,418 cases and 70,298 controls. First, meta-analysis of 25 studies comprising 14,073 cases and 57,489 controls of European descent confirmed eight established T2D loci at genome-wide significance. In silico follow-up analysis of putative association signals found in independent genome-wide association studies (including 8,130 cases and 38,987 controls) performed by the DIAGRAM consortium identified a T2D locus at genome-wide significance (GATAD2A/CILP2/PBX4; p = 5.7 × 10^-9) and two loci exceeding study-wide significance (SREBF1 and TH/INS; p < 2.4 × 10^-6). Second, meta-analyses of 1,986 cases and 7,695 controls from eight African-American studies identified study-wide-significant (p = 2.4 × 10^-7) variants in HMGA2 and replicated variants in TCF7L2 (p = 5.1 × 10^-15). Third, conditional analysis revealed multiple known and novel independent signals within five T2D-associated genes in samples of European ancestry and within HMGA2 in African-American samples. Fourth, a multiethnic meta-analysis of all 39 studies identified T2D-associated variants in BCL2 (p = 2.1 × 10^-8). Finally, a composite genetic score of SNPs from new and established T2D signals was significantly associated with increased risk of diabetes in African-American, Hispanic, and Asian populations. In summary, large-scale meta-analysis involving a dense gene-centric approach has uncovered additional loci and variants that contribute to T2D risk and suggests substantial overlap of T2D association signals across multiple ethnic groups.
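    The composite genetic score mentioned above is typically a weighted sum of risk-allele counts, with each SNP weighted by its estimated per-allele log odds ratio. The sketch below shows that calculation with made-up SNP weights and genotypes; the values are illustrative and are not the effect sizes or variants reported in the study.

        import numpy as np

        # Hypothetical per-allele log odds-ratio weights for four T2D-associated
        # SNPs (illustrative values, not the study's estimates).
        log_or = np.array([np.log(1.10), np.log(1.15), np.log(1.08), np.log(1.25)])

        # Genotypes coded as risk-allele counts (0, 1, or 2) for each individual.
        genotypes = np.array([
            [0, 1, 2, 1],   # individual 1
            [2, 2, 1, 0],   # individual 2
            [1, 0, 0, 1],   # individual 3
        ])

        # Weighted composite genetic score: risk-allele counts times per-allele
        # log odds ratios, summed over SNPs. Higher scores imply higher predicted risk.
        scores = genotypes @ log_or
        for i, s in enumerate(scores, start=1):
            print(f"individual {i}: score = {s:.3f}, odds scale = {np.exp(s):.2f}x")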

    Measurement-Induced State Transitions in a Superconducting Qubit: Within the Rotating Wave Approximation

    Superconducting qubits typically use a dispersive readout scheme, where a resonator is coupled to a qubit such that its frequency is qubit-state dependent. Measurement is performed by driving the resonator, where the transmitted resonator field yields information about the resonator frequency and thus the qubit state. Ideally, we could use arbitrarily strong resonator drives to achieve a target signal-to-noise ratio in the shortest possible time. However, experiments have shown that when the average resonator photon number exceeds a certain threshold, the qubit is excited out of its computational subspace, which we refer to as a measurement-induced state transition. These transitions degrade readout fidelity and constitute leakage, which precludes further operation of the qubit in, for example, error correction. Here we study these transitions using a transmon qubit by experimentally measuring their dependence on qubit frequency, average photon number, and qubit state, in the regime where the resonator frequency is lower than the qubit frequency. We observe signatures of resonant transitions between levels in the coupled qubit-resonator system that exhibit noisy behavior when measured repeatedly in time. We provide a semi-classical model of these transitions based on the rotating wave approximation and use it to predict the onset of state transitions in our experiments. Our results suggest the transmon is excited to levels near the top of its cosine potential following a state transition, where the charge dispersion of higher transmon levels explains the observed noisy behavior of state transitions. Moreover, occupation of these higher energy levels poses a major challenge for fast qubit reset.
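    For context on the dispersive readout scheme described above, the sketch below evaluates the textbook transmon expressions for the qubit-state-dependent resonator pull (chi) and the critical photon number at which the dispersive approximation starts to break down. The coupling, detuning and anharmonicity values are assumed for illustration and are not the device parameters from the paper.

        # Minimal sketch of the dispersive-readout picture, using illustrative
        # circuit-QED parameters (assumptions, not the paper's device values).
        g     = 100e6    # qubit-resonator coupling / 2*pi, Hz
        delta = 1.0e9    # qubit-resonator detuning / 2*pi, Hz (resonator below qubit)
        alpha = -200e6   # transmon anharmonicity / 2*pi, Hz

        # Dispersive shift for a transmon (sign conventions vary in the literature):
        # the resonator frequency is pulled by roughly +/- chi depending on whether
        # the qubit is in |0> or |1>.
        chi = (g ** 2 / delta) * (alpha / (delta + alpha))

        # Critical photon number: the textbook scale beyond which the dispersive
        # approximation degrades and measurement-induced transitions become a concern.
        n_crit = delta ** 2 / (4 * g ** 2)

        print(f"dispersive shift chi/2pi ~ {chi / 1e6:.2f} MHz")
        print(f"critical photon number n_crit ~ {n_crit:.0f}")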

    Measurement-induced entanglement and teleportation on a noisy quantum processor

    Measurement has a special role in quantum theory: by collapsing the wavefunction it can enable phenomena such as teleportation and thereby alter the "arrow of time" that constrains unitary evolution. When integrated into many-body dynamics, measurements can lead to emergent patterns of quantum information in space-time that go beyond established paradigms for characterizing phases, either in or out of equilibrium. On present-day NISQ processors, the experimental realization of this physics is challenging due to noise, hardware limitations, and the stochastic nature of quantum measurement. Here we address each of these experimental challenges and investigate measurement-induced quantum information phases on up to 70 superconducting qubits. By leveraging the interchangeability of space and time, we use a duality mapping to avoid mid-circuit measurement and access different manifestations of the underlying phases, from entanglement scaling to measurement-induced teleportation, in a unified way. We obtain finite-size signatures of a phase transition with a decoding protocol that correlates the experimental measurement record with classical simulation data. The phases display sharply different sensitivity to noise, which we exploit to turn an inherent hardware limitation into a useful diagnostic. Our work demonstrates an approach to realizing measurement-induced physics at scales that are at the limits of current NISQ processors.