
    Influenza A Virus Challenge Models in Cynomolgus Macaques Using the Authentic Inhaled Aerosol and Intra-Nasal Routes of Infection

    Non-human primates are the animals closest to humans for use in influenza A virus challenge studies, in terms of their phylogenetic relatedness, physiology and immune systems. Previous studies have shown that cynomolgus macaques (Macaca fascicularis) are permissive for infection with H1N1pdm influenza virus. These studies have typically used combined challenge routes, with the majority being intra-tracheal delivery, and high doses of virus (>10⁷ infectious units). This paper describes the outcome of novel challenge routes (inhaled aerosol, intra-nasal instillation) and low to moderate doses (10³ to 10⁶ plaque-forming units) of H1N1pdm virus in cynomolgus macaques. Evidence of virus replication and sero-conversion was detected in all four challenge groups, although the disease was sub-clinical. Intra-nasal challenge led to an infection confined to the nasal cavity. A low dose (10³ plaque-forming units) did not lead to detectable infectious virus shedding, but a 1000-fold higher dose led to virus shedding in all intra-nasally challenged animals. In contrast, aerosol and intra-tracheal challenge routes led to infections throughout the respiratory tract, although shedding from the nasal cavity was less reproducible between animals compared to the high-dose intra-nasal challenge group. Intra-tracheal and aerosol challenges induced a transient lymphopaenia, similar to that observed in influenza-infected humans, and greater virus-specific cellular immune responses in the blood were observed in these groups in comparison to the intra-nasal challenge groups. Activation of lung macrophages and innate immune response genes was detected at days 5 to 7 post-challenge. The kinetics of infection, both virological and immunological, were broadly in line with human influenza A virus infections. These more authentic infection models will be valuable in the determination of anti-influenza efficacy of novel entities against less severe (and thus more common) influenza infections.

    Deprescribing benzodiazepines and Z-drugs in community-dwelling adults: a scoping review


    Earth: Atmospheric Evolution of a Habitable Planet

    Our present-day atmosphere is often used as an analog for potentially habitable exoplanets, but Earth's atmosphere has changed dramatically throughout its 4.5-billion-year history. For example, molecular oxygen is abundant in the atmosphere today but was absent on the early Earth. Meanwhile, the physical and chemical evolution of Earth's atmosphere has also resulted in major swings in surface temperature, at times resulting in extreme glaciation or warm greenhouse climates. Despite this dynamic and occasionally dramatic history, the Earth has been persistently habitable, and in fact inhabited, for roughly 4 billion years. Understanding Earth's momentous changes and its enduring habitability is essential as a guide to the diversity of habitable planetary environments that may exist beyond our solar system and for ultimately recognizing spectroscopic fingerprints of life elsewhere in the Universe. Here, we review long-term trends in the composition of Earth's atmosphere as it relates to both planetary habitability and inhabitation. We focus on gases that may serve as habitability markers (CO₂, N₂) or biosignatures (CH₄, O₂), especially as related to the redox evolution of the atmosphere and the coupled evolution of Earth's climate system. We emphasize that in the search for Earth-like planets we must be mindful that the example provided by the modern atmosphere merely represents a single snapshot of Earth's long-term evolution. In exploring the many former states of our own planet, we emphasize Earth's atmospheric evolution during the Archean, Proterozoic, and Phanerozoic eons, but we conclude with a brief discussion of potential atmospheric trajectories into the distant future, many millions to billions of years from now. All of these 'Alternative Earth' scenarios provide insight into the potential diversity of Earth-like, habitable, and inhabited worlds. (Comment: 34 pages, 4 figures, 4 tables; review chapter to appear in the Handbook of Exoplanets.)

    Intraperitoneal drain placement and outcomes after elective colorectal surgery: international matched, prospective, cohort study

    Despite current guidelines, intraperitoneal drain placement after elective colorectal surgery remains widespread. Drains were not associated with earlier detection of intraperitoneal collections, but were associated with prolonged hospital stay and increased risk of surgical-site infections. Background: Many surgeons routinely place intraperitoneal drains after elective colorectal surgery. However, enhanced recovery after surgery guidelines recommend against their routine use owing to a lack of clear clinical benefit. This study aimed to describe international variation in intraperitoneal drain placement and the safety of this practice. Methods: COMPASS (COMPlicAted intra-abdominal collectionS after colorectal Surgery) was a prospective, international, cohort study which enrolled consecutive adults undergoing elective colorectal surgery (February to March 2020). The primary outcome was the rate of intraperitoneal drain placement. Secondary outcomes included: rate and time to diagnosis of postoperative intraperitoneal collections; rate of surgical-site infections (SSIs); time to discharge; and 30-day major postoperative complications (Clavien-Dindo grade at least III). After propensity score matching, multivariable logistic regression and Cox proportional hazards regression were used to estimate the independent association of the secondary outcomes with drain placement. Results: Overall, 1805 patients from 22 countries were included (798 women, 44.2 per cent; median age 67.0 years). The drain insertion rate was 51.9 per cent (937 patients). After matching, drains were not associated with reduced rates (odds ratio (OR) 1.33, 95 per cent c.i. 0.79 to 2.23; P = 0.287) or earlier detection (hazard ratio (HR) 0.87, 0.33 to 2.31; P = 0.780) of collections. Although not associated with worse major postoperative complications (OR 1.09, 0.68 to 1.75; P = 0.709), drains were associated with delayed hospital discharge (HR 0.58, 0.52 to 0.66; P < 0.001) and an increased risk of SSIs (OR 2.47, 1.50 to 4.05; P < 0.001). Conclusion: Intraperitoneal drain placement after elective colorectal surgery is not associated with earlier detection of postoperative collections, but prolongs hospital stay and increases SSI risk.
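
    The matched analysis described in the Methods can be approximated with standard statistical tooling. The sketch below is illustrative only: it runs on synthetic data with hypothetical column names (drain, ssi, los_days, discharged), not the COMPASS dataset, and it matches with replacement for brevity, which may differ from the study protocol. It shows the general pattern: estimate a propensity score for drain placement, match drained to undrained patients on that score, then fit a logistic model for SSI and a Cox model for time to discharge on the matched cohort.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    # Synthetic stand-in data; covariates, treatment and outcomes are all made up.
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "age": rng.normal(67, 12, n),
        "female": rng.integers(0, 2, n),
        "open_surgery": rng.integers(0, 2, n),
    })
    logit = -0.5 + 0.02 * (df["age"] - 67) + 0.8 * df["open_surgery"]
    df["drain"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # treatment: drain placed
    df["ssi"] = rng.binomial(1, 0.05 + 0.04 * df["drain"])    # binary outcome: SSI
    df["los_days"] = rng.exponential(6 + 2 * df["drain"])     # time to discharge (days)
    df["discharged"] = 1                                      # event indicator

    covars = ["age", "female", "open_surgery"]

    # 1. Propensity score: modelled probability of receiving a drain.
    ps = LogisticRegression(max_iter=1000).fit(df[covars], df["drain"])
    df["ps"] = ps.predict_proba(df[covars])[:, 1]

    # 2. 1:1 nearest-neighbour matching on the propensity score (with replacement).
    treated = df[df["drain"] == 1]
    control = df[df["drain"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]]).reset_index(drop=True)

    # 3. Outcome models on the matched cohort.
    ssi_fit = sm.Logit(matched["ssi"], sm.add_constant(matched[["drain"]])).fit(disp=0)
    print("OR for SSI with drain:", float(np.exp(ssi_fit.params["drain"])))

    cph = CoxPHFitter()
    cph.fit(matched[["los_days", "discharged", "drain"]],
            duration_col="los_days", event_col="discharged")
    print("HR for discharge with drain:", float(np.exp(cph.params_["drain"])))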

    The Triggering Receptor Expressed on Myeloid Cells 2 Inhibits Complement Component 1q Effector Mechanisms and Exerts Detrimental Effects during Pneumococcal Pneumonia

    Phagocytosis and inflammation within the lungs are crucial for host defense during bacterial pneumonia. Triggering receptor expressed on myeloid cells (TREM)-2 was proposed to negatively regulate TLR-mediated responses and enhance phagocytosis by macrophages, but the role of TREM-2 in respiratory tract infections is unknown. Here, we established the presence of TREM-2 on alveolar macrophages (AM) and explored the function of TREM-2 in the innate immune response to pneumococcal infection in vivo. Unexpectedly, we found Trem-2(-/-) AM to display augmented bacterial phagocytosis in vitro and in vivo compared to WT AM. Mechanistically, we detected that in the absence of TREM-2, pulmonary macrophages selectively produced elevated complement component 1q (C1q) levels. We found that these increased C1q levels depended on peroxisome proliferator-activated receptor-δ (PPAR-δ) activity and were responsible for the enhanced phagocytosis of bacteria. Upon infection with S. pneumoniae, Trem-2(-/-) mice exhibited augmented bacterial clearance from the lungs, decreased bacteremia and improved survival compared to their WT counterparts. This work is the first to disclose a role for TREM-2 in clinically relevant respiratory tract infections and demonstrates a previously unknown link between TREM-2 and opsonin production within the lungs.

    Single-dose administration and the influence of the timing of the booster dose on immunogenicity and efficacy of ChAdOx1 nCoV-19 (AZD1222) vaccine: a pooled analysis of four randomised trials.

    BACKGROUND: The ChAdOx1 nCoV-19 (AZD1222) vaccine has been approved for emergency use by the UK regulatory authority, the Medicines and Healthcare products Regulatory Agency, with a regimen of two standard doses given with an interval of 4-12 weeks. The planned roll-out in the UK will involve vaccinating people in high-risk categories with their first dose immediately, and delivering the second dose 12 weeks later. Here, we provide both a further prespecified pooled analysis of trials of ChAdOx1 nCoV-19 and exploratory analyses of the impact on immunogenicity and efficacy of extending the interval between priming and booster doses. In addition, we show the immunogenicity and protection afforded by the first dose, before a booster dose has been offered. METHODS: We present data from three single-blind randomised controlled trials (a phase 1/2 study in the UK, COV001; a phase 2/3 study in the UK, COV002; and a phase 3 study in Brazil, COV003) and one double-blind phase 1/2 study in South Africa (COV005). As previously described, individuals 18 years and older were randomly assigned 1:1 to receive two standard doses of ChAdOx1 nCoV-19 (5 × 10¹⁰ viral particles) or a control vaccine or saline placebo. In the UK trial, a subset of participants received a lower dose (2·2 × 10¹⁰ viral particles) of ChAdOx1 nCoV-19 for the first dose. The primary outcome was virologically confirmed symptomatic COVID-19 disease, defined as a nucleic acid amplification test (NAAT)-positive swab combined with at least one qualifying symptom (fever ≥37·8°C, cough, shortness of breath, or anosmia or ageusia) more than 14 days after the second dose. Secondary efficacy analyses included cases occurring at least 22 days after the first dose. Antibody responses measured by immunoassay and by pseudovirus neutralisation were exploratory outcomes. All cases of COVID-19 with a NAAT-positive swab were adjudicated for inclusion in the analysis by a masked independent endpoint review committee. The primary analysis included all participants who were SARS-CoV-2 N protein seronegative at baseline, had had at least 14 days of follow-up after the second dose, and had no evidence of previous SARS-CoV-2 infection from NAAT swabs. Safety was assessed in all participants who received at least one dose. The four trials are registered at ISRCTN89951424 (COV003) and ClinicalTrials.gov, NCT04324606 (COV001), NCT04400838 (COV002), and NCT04444674 (COV005). FINDINGS: Between April 23 and Dec 6, 2020, 24 422 participants were recruited and vaccinated across the four studies, of whom 17 178 were included in the primary analysis (8597 receiving ChAdOx1 nCoV-19 and 8581 receiving control vaccine). The data cutoff for these analyses was Dec 7, 2020. 332 NAAT-positive infections met the primary endpoint of symptomatic infection more than 14 days after the second dose. Overall vaccine efficacy more than 14 days after the second dose was 66·7% (95% CI 57·4-74·0), with 84 (1·0%) cases in the 8597 participants in the ChAdOx1 nCoV-19 group and 248 (2·9%) in the 8581 participants in the control group. There were no hospital admissions for COVID-19 in the ChAdOx1 nCoV-19 group after the initial 21-day exclusion period, and 15 in the control group. 108 (0·9%) of 12 282 participants in the ChAdOx1 nCoV-19 group and 127 (1·1%) of 11 962 participants in the control group had serious adverse events.
There were seven deaths considered unrelated to vaccination (two in the ChAdOx1 nCoV-19 group and five in the control group), including one COVID-19-related death in one participant in the control group. Exploratory analyses showed that vaccine efficacy after a single standard dose of vaccine from day 22 to day 90 after vaccination was 76·0% (59·3-85·9). Our modelling analysis indicated that protection did not wane during this initial 3-month period. Similarly, antibody levels were maintained during this period with minimal waning by day 90 (geometric mean ratio [GMR] 0·66 [95% CI 0·59-0·74]). In the participants who received two standard doses, after the second dose, efficacy was higher in those with a longer prime-boost interval (vaccine efficacy 81·3% [95% CI 60·3-91·2] at ≥12 weeks) than in those with a short interval (vaccine efficacy 55·1% [33·0-69·9] at <6 weeks). These observations are supported by immunogenicity data that showed binding antibody responses more than two-fold higher after an interval of 12 or more weeks compared with an interval of less than 6 weeks in those who were aged 18-55 years (GMR 2·32 [2·01-2·68]). INTERPRETATION: The results of this primary analysis of two doses of ChAdOx1 nCoV-19 were consistent with those seen in the interim analysis of the trials and confirm that the vaccine is efficacious, with results varying by dose interval in exploratory analyses. A 3-month dose interval might have advantages over a programme with a short dose interval for roll-out of a pandemic vaccine to protect the largest number of individuals in the population as early as possible when supplies are scarce, while also improving protection after receiving a second dose. FUNDING: UK Research and Innovation, National Institute for Health Research (NIHR), the Coalition for Epidemic Preparedness Innovations, the Bill & Melinda Gates Foundation, the Lemann Foundation, Rede D'Or, the Brava and Telles Foundation, NIHR Oxford Biomedical Research Centre, Thames Valley and South Midland's NIHR Clinical Research Network, and AstraZeneca.
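
    As a quick sanity check, the headline efficacy figure can be approximated directly from the case counts quoted above, using the risk-ratio definition of vaccine efficacy, VE = 1 - (attack rate in the vaccine group / attack rate in the control group). The snippet below is only a crude illustration; the published 66·7% estimate came from the trial's prespecified model-based analysis, so the unadjusted figure differs slightly.

    # Crude vaccine-efficacy approximation from the case counts in the abstract.
    # The published estimate (66.7%) comes from the trial's model-based analysis,
    # so this unadjusted calculation is only an approximation.
    cases_vaccine, n_vaccine = 84, 8597    # ChAdOx1 nCoV-19 group
    cases_control, n_control = 248, 8581   # control group

    attack_vaccine = cases_vaccine / n_vaccine   # ~0.98%
    attack_control = cases_control / n_control   # ~2.89%

    ve = 1 - attack_vaccine / attack_control
    print(f"Crude vaccine efficacy: {ve:.1%}")   # ~66.2%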

    Genomic reconstruction of the SARS-CoV-2 epidemic in England.

    The evolution of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) leads to new variants that warrant timely epidemiological characterization. Here we use the dense genomic surveillance data generated by the COVID-19 Genomics UK Consortium to reconstruct the dynamics of 71 different lineages in each of 315 English local authorities between September 2020 and June 2021. This analysis reveals a series of subepidemics that peaked in early autumn 2020, followed by a jump in transmissibility of the B.1.1.7/Alpha lineage. The Alpha variant grew when other lineages declined during the second national lockdown and regionally tiered restrictions between November and December 2020. A third, more stringent national lockdown suppressed the Alpha variant and eliminated nearly all other lineages in early 2021. Yet a series of variants (most of which contained the spike E484K mutation) defied these trends and persisted at moderately increasing proportions. However, by accounting for sustained introductions, we found that the transmissibility of these variants is unlikely to have exceeded the transmissibility of the Alpha variant. Finally, B.1.617.2/Delta was repeatedly introduced in England and grew rapidly in early summer 2021, constituting approximately 98% of sampled SARS-CoV-2 genomes on 26 June 2021.
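
    The lineage dynamics summarised above are typically characterised by fitting growth-rate differences to lineage counts over time. The sketch below is a deliberately simplified, hypothetical illustration of that idea, not the study's model (which is hierarchical and spatially resolved across the 315 local authorities): with made-up growth advantages and introduction levels, a multinomial logistic (softmax) curve shows how a constant per-week advantage translates into the kind of rapidly shifting lineage proportions described for Alpha and Delta.

    # Minimal multinomial logistic-growth sketch of competing lineage frequencies.
    # Growth-rate advantages and initial abundances are illustrative, not estimates
    # taken from the study.
    import numpy as np

    lineages = ["other lineages", "B.1.1.7 (Alpha)", "B.1.617.2 (Delta)"]
    growth = np.array([0.0, 0.35, 0.8])      # hypothetical per-week log-odds growth
    log_init = np.array([0.0, -4.0, -12.0])  # hypothetical log initial abundances

    weeks = np.arange(40)
    # Log-odds grow linearly in time; a softmax converts them to proportions.
    logits = log_init[None, :] + weeks[:, None] * growth[None, :]
    props = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

    for w in (0, 10, 20, 30, 39):
        shares = ", ".join(f"{name}: {p:.1%}" for name, p in zip(lineages, props[w]))
        print(f"week {w:2d}  {shares}")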