Multiple-stage decisions in a marine central-place forager
Air-breathing marine animals face a complex set of physical challenges associated with diving that affect decisions about how to optimize feeding. Baleen whales (Mysticeti) have evolved bulk-filter feeding mechanisms to efficiently feed on dense prey patches. Baleen whales are central place foragers where oxygen at the surface represents the central place and depth acts as the distance to prey. Although it has been hypothesized that baleen whales will target the densest prey patches anywhere in the water column, how depth and density interact to influence foraging behaviour is poorly understood. We used multi-sensor archival tags and active acoustics to quantify Antarctic humpback whale foraging behaviour relative to prey. Our analyses reveal multi-stage foraging decisions driven by both krill depth and density. During daylight hours when whales did not feed, krill were found in deep high-density patches. As krill migrated vertically into larger and less dense patches near the surface, whales began to forage. During foraging bouts, we found that feeding rates (number of feeding lunges per hour) were greatest when prey was shallowest, and feeding rates decreased with increasing dive depth. This strategy is consistent with previous models of how air-breathing diving animals optimize foraging efficiency. Thus, humpback whales forage mainly when prey is more broadly distributed and shallower, presumably to minimize diving and searching costs and to increase feeding rates overall and thus foraging efficiency. Using direct measurements of feeding behaviour from animal-borne tags and prey availability from echosounders, our study demonstrates a multi-stage foraging process in a central place forager that we suggest acts to optimize overall efficiency by maximizing net energy gain over time. These data reveal a previously unrecognized level of complexity in predator–prey interactions and underscore the need to simultaneously measure prey distribution in marine central place forager studies.
This is the publisher’s final pdf. The published article is copyrighted by The Royal Society and can be found at: http://rsos.royalsocietypublishing.org/
Keywords: diving, predator-prey interactions, foraging decision
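The depth-dependent feeding-rate result lends itself to a simple worked illustration. The Python sketch below (not the authors' pipeline; all dive records and variable names are invented) aggregates tag-style lunge counts into lunges per hour of dive time and regresses that rate on maximum dive depth, where a negative slope would correspond to the reported pattern of higher feeding rates on shallower prey.

```python
# Hypothetical sketch (not the study's analysis): relate per-dive feeding rate
# (lunges per hour of dive time) to maximum dive depth from tag-derived records.
import numpy as np
from scipy import stats

# Each record: (max_dive_depth_m, n_lunges, dive_duration_s) -- illustrative values only.
dives = np.array([
    (35.0, 6, 300.0),
    (60.0, 5, 420.0),
    (120.0, 3, 600.0),
    (180.0, 2, 720.0),
])

depth = dives[:, 0]
lunges_per_hour = dives[:, 1] / (dives[:, 2] / 3600.0)  # convert dive duration to hours

# Simple linear regression of feeding rate on depth; a negative slope is
# consistent with higher feeding rates on shallower prey.
slope, intercept, r, p, se = stats.linregress(depth, lunges_per_hour)
print(f"slope = {slope:.2f} lunges/h per metre, r = {r:.2f}, p = {p:.3f}")
```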
Henipavirus Neutralising Antibodies in an Isolated Island Population of African Fruit Bats
Isolated islands provide valuable opportunities to study the persistence of viruses in wildlife populations, including population size thresholds such as the critical community size. The straw-coloured fruit bat, Eidolon helvum, has been identified as a reservoir for henipaviruses (serological evidence) and Lagos bat virus (LBV; virus isolation and serological evidence) in continental Africa. Here, we sampled from a remote population of E. helvum annobonensis fruit bats on Annobón island in the Gulf of Guinea to investigate whether antibodies to these viruses also exist in this isolated subspecies. Henipavirus serological analyses (Luminex multiplexed binding and inhibition assays, virus neutralisation tests and western blots) and lyssavirus serological analyses (LBV: modified Fluorescent Antibody Virus Neutralisation test, LBV and Mokola virus: lentivirus pseudovirus neutralisation assay) were undertaken on 73 and 70 samples, respectively. Given the isolation of fruit bats on Annobón and their lack of connectivity with other populations, it was expected that the population size on the island would be too small to allow persistence of viruses that are thought to cause acute and immunising infections. However, the presence of antibodies against henipaviruses was detected using the Luminex binding assay and confirmed using alternative assays. Neutralising antibodies to LBV were detected in one bat using both assays. We demonstrate clear evidence for exposure of multiple individuals to henipaviruses in this remote population of E. helvum annobonensis fruit bats on Annobón island. The situation is less clear for LBV. Seroprevalences to henipaviruses and LBV in Annobón are notably different to those in E. helvum in continental locations studied using the same sampling techniques and assays. Whilst cross-sectional serological studies in wildlife populations cannot provide details on viral dynamics within populations, valuable information on the presence or absence of viruses may be obtained and utilised for informing future studies.
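The seroprevalence comparisons mentioned above reduce to proportions with uncertainty. The sketch below shows one common way to express such a figure, a point estimate with a Wilson 95% confidence interval, using the 73 henipavirus samples reported in the abstract; the number of positives is a placeholder, not the study's result.

```python
# Illustrative only: seroprevalence with a Wilson 95% CI from assay counts.
# The positive count below is a placeholder, not the study's reported figure.
from statsmodels.stats.proportion import proportion_confint

n_tested_henipavirus = 73   # sample size reported in the abstract
n_positive = 30             # hypothetical number of Luminex-positive bats

prevalence = n_positive / n_tested_henipavirus
lo, hi = proportion_confint(n_positive, n_tested_henipavirus, alpha=0.05, method="wilson")
print(f"seroprevalence = {prevalence:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```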
Genomics for antimicrobial resistance surveillance to support infection prevention and control in health-care facilities
Integration of genomic technologies into routine antimicrobial resistance (AMR) surveillance in health-care facilities has the potential to generate rapid, actionable information for patient management and inform infection prevention and control measures in near real time. However, substantial challenges limit the implementation of genomics for AMR surveillance in clinical settings. Through a workshop series and online consultation, international experts from across the AMR and pathogen genomics fields convened to review the evidence base underpinning the use of genomics for AMR surveillance in a range of settings. Here, we summarise the identified challenges and potential benefits of genomic AMR surveillance in health-care settings, and outline the recommendations of the working group to realise this potential. These recommendations include the definition of viable and cost-effective use cases for genomic AMR surveillance, strengthening training competencies (particularly in bioinformatics), and building capacity at local, national, and regional levels using hub-and-spoke models.
Quantifying neutralising antibody responses against SARS-CoV-2 in dried blood spots (DBS) and paired sera
The ongoing SARS-CoV-2 pandemic was initially managed by non-pharmaceutical interventions such as diagnostic testing, isolation of positive cases, physical distancing and lockdowns. The advent of vaccines has provided crucial protection against SARS-CoV-2. Neutralising antibody (nAb) responses are a key correlate of protection, and therefore measuring nAb responses is essential for monitoring vaccine efficacy. Fingerstick dried blood spots (DBS) are ideal for use in large-scale sero-surveillance because they are inexpensive, offer the option of self-collection and can be transported and stored at ambient temperatures. Such advantages also make DBS appealing to use in resource-limited settings and in potential future pandemics. In this study, nAb responses in sera, venous blood and fingerstick blood stored on filter paper were measured. Samples were collected from SARS-CoV-2 acutely infected individuals, SARS-CoV-2 convalescent individuals and SARS-CoV-2 vaccinated individuals. Good agreement was observed between the nAb responses measured in eluted DBS and paired sera. Stability of nAb responses was also observed in sera stored on filter paper at room temperature for 28 days. Overall, this study provides support for the use of filter paper as a viable sample collection method to study nAb responses.
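As a hedged illustration of the kind of agreement analysis summarised above (not the study's actual statistical workflow), the Python sketch below compares paired neutralisation titres from two sample types using a rank correlation plus a Bland-Altman-style bias on log2 titres; all titre values are invented.

```python
# Hypothetical agreement check between paired measurements (e.g. DBS eluates vs sera).
# Titre values are placeholders; the study's own data are not reproduced here.
import numpy as np
from scipy import stats

serum_titre = np.array([40, 160, 320, 80, 640, 20], dtype=float)
dbs_titre   = np.array([40, 120, 320, 80, 480, 20], dtype=float)

# Rank correlation of paired titres.
rho, p = stats.spearmanr(serum_titre, dbs_titre)

# Bland-Altman-style bias on log2 titres (difference between methods).
log_diff = np.log2(dbs_titre) - np.log2(serum_titre)
bias = np.mean(log_diff)                      # mean log2-fold difference
loa = 1.96 * np.std(log_diff, ddof=1)         # limits-of-agreement half-width

print(f"Spearman rho = {rho:.2f} (p = {p:.3f}); bias = {bias:.2f} log2, LoA = ±{loa:.2f}")
```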
Global assessment of marine plastic exposure risk for oceanic birds
Plastic pollution is distributed patchily around the world’s oceans. Likewise, marine organisms that are vulnerable to plastic ingestion or entanglement have uneven distributions. Understanding where wildlife encounters plastic is crucial for targeting research and mitigation. Oceanic seabirds, particularly petrels, frequently ingest plastic, are highly threatened, and cover vast distances during foraging and migration. However, the spatial overlap between petrels and plastics is poorly understood. Here we combine marine plastic density estimates with individual movement data for 7137 birds of 77 petrel species to estimate relative exposure risk. We identify high exposure risk areas in the Mediterranean and Black seas, and the northeast Pacific, northwest Pacific, South Atlantic and southwest Indian oceans. Plastic exposure risk varies greatly among species and populations, and between breeding and non-breeding seasons. Exposure risk is disproportionately high for Threatened species. Outside the Mediterranean and Black seas, exposure risk is highest in the high seas and Exclusive Economic Zones (EEZs) of the USA, Japan, and the UK. Birds generally had higher plastic exposure risk outside the EEZ of the country where they breed. We identify conservation and research priorities, and highlight that international collaboration is key to addressing the impacts of marine plastic on wide-ranging species.
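The exposure-risk idea described above, overlapping where birds spend time with where plastic accumulates, can be sketched in a few lines. The Python below is a minimal assumed formulation, not the paper's method: two toy grids, each normalised to sum to one, multiplied cell-by-cell and summed to give a single relative exposure score.

```python
# Minimal sketch of a grid-overlap exposure score, assuming a plastic-density
# surface and a tracking-derived bird-density surface on the same grid.
# The toy grids and the exact weighting are illustrative only.
import numpy as np

plastic_density = np.array([[0.1, 0.4],
                            [0.8, 0.2]])      # relative plastic density per cell
bird_density = np.array([[0.5, 0.3],
                         [0.1, 0.1]])         # relative time birds spend per cell

# Normalise each surface so it sums to 1, then overlap cell-by-cell.
p = plastic_density / plastic_density.sum()
b = bird_density / bird_density.sum()
exposure_score = float((p * b).sum())         # higher = more co-occurrence

print(f"relative exposure risk score = {exposure_score:.3f}")
```

Comparing this score across species or seasons (with consistent grids) is what makes it a relative measure of exposure rather than an absolute one.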
Genetic mechanisms of critical illness in COVID-19.
Host-mediated lung inflammation is present¹, and drives mortality², in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development³. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10⁻⁸) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10⁻⁸) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 × 10⁻¹²) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10⁻⁸) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, are associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
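For readers unfamiliar with the Mendelian randomization step mentioned above, the sketch below shows a single-variant Wald ratio with a delta-method standard error, which is one standard way such gene-expression-to-disease estimates are formed. The effect sizes are placeholders, not values from the GenOMICC analysis.

```python
# Illustrative two-sample Mendelian randomization Wald ratio for a single variant,
# of the kind used to link gene expression (e.g. IFNAR2 or TYK2) to disease risk.
# The effect sizes below are placeholders, not estimates from the study.
import math

beta_exposure = 0.30      # SNP effect on expression of the candidate gene
se_exposure = 0.05
beta_outcome = -0.12      # SNP effect on log-odds of critical COVID-19
se_outcome = 0.04

# Wald ratio estimate and a first-order (delta-method) standard error.
wald = beta_outcome / beta_exposure
se_wald = math.sqrt(se_outcome**2 / beta_exposure**2
                    + (beta_outcome**2 * se_exposure**2) / beta_exposure**4)
z = wald / se_wald
print(f"causal estimate = {wald:.2f} (SE {se_wald:.2f}, z = {z:.2f})")
```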
Ten-year mortality, disease progression, and treatment-related side effects in men with localised prostate cancer from the ProtecT randomised controlled trial according to treatment received
Background
The ProtecT trial reported intention-to-treat analysis of men with localised prostate cancer randomly allocated to active monitoring (AM), radical prostatectomy, and external beam radiotherapy.
Objective
To report outcomes according to treatment received in men in randomised and treatment choice cohorts.
Design, setting, and participants
This study focuses on secondary care. Men with clinically localised prostate cancer at one of nine UK centres were invited to participate in the treatment trial comparing AM, radical prostatectomy, and radiotherapy.
Intervention
Two cohorts included 1643 men who agreed to be randomised and 997 who declined randomisation and chose treatment.
Outcome measurements and statistical analysis
Analysis was carried out to assess mortality, metastasis, and progression, and health-related quality-of-life impacts on urinary, bowel, and sexual function using patient-reported outcome measures. Analysis was based on comparisons between groups defined by treatment received for both randomised and treatment choice cohorts in turn, with pooled estimates of intervention effect obtained using meta-analysis (a minimal sketch of this pooling step follows this abstract). Differences were estimated with adjustment for known prognostic factors using propensity scores.
Results and limitations
According to treatment received, more men receiving AM died of PCa (AM 1.85%, surgery 0.67%, radiotherapy 0.73%), whilst this difference remained consistent with chance in the randomised cohort (p = 0.08); stronger evidence was found in the exploratory analyses (randomised plus choice cohort) when AM was compared with the combined radical treatment group (p = 0.003). There was also strong evidence that metastasis (AM 5.6%, surgery 2.4%, radiotherapy 2.7%) and disease progression (AM 20.35%, surgery 5.87%, radiotherapy 6.62%) were more common in the AM group. Compared with AM, there were higher risks of sexual dysfunction (95% at 6 mo) and urinary incontinence (55% at 6 mo) after surgery, and of sexual dysfunction (88% at 6 mo) and bowel dysfunction (5% at 6 mo) after radiotherapy. The key limitations are the potential for bias when comparing groups defined by treatment received and changes in the protocol for AM during the lengthy follow-up required in trials of screen-detected PCa.
Conclusions
Analyses according to treatment received showed increased rates of disease-related events and lower rates of patient-reported harms in men managed by AM compared with men managed by radical treatment, and stronger evidence of greater PCa mortality in the AM group.
Patient summary
More than 95 out of every 100 men with low- or intermediate-risk localised prostate cancer do not die of prostate cancer within 10 yr, irrespective of whether treatment is by means of monitoring, surgery, or radiotherapy. Sexual and bladder function are better preserved after active monitoring, but spread of prostate cancer is more common.
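As referenced in the statistical-analysis section above, here is a minimal sketch of a fixed-effect inverse-variance pooling step of the kind used to combine treatment-effect estimates from the randomised and treatment-choice cohorts. The log hazard ratios and standard errors are placeholders, and the trial's propensity-score adjustment is not reproduced here.

```python
# Hedged sketch of fixed-effect inverse-variance meta-analysis across two cohorts.
# The (log HR, SE) pairs below are placeholders, not the trial's estimates.
import math

estimates = [
    (math.log(2.0), 0.45),   # randomised cohort -- illustrative
    (math.log(2.8), 0.40),   # treatment-choice cohort -- illustrative
]

weights = [1.0 / se**2 for _, se in estimates]
pooled_log_hr = sum(w * est for (est, _), w in zip(estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

lo = math.exp(pooled_log_hr - 1.96 * pooled_se)
hi = math.exp(pooled_log_hr + 1.96 * pooled_se)
print(f"pooled HR = {math.exp(pooled_log_hr):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```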
Adjunctive rifampicin for Staphylococcus aureus bacteraemia (ARREST): a multicentre, randomised, double-blind, placebo-controlled trial.
BACKGROUND: Staphylococcus aureus bacteraemia is a common cause of severe community-acquired and hospital-acquired infection worldwide. We tested the hypothesis that adjunctive rifampicin would reduce bacteriologically confirmed treatment failure or disease recurrence, or death, by enhancing early S aureus killing, sterilising infected foci and blood faster, and reducing risks of dissemination and metastatic infection. METHODS: In this multicentre, randomised, double-blind, placebo-controlled trial, adults (≥18 years) with S aureus bacteraemia who had received ≤96 h of active antibiotic therapy were recruited from 29 UK hospitals. Patients were randomly assigned (1:1) via a computer-generated sequential randomisation list to receive 2 weeks of adjunctive rifampicin (600 mg or 900 mg per day according to weight, oral or intravenous) versus identical placebo, together with standard antibiotic therapy. Randomisation was stratified by centre. Patients, investigators, and those caring for the patients were masked to group allocation. The primary outcome was time to bacteriologically confirmed treatment failure or disease recurrence, or death (all-cause), from randomisation to 12 weeks, adjudicated by an independent review committee masked to the treatment. Analysis was intention to treat. This trial was registered, number ISRCTN37666216, and is closed to new participants. FINDINGS: Between Dec 10, 2012, and Oct 25, 2016, 758 eligible participants were randomly assigned: 370 to rifampicin and 388 to placebo. 485 (64%) participants had community-acquired S aureus infections, and 132 (17%) had nosocomial S aureus infections. 47 (6%) had meticillin-resistant infections. 301 (40%) participants had an initial deep infection focus. Standard antibiotics were given for 29 (IQR 18-45) days; 619 (82%) participants received flucloxacillin. By week 12, 62 (17%) of the 370 participants who received rifampicin versus 71 (18%) of the 388 who received placebo experienced treatment failure or disease recurrence, or died (absolute risk difference -1·4%, 95% CI -7·0 to 4·3; hazard ratio 0·96, 0·68-1·35, p=0·81). From randomisation to 12 weeks, no evidence of differences in serious (p=0·17) or grade 3-4 (p=0·36) adverse events was observed; however, 63 (17%) participants in the rifampicin group versus 39 (10%) in the placebo group had antibiotic or trial drug-modifying adverse events (p=0·004), and 24 (6%) versus six (2%) had drug interactions (p=0·0005). INTERPRETATION: Adjunctive rifampicin provided no overall benefit over standard antibiotic therapy in adults with S aureus bacteraemia. FUNDING: UK National Institute for Health Research Health Technology Assessment.
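For context on the primary result above, the sketch below recomputes an unadjusted Wald risk difference from the counts quoted in the abstract (62/370 vs 71/388). It is a back-of-envelope check only and will not exactly reproduce the trial's reported adjusted estimate of -1·4%.

```python
# Unadjusted risk difference with a Wald 95% CI from the counts in the abstract.
# This simple calculation will differ slightly from the trial's adjusted estimate.
import math

events_rif, n_rif = 62, 370
events_pbo, n_pbo = 71, 388

p1, p2 = events_rif / n_rif, events_pbo / n_pbo
rd = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n_rif + p2 * (1 - p2) / n_pbo)
lo, hi = rd - 1.96 * se, rd + 1.96 * se

print(f"risk difference = {rd:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
```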
Procalcitonin Is Not a Reliable Biomarker of Bacterial Coinfection in People With Coronavirus Disease 2019 Undergoing Microbiological Investigation at the Time of Hospital Admission
Admission procalcitonin measurements and microbiology results were available for 1040 hospitalized adults with coronavirus disease 2019 (from 48 902 included in the International Severe Acute Respiratory and Emerging Infections Consortium World Health Organization Clinical Characterisation Protocol UK study). Although procalcitonin was higher in bacterial coinfection, this was neither clinically significant (median [IQR], 0.33 [0.11–1.70] ng/mL vs 0.24 [0.10–0.90] ng/mL) nor diagnostically useful (area under the receiver operating characteristic curve, 0.56 [95% confidence interval, .51–.60]).
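The diagnostic claim above rests on an area under the receiver operating characteristic curve (AUROC). The sketch below shows how such a value is computed from procalcitonin measurements and coinfection labels; the data are invented, and the study's reported AUROC of 0.56 is mentioned only for reference.

```python
# Illustrative AUROC calculation: how well admission procalcitonin separates
# bacterial coinfection from no coinfection. Values are made up.
import numpy as np
from sklearn.metrics import roc_auc_score

procalcitonin_ng_ml = np.array([0.33, 0.10, 1.70, 0.24, 0.90, 0.11, 0.45, 0.20])
coinfection = np.array([1, 0, 1, 0, 0, 0, 1, 0])  # 1 = confirmed bacterial coinfection

auc = roc_auc_score(coinfection, procalcitonin_ng_ml)
print(f"AUROC = {auc:.2f}  (0.5 = no discrimination)")
```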
Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK:prospective observational cohort study
BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19, who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids; the proportion was higher among patients requiring critical care than among those who received ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for ≥80 years), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women. This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council.
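The adjusted odds ratios reported above come from regression modelling. As an assumed, simplified illustration (not the ISARIC analysis), the sketch below fits a logistic regression of corticosteroid receipt on an age-group indicator while adjusting for a single severity covariate, then exponentiates the coefficient to obtain an adjusted odds ratio. All column names and simulated data are hypothetical.

```python
# Sketch of how an adjusted odds ratio can be obtained: logistic regression of
# corticosteroid receipt on age group, adjusting for a crude severity marker.
# The simulated data frame is illustrative, not the ISARIC dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_70_79": rng.integers(0, 2, n),       # 1 if aged 70-79, else 0 (hypothetical)
    "critical_care": rng.integers(0, 2, n),   # crude severity marker (hypothetical)
})
# Simulate lower odds of receiving steroids in the older group.
logit = 1.0 - 0.3 * df["age_70_79"] + 0.8 * df["critical_care"]
df["steroids"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("steroids ~ age_70_79 + critical_care", data=df).fit(disp=False)
adjusted_or = np.exp(model.params["age_70_79"])
print(f"adjusted OR for age 70-79 = {adjusted_or:.2f}")
```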