Galaxy Zoo: Probabilistic Morphology through Bayesian CNNs and Active Learning
We use Bayesian convolutional neural networks and a novel generative model of Galaxy Zoo volunteer responses to infer posteriors for the visual morphology of galaxies. Our Bayesian CNNs can learn from galaxy images with uncertain labels and then, for previously unlabelled galaxies, predict the probability of each possible label. Our posteriors are well-calibrated (e.g. for predicting bars, we achieve coverage errors of 10.6% within 5 responses and 2.9% within 10 responses) and hence are reliable for practical use. Further, using our posteriors, we apply the active learning strategy BALD to request volunteer responses for the subset of galaxies which, if labelled, would be most informative for training our network. We show that training our Bayesian CNNs with active learning requires 35-60% fewer labelled galaxies, depending on the morphological feature being classified. By combining human and machine intelligence, Galaxy Zoo will be able to classify surveys of any conceivable scale on a timescale of weeks, providing massive and detailed morphology catalogues to support research into galaxy evolution...
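As a hedged illustration of the acquisition strategy named above (a minimal sketch, not the authors' code; the array shapes and function names are assumptions), the BALD score is the mutual information between a galaxy's label and the network weights, estimated from Monte Carlo dropout forward passes: it is large when the individual passes are each confident but disagree with one another.

```python
import numpy as np

def bald_score(mc_probs):
    """BALD: mutual information between the label and the model parameters.

    mc_probs: (T, K) array of T Monte Carlo dropout forward passes, each a
    predictive distribution over K possible labels for one galaxy.
    """
    mean_p = mc_probs.mean(axis=0)
    # Entropy of the averaged prediction (total predictive uncertainty)
    h_mean = -np.sum(mean_p * np.log(mean_p + 1e-12))
    # Average entropy of the individual passes (expected aleatoric part)
    mean_h = -np.mean(np.sum(mc_probs * np.log(mc_probs + 1e-12), axis=1))
    # The difference is the epistemic part: large when the passes disagree
    return h_mean - mean_h

def select_for_labelling(per_galaxy_mc_probs, n):
    """Return indices of the n galaxies whose labels would be most informative."""
    scores = np.array([bald_score(p) for p in per_galaxy_mc_probs])
    return np.argsort(scores)[::-1][:n]
```

Requesting volunteer responses only for the top-scoring galaxies is what lets the network reach the same performance with fewer labels.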
Establishing effective conservation management strategies for a poorly known endangered species: A case study using Australia’s night parrot (Pezoporus occidentalis)
An evidence-based approach to the conservation management of a species requires knowledge of that species’ status, distribution, ecology, and threats. Coupled with budgets for specific conservation strategies, this knowledge allows prioritisation of funding toward activities that maximise benefit for the species. However, many threatened species are poorly known, and determining which conservation strategies will deliver the greatest benefit is difficult. Such cases require approaches that allow decision-making under uncertainty. Here we used structured expert elicitation to estimate the likely benefit of potential management strategies for the Critically Endangered and, until recently, poorly known Night Parrot (Pezoporus occidentalis). Experts considered cat management the single most effective management strategy for the Night Parrot. However, a combination of protecting and actively managing existing intact Night Parrot habitat through management of grazing, controlling feral cats, and managing fire specifically to maintain Night Parrot habitat was thought to result in the greatest conservation gains. The most cost-effective strategies were thought to be fire management to maintain Night Parrot habitat, and intensive cat management using control methods that exploit local knowledge of cat movements and ecology. Protecting and restoring potentially suitable, but degraded, Night Parrot habitat was considered the least effective and least cost-effective strategy. These expert judgements provide an informed starting point for land managers implementing on-ground programs targeting the Night Parrot, and for those developing policy aimed at the species’ longer-term conservation. As a set of hypotheses, they should be implemented, assessed, and improved within an adaptive management framework that also considers the likely co-benefits of these strategies for other species and ecosystems. The broader methodology is applicable to conservation planning for other poorly known threatened species.
A new approach to treatment of resistant gram-positive infections: potential impact of targeted IV to oral switch on length of stay
BACKGROUND: Patients prescribed intravenous (IV) glycopeptides usually remain in hospital until completion of this treatment. Some of these patients could be discharged earlier if a switch to an oral antibiotic was made. This study was designed to identify the percentage of inpatients currently prescribed IV glycopeptides who could be discharged earlier if a switch to an oral agent was used, and to estimate the number of bed days that could be saved. We also aimed to identify the patient group(s) most likely to benefit, and to estimate the number of days of IV therapy that could be prevented in patients who remained in hospital. METHODS: Patients were included if they were prescribed an IV glycopeptide for 5 days or more. Predetermined IV to oral antibiotic switch criteria and discharge criteria were applied. A multiple logistic regression model was used to identify the characteristics of the patients most likely to be suitable for earlier discharge. RESULTS: Of 211 patients, 62 (29%) could have had a reduced length of stay if they were treated with a suitable oral antibiotic. This would have saved a total of 649 inpatient days (median 5 per patient; range 1–54). A further 31 patients (15%) could have switched to oral therapy as inpatients, thus avoiding IV line use. The patients most likely to be suitable for early discharge were those with skin and soft tissue infection, under the cardiology, cardiothoracic surgery, orthopaedics, general medical, plastic surgery and vascular specialities, with no high-risk comorbidity and fewer than five other regularly prescribed drugs. CONCLUSION: The need for glycopeptide therapy has a significant impact on length of stay. Effective targeting of oral antimicrobials could reduce the need for IV access, allow outpatient treatment and thus reduce the length of stay in patients with infections caused by antibiotic-resistant Gram-positive bacteria.
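To make the modelling step concrete, the sketch below fits a multiple logistic regression by gradient ascent on the log-likelihood. It is purely illustrative: the data, the single "skin and soft tissue infection" feature, and the function names are hypothetical and are not taken from the study.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Fit logistic regression weights by gradient ascent on the log-likelihood.

    X: (n, d) design matrix (first column should be 1s for the intercept).
    y: (n,) binary outcome, e.g. 1 = suitable for earlier discharge.
    """
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        w += lr * X.T @ (y - p) / len(y)   # score (gradient) step
    return w

def predict_proba(X, w):
    """Predicted probability of the outcome for each row of X."""
    return 1.0 / (1.0 + np.exp(-X @ w))
```

In practice one would use a regularised solver from a statistics package and report odds ratios exp(w) with confidence intervals; this toy version only shows the mechanics of relating patient characteristics to discharge suitability.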
Identification and single-base gene-editing functional validation of a cis-EPO variant as a genetic predictor for EPO-increasing therapies
Hypoxia-inducible factor prolyl hydroxylase inhibitors (HIF-PHIs) are currently under clinical development for treating anemia in chronic kidney disease (CKD), but it is important to monitor their cardiovascular safety. Genetic variants can be used as predictors to help inform the potential risk of adverse effects associated with drug treatments. We therefore aimed to use human genetics to help assess the risk of adverse cardiovascular events associated with therapeutically altered EPO levels to help inform clinical trials studying the safety of HIF-PHIs. By performing a genome-wide association meta-analysis of EPO (n = 6,127), we identified a cis-EPO variant (rs1617640) lying in the EPO promoter region. We validated this variant as most likely causal in controlling EPO levels by using genetic and functional approaches, including single-base gene editing. Using this variant as a partial predictor for therapeutic modulation of EPO and large genome-wide association data in Mendelian randomization tests, we found no evidence (at p < 0.05) that genetically predicted long-term rises in endogenous EPO, equivalent to a 2.2-unit increase, increased risk of coronary artery disease (CAD, OR [95% CI] = 1.01 [0.93, 1.07]), myocardial infarction (MI, OR [95% CI] = 0.99 [0.87, 1.15]), or stroke (OR [95% CI] = 0.97 [0.87, 1.07]). We could exclude an increase in the odds of cardiovascular disease greater than 1.15 per 2.2-unit rise in EPO. A combination of genetic and functional studies provides a powerful approach to investigate the potential therapeutic profile of EPO-increasing therapies for treating anemia in CKD.
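As a hedged sketch of the Mendelian-randomization arithmetic (the input numbers in the test are made up, not the study's estimates), a single-instrument analysis uses the Wald ratio: the variant's effect on the outcome divided by its effect on the exposure (EPO), with the causal odds ratio obtained by exponentiating.

```python
import math

def wald_ratio(beta_exposure, beta_outcome, se_outcome):
    """Single-instrument MR estimate with a first-order (delta-method) SE.

    beta_exposure: per-allele effect of the variant on EPO levels.
    beta_outcome:  per-allele log-odds effect of the variant on the disease.
    """
    estimate = beta_outcome / beta_exposure   # causal log-odds per EPO unit
    se = abs(se_outcome / beta_exposure)      # ignores uncertainty in beta_exposure
    return estimate, se

def to_odds_ratio(log_odds, se):
    """Convert a log-odds estimate and its SE into an OR with a 95% CI."""
    return (math.exp(log_odds),
            math.exp(log_odds - 1.96 * se),
            math.exp(log_odds + 1.96 * se))
```

A confidence interval spanning 1.0, as for CAD, MI, and stroke above, is what "no evidence of increased risk" means in this framework, while the interval's upper bound is what lets the authors exclude odds ratios above 1.15.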
Clamp loader ATPases and the evolution of DNA replication machinery
Clamp loaders are pentameric ATPases of the AAA+ family that operate to ensure processive DNA replication. They do so by loading onto DNA the ring-shaped sliding clamps that tether the polymerase to the DNA. Structural and biochemical analysis of clamp loaders has shown how, despite differences in composition across the different branches of life, all clamp loaders undergo the same concerted conformational transformations. These transformations generate a binding surface for the open clamp and an internal spiral chamber into which the DNA at the replication fork can slide, triggering ATP hydrolysis, release of the clamp loader, and closure of the clamp around the DNA. We review here the current understanding of the clamp loader mechanism and discuss the implications of the differences between clamp loaders from the different branches of life.
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). 
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707.
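To illustrate the mechanics of the primary analysis model described above (a sketch only: the cutpoints and linear-predictor values are invented, not trial estimates), a cumulative logistic model maps ordered cutpoints and a linear predictor to probabilities over an ordered outcome such as organ support-free days, with an odds ratio above 1 shifting probability mass toward better categories.

```python
import numpy as np

def cumulative_logit_probs(cutpoints, eta):
    """Category probabilities for an ordered outcome under a cumulative
    logistic model: P(Y <= k) = sigmoid(c_k - eta).

    cutpoints: increasing array of K-1 cutpoints for K ordered categories.
    eta: linear predictor; larger eta (odds ratio > 1) shifts probability
         mass toward higher (better) categories.
    """
    cutpoints = np.asarray(cutpoints, dtype=float)
    cdf = 1.0 / (1.0 + np.exp(-(cutpoints - eta)))
    cdf = np.concatenate(([0.0], cdf, [1.0]))  # anchor the two extremes
    return np.diff(cdf)  # one probability per ordered category
```

The trial's actual analysis placed a prior over such parameters and reported the posterior probability that the treatment odds ratio exceeds 1, which is the "probability of superiority" quoted in the abstract.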