
    What Drives the Intensification of Mesoscale Convective Systems over the West African Sahel under Climate Change?

    Extreme rainfall is expected to increase under climate change, carrying potential socioeconomic risks. However, the magnitude of the increase is uncertain. Over recent decades, extreme storms over the West African Sahel have increased in frequency, with increased vertical wind shear shown to be a cause. Drier midlevels, stronger cold pools, and increased storm organization have also been observed. Global models do not capture the potential effects of lower- to midtropospheric wind shear or cold pools on storm organization, since they parameterize convection. Here we use the first convection-permitting simulations of African climate change to understand how changes in thermodynamics and storm dynamics affect future extreme Sahelian rainfall. The model, which simulates warming under representative concentration pathway 8.5 (RCP8.5) to the end of the twenty-first century, projects a 28% increase in the extreme rain rate of mesoscale convective systems (MCSs). The Sahel moisture change on average follows Clausius–Clapeyron scaling, but with regional heterogeneity. Rain rates scale with the product of time-of-storm total column water (TCW) and in-storm vertical velocity. Additionally, prestorm wind shear and convective available potential energy both modulate in-storm vertical velocity. Although wind shear affects cloud-top temperatures within our model, it has no direct correlation with precipitation rates. In our model, the projected future increase in TCW is the primary explanation for increased rain rates. Finally, although colder cold pools are modeled in the future climate, we see no significant change in near-surface winds, highlighting avenues for future research on convection-permitting modeling of storm dynamics.
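
The Clausius–Clapeyron scaling referenced in the abstract can be illustrated with a short sketch; the Magnus approximation and the 30 °C reference temperature are illustrative choices, not taken from the paper:

```python
import math

def e_sat(T_celsius):
    """Saturation vapour pressure (hPa) from the Magnus approximation."""
    return 6.112 * math.exp(17.62 * T_celsius / (243.12 + T_celsius))

# Fractional increase in saturation vapour pressure per 1 K of warming
# at a Sahel-like surface temperature of 30 degrees C.
T = 30.0
cc_rate = (e_sat(T + 1.0) - e_sat(T)) / e_sat(T)
print(f"Clausius-Clapeyron scaling: {100 * cc_rate:.1f}% per K")
```

The atmosphere's moisture-holding capacity rises by roughly 6–7% per kelvin of warming, which is the benchmark against which projected moisture (and hence rain-rate) increases are usually judged.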

    Detection of mixed infection from bacterial whole genome sequence data allows assessment of its role in Clostridium difficile transmission

    Bacterial whole genome sequencing offers the prospect of rapid and high-precision investigation of infectious disease outbreaks. Close genetic relationships between microorganisms isolated from different infected cases suggest that transmission is a strong possibility, whereas transmission between cases with genetically distinct bacterial isolates can be excluded. However, undetected mixed infections (infection with ≥2 unrelated strains of the same species, where only one is sequenced) potentially impair the exclusion of transmission with certainty, and may therefore limit the utility of this technique. We investigated the problem by developing a computationally efficient method for detecting mixed infection without the need for resource-intensive independent sequencing of multiple bacterial colonies. Given the relatively low density of single nucleotide polymorphisms within bacterial sequence data, direct reconstruction of mixed-infection haplotypes from current short-read sequence data is not consistently possible. We therefore use a two-step maximum likelihood-based approach, assuming each sample contains up to two infecting strains. We jointly estimate the proportion of the infection arising from the dominant and minor strains, and the sequence divergence between these strains. In cases where mixed infection is confirmed, the dominant and minor haplotypes are then matched to a database of previously sequenced local isolates. We demonstrate the performance of our algorithm with in silico and in vitro mixed infection experiments, and apply it to transmission of an important healthcare-associated pathogen, Clostridium difficile. Using hospital ward movement data in a previously described stochastic transmission model, 15 pairs of cases enriched for likely transmission events associated with mixed infection were selected. Our method identified four previously undetected mixed infections, and a previously undetected transmission event, but no direct transmission between the pairs of cases under investigation. These results demonstrate that mixed infections can be detected without additional sequencing effort, which will be important in assessing the extent of cryptic transmission in our hospitals.
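
As a rough illustration of the binomial likelihood underlying this kind of mixture estimate (a simplified sketch on simulated read counts, not the authors' two-step method, which also jointly estimates between-strain divergence):

```python
import math

def log_binom(k, n, p):
    """Binomial log-likelihood (the constant choose term is omitted)."""
    p = min(max(p, 1e-9), 1 - 1e-9)
    return k * math.log(p) + (n - k) * math.log(1 - p)

def estimate_minor_fraction(sites):
    """Grid-search MLE of the minor-strain fraction from (alt_reads, depth)
    pairs at sites where the two strains are assumed to differ."""
    best_eps, best_ll = None, -math.inf
    for step in range(1, 50):          # candidate fractions 1%..49%
        eps = step / 100.0
        ll = sum(log_binom(k, n, eps) for k, n in sites)
        if ll > best_ll:
            best_eps, best_ll = eps, ll
    return best_eps

# Simulated mixed sample: ~20% minor strain at 5 divergent sites, depth 100.
sites = [(18, 100), (22, 100), (19, 100), (25, 100), (17, 100)]
print(estimate_minor_fraction(sites))  # -> 0.2
```

The key point is that consistent intermediate allele fractions across many sites are far more likely under a two-strain mixture than under sequencing error alone.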

    Effects on early monsoon rainfall in West Africa due to recent deforestation in a convection-permitting ensemble

    Tropical deforestation can have a significant effect on climate, but research attention has been directed mostly towards Amazonian deforestation. The southern part of West Africa (a region dependent on rain-fed agriculture and vulnerable to droughts and flooding) has seen significant deforestation since the 1950s. Many previous tropical deforestation studies have used idealized and exaggerated deforestation scenarios and parameterized-convection models. In this study we estimate, for the first time, realistic historical deforestation in West Africa from the Land-Use Harmonization dataset, and simulate its impacts in a 5 d ensemble forecast in June using a convection-permitting regional climate model. We find that sensible heat flux increases at the expense of latent heat flux in most deforested regions, and rainfall increases by an average of 8.4 % over deforested pixels from 18:00–06:00 UTC, whereas changes are much less pronounced during the day. Over large areas of deforestation approx. 300 km inland (e.g. west Guinea), roughness-length and thermally enhanced convergence during the afternoon and evening occurs over the deforested areas, resulting in increases in rainfall with little impact from reduced daytime humidity. In areas of coastal deforestation (e.g. Côte d'Ivoire), increased winds drive the sea-breeze convection inland, resulting in evening rainfall reductions over the deforested area but increases further inland, in line with observations. We suggest our results would not be replicated in parameterized-convection models, which are known to struggle with capturing peak convective activity in the late afternoon and long-lived nocturnal rainfall, and with reproducing observed surface–rainfall feedbacks.

    Multicohort cross-sectional study of cognitive and behavioural digital biomarkers in neurodegeneration: the Living Lab Study protocol

    INTRODUCTION AND AIMS: Digital biomarkers can provide a cost-effective, objective and robust measure of neurological disease progression, changes in care needs and the effect of interventions. Motor function, physiology and behaviour can provide informative measures of neurological conditions and neurodegenerative decline. New digital technologies present an opportunity to provide remote, high-frequency monitoring of patients from within their homes. The purpose of the Living Lab study is to develop novel digital biomarkers of functional impairment in those living with neurodegenerative disease (NDD) and neurological conditions. METHODS AND ANALYSIS: The Living Lab study is a cross-sectional observational study of cognition and behaviour in people living with NDDs and other, non-degenerative neurological conditions. Patients (n≥25 for each patient group) with dementia, Parkinson's disease, amyotrophic lateral sclerosis, mild cognitive impairment, traumatic brain injury and stroke, along with controls (n≥60), will be pragmatically recruited. Patients will carry out activities of daily living and functional assessments within the Living Lab. The Living Lab is an apartment-laboratory containing a functional kitchen, bathroom, bed and living area to provide a controlled environment in which to develop novel digital biomarkers. The Living Lab provides an important intermediary stage between the conventional laboratory and the home. Multiple passive environmental sensors, internet-enabled medical devices, wearables and electroencephalography (EEG) will be used to characterise functional impairments of NDDs and non-NDD conditions. We will also relate these digital technology measures to clinical and cognitive outcomes. ETHICS AND DISSEMINATION: Ethical approvals have been granted by the Imperial College Research Ethics Committee (reference number: 21IC6992). Results from the study will be disseminated at conferences and in peer-reviewed journals.

    Application of machine learning techniques to tuberculosis drug resistance analysis

    MOTIVATION: Timely identification of Mycobacterium tuberculosis (MTB) resistance to existing drugs is vital to decrease mortality and prevent the amplification of existing antibiotic resistance. Machine learning methods have been widely applied for timely prediction of MTB resistance to a specific drug and for identifying resistance markers. However, they have not been validated on a large cohort of MTB samples from multiple centres across the world, in terms of either resistance prediction or resistance marker identification. Several machine learning classifiers and linear dimension reduction techniques were developed and compared for a cohort of 13 402 isolates collected from 16 countries across 6 continents and tested against 11 drugs. RESULTS: Compared to the conventional molecular diagnostic test, the area under the curve of the best machine learning classifier increased for all drugs, notably by 23.11%, 15.22% and 10.14% for pyrazinamide, ciprofloxacin and ofloxacin, respectively (P < 0.01). Logistic regression and gradient tree boosting were found to perform better than other techniques. Moreover, logistic regression/gradient tree boosting with a sparse principal component analysis/non-negative matrix factorization step, compared with the classifier alone, enhanced the best performance in terms of F1-score by 12.54%, 4.61%, 7.45% and 9.58% for amikacin, moxifloxacin, ofloxacin and capreomycin, respectively, as well as increasing the area under the curve for amikacin and capreomycin. The results provide a comprehensive comparison of the various techniques and confirm the applicability of machine learning to better prediction on large, diverse tuberculosis data. Furthermore, mutation ranking showed the possibility of finding new resistance/susceptibility markers. AVAILABILITY AND IMPLEMENTATION: The source code can be found at http://www.robots.ox.ac.uk/ davidc/code.php. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
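
The "dimension reduction followed by a classifier" combination the abstract describes can be sketched generically in scikit-learn; the binary mutation features below are simulated, and none of this is the authors' code:

```python
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Toy stand-in for the mutation data: 200 isolates x 50 binary mutation
# features; "resistance" is driven by two marker mutations.
X = rng.integers(0, 2, size=(200, 50)).astype(float)
y = ((X[:, 0] + X[:, 3]) >= 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(SparsePCA(n_components=10, random_state=0),
                      LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC: {auc:.2f}")
```

In the paper's setting the sparse components also aid interpretation, since each component loads on only a few mutations.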

    Machine learning for classifying tuberculosis drug-resistance from DNA sequencing data

    Motivation: Correct and rapid determination of Mycobacterium tuberculosis (MTB) resistance against available tuberculosis (TB) drugs is essential for the control and management of TB. The conventional molecular diagnostic test assumes that the presence of any well-studied single nucleotide polymorphism is sufficient to cause resistance, which yields low sensitivity for resistance classification. Methods: Given the availability of DNA sequencing data from MTB, we developed machine learning models for a cohort of 1839 UK bacterial isolates to classify MTB resistance against eight anti-TB drugs (isoniazid, rifampicin, ethambutol, pyrazinamide, ciprofloxacin, moxifloxacin, ofloxacin, streptomycin) and to classify multidrug resistance. Results: Compared to the previous rules-based approach, the sensitivities of the best-performing models increased by 2–4% to 97% for isoniazid, rifampicin and ethambutol (p < 0.01); for ciprofloxacin and multidrug-resistant TB, they increased to 96%. For moxifloxacin and ofloxacin, sensitivities increased by 12% and 15%, from 83% and 81% based on existing known resistance alleles to 95% and 96% (p < 0.01), respectively. In particular, our models improved sensitivities over the previous rules-based approach by 15% and 24%, to 84% and 87%, for pyrazinamide and streptomycin (p < 0.01), respectively. The best-performing models increase the area under the ROC curve by 10% for pyrazinamide and streptomycin (p < 0.01), and by 4–8% for the other drugs (p < 0.01). Availability: The details of the source code are provided at http://www.robots.ox.ac.uk/davidc/code.ph
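
Why the rules-based baseline yields low sensitivity can be seen in a toy sketch; the catalogue, isolates and phenotypes below are illustrative, not data from the study:

```python
def sensitivity(y_true, y_pred):
    """True-positive rate: fraction of resistant isolates called resistant."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

# Rules-based call: resistant if ANY catalogued resistance allele is present.
catalogue = {"rpoB_S450L", "katG_S315T"}          # illustrative catalogue
def rules_based_call(mutations):
    return int(bool(catalogue & mutations))

isolates = [{"rpoB_S450L"}, {"katG_S315T", "embB_M306V"},
            {"pncA_H57D"}, set()]                  # illustrative genotypes
phenotype = [1, 1, 1, 0]   # the third isolate is resistant but uncatalogued
calls = [rules_based_call(m) for m in isolates]
print(sensitivity(phenotype, calls))  # 2 of 3 resistant isolates detected
```

Any resistance mechanism absent from the catalogue is missed by the rules, which is the gap the learned models close.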

    Antimicrobial resistance determinants are associated with Staphylococcus aureus bacteraemia and adaptation to the healthcare environment: a bacterial genome-wide association study

    Staphylococcus aureus is a major bacterial pathogen in humans, and a dominant cause of severe bloodstream infections. Globally, antimicrobial resistance (AMR) in S. aureus remains challenging. While human risk factors for infection have been defined, contradictory evidence exists for the role of bacterial genomic variation in S. aureus disease. To investigate the contribution of bacterial lineage and genomic variation to the development of bloodstream infection, we undertook a genome-wide association study comparing bacteria from 1017 individuals with bacteraemia to 984 adults with asymptomatic S. aureus nasal carriage. Within the 984 carriage isolates, we also compared healthcare-associated (HA) carriage with community-associated (CA) carriage. All major global lineages were represented in both bacteraemia and carriage, with no evidence for different infection rates. However, k-mers tagging the trimethoprim resistance-conferring mutation F99Y in dfrB were significantly associated with bacteraemia versus carriage (P = 10^-8.9 to 10^-9.3). Pooling variation within genes, bacteraemia versus carriage was associated with the presence of mecA (HMP = 10^-5.3) as well as the presence of SCCmec (HMP = 10^-4.4). Among S. aureus carriers, no lineages were associated with HA versus CA carriage. However, we found a novel signal of HA versus CA carriage in the foldase protein prsA, where k-mers representing the conserved sequence allele were associated with CA carriage (P = 10^-7.1 to 10^-19.4), while in gyrA, a ciprofloxacin resistance-conferring mutation, L84S, was associated with HA carriage (P = 10^-7.2). In an extensive study of S. aureus bacteraemia and nasal carriage in the UK, we found strong evidence that all S. aureus lineages are equally capable of causing bloodstream infection, and of being carried in the healthcare environment. Genomic variation in the foldase protein prsA is a novel genomic marker of healthcare origin in S. aureus but was not associated with bacteraemia. AMR determinants were associated with both bacteraemia and healthcare-associated carriage, suggesting that AMR increases the propensity not only to survive in healthcare environments, but also to cause invasive disease.
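
At their core, k-mer association tests of this kind compare presence/absence counts in a 2x2 table. A sketch using the study's cohort sizes but entirely hypothetical k-mer counts:

```python
import math
from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one k-mer: isolates carrying the k-mer among
# bacteraemia cases vs asymptomatic carriage (cohort sizes from the study).
#                 k-mer present   k-mer absent
# bacteraemia          180             837     (1017 total)
# carriage              95             889     ( 984 total)
odds_ratio, p = fisher_exact([[180, 837], [95, 889]])
print(f"OR = {odds_ratio:.2f}, -log10(p) = {-math.log10(p):.1f}")
```

Real bacterial GWAS additionally corrects for lineage structure (e.g. with mixed models), since clonal population structure would otherwise confound such raw tables.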

    Erythrocyte membrane anionic charge in type 2 diabetic patients with retinopathy

    BACKGROUND: The Steno hypothesis states that changes in basement membrane anionic charge lead to diabetic microvascular complications. In diabetic nephropathy, loss of basement membrane glycosaminoglycans and an association between glomerular basement membrane heparan sulphate and proteinuria have been documented. A correlation between erythrocyte surface charge and glomerular capillary wall charge has also been observed. The aim of this study was to evaluate the relationship between retinopathy and both erythrocyte anionic charge and urinary glycosaminoglycan excretion in type 2 diabetic patients. METHODS: 49 subjects (58 ± 7 yrs, M/F 27/22) with type 2 diabetes, with proliferative retinopathy (n = 13), nonproliferative retinopathy (n = 13) or without retinopathy (n = 23), were included in the study. 38 healthy subjects were selected as the control group (57 ± 5 yrs, M/F 19/19). Erythrocyte anionic charge (EAC) was determined by binding of the cationic dye alcian blue. Urinary glycosaminoglycan (GAG) and microalbumin excretion were measured. RESULTS: EAC was significantly decreased in diabetic patients with retinopathy (255 ± 30 vs 312 ± 30 ng alcian blue/10^6 RBC for the diabetic and control groups, respectively; p < 0.001). We did not observe an association between urinary GAG or microalbumin excretion and diabetic retinopathy. EAC was found to be negatively correlated with microalbuminuria in all groups. CONCLUSIONS: We conclude that low erythrocyte anionic charge is associated with diabetic retinopathy in type 2 diabetic patients. Reduction of the negative charge of basement membranes may indicate general changes in the microvasculature rather than retinopathy alone. Larger prospective studies are needed to clarify the role of glycosaminoglycans in the progression of retinopathy in type 2 diabetic patients.
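
The reported group difference can be sanity-checked with Welch's t statistic computed from the summary statistics; the retinopathy group size used here (n = 26, the two retinopathy subgroups combined) is an assumption about which groups were compared:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic from group means, SDs and sizes."""
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)
    return (m1 - m2) / se

# EAC from the abstract: 255 +/- 30 (retinopathy, n = 26 assumed)
# vs 312 +/- 30 (controls, n = 38), in ng alcian blue / 10^6 RBC.
t = welch_t(255, 30, 26, 312, 30, 38)
print(f"t = {t:.1f}")
```

A |t| well above 3 with these group sizes is consistent with the reported p < 0.001.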