12 research outputs found

    The recent history of an insular bat population reveals an environmental disequilibrium and conservation concerns

    The global COVID-19 pandemic has highlighted the putative threats associated with increasing contact between wild animals, including bats, and human populations. Bats are indeed known to carry several zoonoses, but at the same time many species currently face the risk of extinction. In this context, being able to monitor the long-term evolution of bat populations and to predict future potential contact with humans has important implications for conservation and public health. In this study, we attempt to demonstrate the usefulness of a small-scale paleobiological approach to track the evolution of an insular population of Antillean fruit-eating bats (Brachyphylla cavernarum), known to carry zoonoses, by documenting the temporal evolution of a cave roosting site and its colony of approximately 250 000 bats. To do so, we conducted a stratigraphic analysis of the sedimentary infilling of the cave, as well as a taphonomic and paleobiological analysis of the bone content of the sediment. Additionally, we performed a neotaphonomic study of an assemblage of scats produced by cats that had consumed bats on-site. Our results reveal the effects of human-induced environmental disturbances, as well as of conservation policies, on the bat colony. They also show that the roosting site is currently filling at a very fast pace, which may lead to the displacement of the bat colony and to increased contact between bats and human populations in the near future. Our research outcomes advocate for greater consideration of retrospective paleobiological data in addressing conservation questions related to bat populations.

    Evolution within a given virulence phenotype (pathotype) is driven by changes in aggressiveness: a case study of French wheat leaf rust populations

    Plant pathogens are constantly evolving and adapting to their environment, including their host. Virulence alleles emerge, increase in frequency, and sometimes later decline within pathogen populations in response to the fluctuating selection pressures imposed by the deployment of resistance genes. In some cases, these strong selection pressures cannot fully explain the evolution observed in pathogen populations. A previous study of the French population of Puccinia triticina, the causal agent of wheat leaf rust, showed that two major pathotypes (groups of isolates with a particular combination of virulences) predominated but then declined over the 2005-2016 period. The dynamics and dominance of these two pathotypes, 166 317 0 and 106 314 0, over the other pathotypes present in the population at low frequency although compatible (i.e. virulent on several of the varieties deployed) could not be explained solely by the frequency of Lr genes in the landscape. Within these two pathotypes, we identified two main genotypes that emerged in succession. We assessed three components of aggressiveness (infection efficiency, latency period and sporulation capacity) for 44 isolates representative of the four P. triticina pathotype-genotype combinations. We showed, for both pathotypes, that the more recent genotypes were more aggressive than the older ones. Our findings were highly consistent across the components of aggressiveness for pathotype 166 317 0 grown on Michigan Amber, a ‘naive’ cultivar never grown in the landscape, or on Apache, a ‘neutral’ cultivar that does not affect pathotype frequencies in the landscape and was therefore postulated to exert no or only a minor selection effect on the population composition. For pathotype 106 314 0, the most recent genotype had a shorter latency period on several of the cultivars most frequently grown in the landscape, but not on the ‘neutral’ and ‘naive’ cultivars. We conclude that quantitative components of aggressiveness can be significant drivers of evolution in pathogen populations. A gain in aggressiveness halted the decline in frequency of a pathotype and subsequently allowed it to increase in frequency within the pathogen population, providing evidence that adaptation to a changing varietal landscape not only affects virulence but can also lead to changes in aggressiveness.
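
    The sketch below is a minimal illustration, not the authors' statistical pipeline, of how one aggressiveness component (the latency period) could be compared between an older and a more recent genotype of the same pathotype. All data values, variable names, and the choice of a one-sided Mann-Whitney test are assumptions made purely for illustration.

    ```python
    # Illustrative sketch (not the study's analysis code): comparing one
    # aggressiveness component, the latency period, between an "old" and a
    # "recent" genotype within one pathotype. The numbers are made up.
    from scipy import stats

    # Hypothetical latency periods (hours from inoculation to sporulation)
    # measured on replicate leaves for the two genotypes.
    latency_old_genotype = [182, 178, 185, 190, 181, 187]
    latency_recent_genotype = [170, 168, 174, 172, 169, 175]

    # A shorter latency period indicates higher aggressiveness, so the
    # one-sided test asks whether the recent genotype's latency is lower.
    stat, p_value = stats.mannwhitneyu(
        latency_recent_genotype, latency_old_genotype, alternative="less"
    )
    print(f"Mann-Whitney U = {stat:.1f}, one-sided p = {p_value:.4f}")
    ```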

    The Effect of Time and Method of Storage on the Chemical Composition, Pepsin-Cellulase Digestibility, and Near-Infrared Spectra of Whole-Maize Forage

    No full text
    This study examined the effects of long-term storage conditions on the chemical composition, pepsin-cellulase dry matter digestibility (PCDMD), and visible (VIS)/near-infrared (NIR) spectra of forage. Eighteen samples of different whole-crop maize varieties originally harvested in 1987 were used. After drying, these samples were analyzed in the laboratory for ash, crude protein (CP), structural carbohydrates, total soluble carbohydrates (TSC), starch and PCDMD, and the remaining samples were stored either frozen (at −20 °C) or at barn temperature (ambient temperatures ranging from −8.5 °C to 27.1 °C). In 2016, the samples were analyzed again for ash, CP, structural carbohydrates, TSC, starch and PCDMD, and the VIS/NIR spectra of samples from both storage methods were obtained. Chemical composition and PCDMD analyses revealed significant differences (p < 0.05) between the storage methods for TSC but not for the other parameters (p > 0.05). The post-harvest analyses from 1987 were then compared with those from 2016: TSC was higher (p < 0.05) and ash content lower (p < 0.05) in 2016, whereas no significant differences were found for starch and PCDMD. Marked differences between the VIS/NIR spectra of the two storage methods were observed in the VIS segment, particularly between 630 and 760 nm. We concluded that storing dry forage samples at ambient temperature for a very long time (29 years) did not change their nutritive value compared with the values obtained before storage.
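
    As a minimal sketch of how a spectral comparison restricted to the reported 630-760 nm band could be quantified, the snippet below computes the mean absolute difference between two spectra within that window. The wavelength grid and the spectra themselves are synthetic placeholders, not the study's data, and this is not the study's chemometric workflow.

    ```python
    # Illustrative only: quantify how much two VIS/NIR spectra differ within
    # the 630-760 nm band, where the study reports the largest divergence
    # between storage methods. All values below are invented placeholders.
    import numpy as np

    wavelengths = np.arange(400, 2500, 2)  # nm, hypothetical measurement grid
    spectrum_frozen = np.random.default_rng(0).normal(0.5, 0.02, wavelengths.size)
    spectrum_barn = spectrum_frozen + 0.03 * np.exp(
        -((wavelengths - 695) ** 2) / (2 * 30**2)  # synthetic bump near 695 nm
    )

    band = (wavelengths >= 630) & (wavelengths <= 760)  # the reported VIS segment
    mean_abs_diff = np.mean(np.abs(spectrum_barn[band] - spectrum_frozen[band]))
    print(f"Mean absolute reflectance difference in 630-760 nm: {mean_abs_diff:.4f}")
    ```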

    Increased dosage of mammalian Sir2 in pancreatic beta cells enhances glucose-stimulated insulin secretion in mice

    No full text
    Sir2 NAD-dependent deacetylases connect transcription, metabolism, and aging. Increasing the dosage or activity of Sir2 extends life span in yeast, worms, and flies and promotes fat mobilization and glucose production in mammalian cells. Here we show that increased dosage of Sirt1, the mammalian Sir2 ortholog, in pancreatic beta cells improves glucose tolerance and enhances insulin secretion in response to glucose in beta cell-specific Sirt1-overexpressing (BESTO) transgenic mice. This phenotype is maintained as BESTO mice age. Pancreatic perfusion experiments further demonstrate that Sirt1 enhances insulin secretion in response to glucose and KCl. Microarray analyses of beta cell lines reveal that Sirt1 regulates genes involved in insulin secretion, including uncoupling protein 2 (Ucp2). Isolated BESTO islets also have reduced Ucp2, increased ATP production, and enhanced insulin secretion during glucose and KCl stimulation. These findings establish the importance of Sirt1 in beta cell function in vivo and suggest therapeutic interventions for type 2 diabetes.

    Allogeneic transplantation in advanced cutaneous T-cell lymphomas (CUTALLO): a propensity score matched controlled prospective study

    No full text

    Healthcare-associated infections in patients with severe COVID-19 supported with extracorporeal membrane oxygenation: a nationwide cohort study

    No full text
    Background: Both critically ill patients with coronavirus disease 2019 (COVID-19) and patients receiving extracorporeal membrane oxygenation (ECMO) support exhibit a high incidence of healthcare-associated infections (HAI). However, data on the incidence, microbiology, resistance patterns, and impact of HAI on outcomes in patients receiving ECMO for severe COVID-19 remain limited. We aimed to report HAI incidence and microbiology in patients receiving ECMO for severe COVID-19 and to evaluate the impact of ECMO-associated infections (ECMO-AI) on in-hospital mortality. Methods: We analyzed data from 701 patients included in the ECMOSARS registry, which enrolled COVID-19 patients supported by ECMO in France. Results: Among the 602 analyzed patients for whom HAI and hospital mortality data were available, 214 (36%) had ECMO-AI, corresponding to an incidence rate of 27 ECMO-AI per 1000 ECMO days at risk. Of these, 154 patients had bloodstream infections (BSI) and 117 had ventilator-associated pneumonia (VAP). The responsible microorganisms were Enterobacteriaceae (34% for BSI and 48% for VAP), Enterococcus species (25% and 6%, respectively) and non-fermenting Gram-negative bacilli (13% and 20%, respectively). Fungal infections were also observed (10% for BSI and 3% for VAP), as were multidrug-resistant organisms (21% and 15%, respectively). In a Cox multistate model, ECMO-AI were not associated with hospital death (HR = 1.00, 95% CI [0.79–1.26], p = 0.986). Conclusions: In a nationwide cohort of COVID-19 patients receiving ECMO support, we observed a high incidence of ECMO-AI, but ECMO-AI were not associated with hospital death. Trial registration: NCT04397588 (May 21, 2020).
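
    As a quick arithmetic check of the reported incidence rate, the snippet below computes events per 1000 days at risk. The event count (214) is taken from the abstract; the total number of ECMO days at risk is not given there, so the figure used is an assumed round value chosen only to reproduce the reported rate of about 27 per 1000 ECMO days.

    ```python
    # Back-of-the-envelope check (not the registry's analysis code):
    # an incidence rate per 1000 days at risk is events / days_at_risk * 1000.
    patients_with_ecmo_ai = 214      # reported in the abstract
    total_ecmo_days_at_risk = 7_900  # assumption, not stated in the abstract

    incidence_rate = patients_with_ecmo_ai / total_ecmo_days_at_risk * 1000
    print(f"Incidence rate: {incidence_rate:.1f} ECMO-AI per 1000 ECMO days at risk")
    # -> roughly 27 per 1000 ECMO days, consistent with the reported value
    ```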