
    Enteric bacterial pathogen detection in southern sea otters (Enhydra lutris nereis) is associated with coastal urbanization and freshwater runoff

    Although protected for nearly a century, California's sea otters have been slow to recover, in part due to exposure to fecally associated protozoal pathogens like Toxoplasma gondii and Sarcocystis neurona. However, potential impacts from exposure to fecal bacteria have not been systematically explored. Using selective media, we examined feces from live and dead sea otters from California for specific enteric bacterial pathogens (Campylobacter, Salmonella, Clostridium perfringens, C. difficile and Escherichia coli O157:H7), and for pathogens endemic to the marine environment (Vibrio cholerae, V. parahaemolyticus and Plesiomonas shigelloides). We evaluated statistical associations between detection of these pathogens in otter feces and demographic or environmental risk factors for otter exposure, and found that dead otters were more likely to test positive for C. perfringens, Campylobacter and V. parahaemolyticus than were live otters. Otters from more urbanized coastlines and areas with high freshwater runoff (near outflows of rivers or streams) were more likely to test positive for one or more of these bacterial pathogens. Other risk factors for bacterial detection included male sex and sampling during the rainy season, when surface runoff is maximal. Similar risk factors were reported in prior studies of pathogen exposure for California otters and their invertebrate prey, suggesting that land-sea transfer and/or facilitation of pathogen survival in degraded coastal marine habitat may be impeding sea otter recovery. Because otters and humans share many of the same foods, our findings may also have implications for human health.
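
    The association analysis described above amounts to modeling the odds of pathogen detection as a function of binary risk factors. A minimal sketch of such a logistic regression in Python, using synthetic data and hypothetical column names (the study's actual dataset and model specification are assumptions here):

```python
# Sketch of a presence/absence risk-factor analysis; all data are synthetic
# and column names hypothetical, loosely mimicking the reported risk pattern.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
dead = rng.integers(0, 2, n)          # carcass vs live-sampled animal
male = rng.integers(0, 2, n)
urban_coast = rng.integers(0, 2, n)   # sampled near urbanized coastline
rainy_season = rng.integers(0, 2, n)  # sampled during the rainy season

# Simulate detection with higher odds for each risk factor present.
logit_p = -1.5 + 0.9 * dead + 0.5 * male + 0.8 * urban_coast + 0.6 * rainy_season
detected = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

otters = pd.DataFrame({"detected": detected, "dead": dead, "male": male,
                       "urban_coast": urban_coast, "rainy_season": rainy_season})

model = smf.logit("detected ~ dead + male + urban_coast + rainy_season",
                  data=otters).fit(disp=False)
print(np.exp(model.params))      # odds ratios per risk factor
print(np.exp(model.conf_int()))  # 95% confidence intervals
```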

    Presynaptic External Calcium Signaling Involves the Calcium-Sensing Receptor in Neocortical Nerve Terminals

    Nerve terminal invasion by an axonal spike activates voltage-gated channels, triggering calcium entry, vesicle fusion, and release of neurotransmitter. Ion channels activated at the terminal shape the presynaptic spike and so regulate the magnitude and duration of calcium entry. Consequently, characterization of the functional properties of ion channels at nerve terminals is crucial to understanding the regulation of transmitter release. Direct recordings from small neocortical nerve terminals have revealed that external [Ca(2+)] ([Ca(2+)](o)) indirectly regulates a non-selective cation channel (NSCC) in these terminals via an unknown [Ca(2+)](o) sensor. Here, we identify the first component in this presynaptic calcium signaling pathway. By combining genetic and pharmacological approaches with direct patch-clamp recordings from small, acutely isolated neocortical nerve terminals, we identify the extracellular calcium sensor. Our results show that the calcium-sensing receptor (CaSR), a previously identified G-protein coupled receptor that is the mainstay of serum calcium homeostasis, is the extracellular calcium sensor in these acutely dissociated nerve terminals. NSCC currents from reduced-function mutant CaSR mice were less sensitive to changes in [Ca(2+)](o) than those from wild-type mice. Calindol, an allosteric CaSR agonist, reduced NSCC currents in direct terminal recordings in a dose-dependent and reversible manner. In contrast, glutamate and GABA did not affect the NSCC currents. Our experiments identify CaSR as the first component in the [Ca(2+)](o) sensor-NSCC signaling pathway in neocortical terminals. Decreases in [Ca(2+)](o) will depress synaptic transmission because of the exquisite sensitivity of transmitter release to [Ca(2+)](o) following its entry via voltage-activated Ca(2+) channels. CaSR may detect such falls in [Ca(2+)](o) and increase action potential duration by increasing NSCC activity, thereby attenuating the impact of decreases in [Ca(2+)](o) on release probability. CaSR is thus positioned to detect dynamic changes in [Ca(2+)](o) and provide presynaptic feedback that alters brain excitability.
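
    Dose-dependent inhibition of this kind is conventionally summarized by fitting a Hill equation to the concentration series. A minimal sketch, with hypothetical Calindol concentrations and normalized NSCC currents (the paper's actual recordings and fitted parameters are not reproduced here):

```python
# Sketch of a dose-response (Hill) fit; concentrations and currents are
# hypothetical illustrations, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def hill_inhibition(conc, ic50, n_hill):
    """Fraction of NSCC current remaining at agonist concentration conc."""
    return 1.0 / (1.0 + (conc / ic50) ** n_hill)

conc = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0])         # uM, hypothetical
current = np.array([0.97, 0.90, 0.72, 0.45, 0.20, 0.08])  # normalized current

(ic50, n_hill), _ = curve_fit(hill_inhibition, conc, current, p0=[1.0, 1.0])
print(f"IC50 ~ {ic50:.2f} uM, Hill coefficient ~ {n_hill:.2f}")
```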

    Genetic Causes of Cardiomyopathy in Children: First Results From the Pediatric Cardiomyopathy Genes Study

    Pediatric cardiomyopathy is a genetically heterogeneous disease with substantial morbidity and mortality. Current guidelines recommend genetic testing in children with hypertrophic, dilated, or restrictive cardiomyopathy, but practice variations exist. Robust data on clinical testing practices and diagnostic yield in children are lacking. This study aimed to identify the genetic causes of cardiomyopathy in children and to investigate clinical genetic testing practices. Methods and Results: Children with familial or idiopathic cardiomyopathy were enrolled from 14 institutions in North America. Probands underwent exome sequencing. Rare sequence variants in 37 known cardiomyopathy genes were assessed for pathogenicity using consensus clinical interpretation guidelines. Of the 152 enrolled probands, 41% had a family history of cardiomyopathy. Of the 81 (53%) who had undergone clinical genetic testing for cardiomyopathy before enrollment, 39 (48%) had a positive result. Genetic testing rates varied from 0% to 97% between sites. A positive family history and hypertrophic cardiomyopathy subtype were associated with increased likelihood of genetic testing (P=0.005 and P=0.03, respectively). A molecular cause was identified in an additional 21% of the 63 children who had not undergone clinical testing, with positive results in both familial and idiopathic cases and across all phenotypic subtypes. Conclusions: A definitive molecular genetic diagnosis can be made in a substantial proportion of children for whom the cause and heritable nature of their cardiomyopathy were previously unknown. Practice variations in genetic testing are substantial and should be reduced. Improvements can be made in comprehensive cardiac screening and predictive genetic testing of first-degree relatives. Overall, our results support routine genetic testing in both familial and idiopathic cardiomyopathy.
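
    The variant-assessment step described above, restricting exome calls to a known-gene panel and keeping only rare variants before clinical interpretation, can be sketched as a simple filter. The gene names, variant records, and frequency cutoff below are hypothetical; the study's 37-gene panel is not listed in the abstract:

```python
# Sketch of panel-and-rarity filtering before consensus (ACMG-style)
# pathogenicity review. Panel subset, records, and the 1e-4 cutoff are
# illustrative assumptions only.
PANEL = {"MYH7", "MYBPC3", "TNNT2", "TNNI3"}  # tiny subset of a 37-gene panel

variants = [
    {"gene": "MYH7",   "hgvs": "c.1208G>A",   "popmax_af": 0.00001},
    {"gene": "BRCA1",  "hgvs": "c.68_69del",  "popmax_af": 0.0003},  # off-panel
    {"gene": "MYBPC3", "hgvs": "c.3330+2T>G", "popmax_af": 0.0},
]

candidates = [v for v in variants
              if v["gene"] in PANEL and v["popmax_af"] < 1e-4]
print(candidates)  # these would proceed to clinical interpretation
```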

    Priority research needs to inform amphibian conservation in the Anthropocene

    The problem of global amphibian declines has prompted extensive research over the last three decades. Initially, the focus was on identifying and characterizing the extent of the problem, but more recently efforts have shifted to evidence-based research designed to identify the best solutions and improve conservation outcomes. Despite the extensive accumulation of knowledge on amphibian declines, knowledge gaps and disconnects between science and action remain that hamper our ability to advance conservation efforts. Using input from participants at the ninth World Congress of Herpetology, a U.S. Geological Survey Powell Center symposium, online amphibian discussion forums, the International Union for Conservation of Nature Assisted Reproductive Technologies and Gamete Biobanking group, and respondents to a survey, we developed a list of 25 priority research questions for amphibian conservation at this stage of the Anthropocene. We identified amphibian conservation research priorities while accounting for expected tradeoffs in geographic scope, costs, and the taxonomic breadth of research needs. We aimed to solicit views from individuals rather than organizations, while acknowledging inequities in participation. Emerging research priorities (i.e., those under-represented in recently published amphibian conservation literature) included the effects of climate change, community-level (rather than single-species) drivers of declines, methodological improvements for research and monitoring, genomics, and the effects of land-use change. Improved inclusion of under-represented members of the amphibian conservation community was also identified as a priority. These research needs represent critical knowledge gaps for amphibian conservation, although filling them may not be necessary for many conservation actions.

    Para-infectious brain injury in COVID-19 persists at follow-up despite attenuated cytokine and autoantibody responses

    To better understand the neurological complications of COVID-19, both acutely and during recovery, we measured markers of brain injury, inflammatory mediators, and autoantibodies in 203 hospitalised participants: 111 with acute sera (1–11 days post-admission) and 92 with convalescent sera (56 with COVID-19-associated neurological diagnoses). Here we show that, compared to 60 uninfected controls, tTau, GFAP, NfL, and UCH-L1 are increased with COVID-19 infection at acute timepoints, and NfL and GFAP are significantly higher in participants with neurological complications. Inflammatory mediators (IL-6, IL-12p40, HGF, M-CSF, CCL2, and IL-1RA) are associated with both altered consciousness and markers of brain injury. Autoantibodies are more common in COVID-19 than in controls, and some (including those against MYL7, UCH-L1, and GRIN3B) are more frequent with altered consciousness. Additionally, convalescent participants with neurological complications show elevated GFAP and NfL, unrelated to the attenuated systemic inflammatory mediator and autoantibody responses. Overall, neurological complications of COVID-19 are associated with evidence of neuroglial injury in both acute and late disease, and these changes correlate acutely with dysregulated innate and adaptive immune responses.
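
    Group-wise biomarker comparisons like those described (e.g., NfL in participants with versus without neurological complications) are typically made with a nonparametric test, since serum marker levels are right-skewed. A minimal sketch on synthetic values (the study's measurements and statistical methods are not reproduced here):

```python
# Sketch of a two-group biomarker comparison; all values are synthetic.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
nfl_controls = rng.lognormal(mean=2.0, sigma=0.4, size=60)  # pg/mL, synthetic
nfl_neuro = rng.lognormal(mean=2.6, sigma=0.5, size=56)     # synthetic cases

stat, p = mannwhitneyu(nfl_neuro, nfl_controls, alternative="greater")
print(f"Mann-Whitney U = {stat:.0f}, one-sided p = {p:.2g}")
```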

    Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial

    Background: Emergency abdominal surgery is associated with poor patient outcomes. We studied the effectiveness of a national quality improvement (QI) programme to implement a care pathway to improve survival for these patients. Methods: We did a stepped-wedge cluster-randomised trial of patients aged 40 years or older undergoing emergency open major abdominal surgery. Eligible UK National Health Service (NHS) hospitals (those that had an emergency general surgical service, a substantial volume of emergency abdominal surgery cases, and contributed data to the National Emergency Laparotomy Audit) were organised into 15 geographical clusters and commenced the QI programme in a random order, based on a computer-generated random sequence, over an 85-week period, with one geographical cluster commencing the intervention every 5 weeks from the second to the 16th time period. Patients were masked to the study group, but it was not possible to mask hospital staff or investigators. The primary outcome measure was mortality within 90 days of surgery. Analyses were done on an intention-to-treat basis. This study is registered with the ISRCTN registry, number ISRCTN80682973. Findings: Treatment took place between March 3, 2014, and Oct 19, 2015. 22 754 patients were assessed for eligibility. Of 15 873 eligible patients from 93 NHS hospitals, primary outcome data were analysed for 8482 patients in the usual care group and 7374 in the QI group. Eight patients in the usual care group and nine in the QI group were excluded from the analysis because of missing primary outcome data. The primary outcome of 90-day mortality occurred in 1210 (16%) patients in the QI group compared with 1393 (16%) patients in the usual care group (HR 1·11, 95% CI 0·96–1·28). Interpretation: No survival benefit was observed from this QI programme to implement a care pathway for patients undergoing emergency abdominal surgery. Future QI programmes should ensure that teams have both the time and resources needed to improve patient care. Funding: National Institute for Health Research Health Services and Delivery Research Programme.
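
    The stepped-wedge design described above has a simple mechanical core: every cluster eventually receives the intervention, but the crossover times are randomized. A minimal sketch of generating such an allocation schedule, where details beyond the abstract (seed, exact week offsets) are assumptions purely for illustration:

```python
# Sketch of a stepped-wedge allocation: 15 clusters cross from usual care
# to the QI intervention in random order, one every 5 weeks. The seed and
# week offsets are illustrative assumptions, not the trial's sequence.
import random

N_CLUSTERS = 15
STEP_WEEKS = 5
FIRST_CROSSOVER_WEEK = 5  # first cluster starts at the second time period

rng = random.Random(2014)
order = list(range(1, N_CLUSTERS + 1))
rng.shuffle(order)  # computer-generated random crossover order

for step, cluster in enumerate(order):
    week = FIRST_CROSSOVER_WEEK + step * STEP_WEEKS
    print(f"cluster {cluster:2d} commences QI programme in week {week}")
```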
