21 research outputs found

    Optimising Large Animal Models of Sustained Atrial Fibrillation: Relevance of the Critical Mass Hypothesis

    Get PDF
    From Frontiers via Jisc Publications Router. History: collection 2021; received 2021-04-04; accepted 2021-05-24; epub 2021-06-15. Publication status: Published.
    Background: Large animal models play an important role in our understanding of the pathophysiology of atrial fibrillation (AF). Our aim was to determine whether prospectively collected baseline variables could predict the development of sustained AF in sheep, thereby reducing the number of animals required in future studies. Our hypothesis was that the relationship between atrial dimensions, refractory periods and conduction velocity (otherwise known as the critical mass hypothesis) could be used for the first time to predict the development of sustained AF.
    Methods: Healthy adult Welsh mountain sheep underwent a baseline electrophysiology study followed by implantation of a neurostimulator connected via an endocardial pacing lead to the right atrial appendage. The device was programmed to deliver intermittent 50 Hz bursts of 30 s duration over an 8-week period whilst sheep were monitored for AF.
    Results: Eighteen sheep completed the protocol, of which 28% developed sustained AF. Logistic regression analysis showed that only fibrillation number (calculated using the critical mass hypothesis as the left atrial diameter divided by the product of atrial conduction velocity and effective refractory period) was associated with an increased likelihood of developing sustained AF (Ln Odds Ratio 26.1 [95% confidence intervals 0.2–52.0], p = 0.048). A receiver-operating characteristic curve showed this could be used to predict which sheep developed sustained AF (C-statistic 0.82 [95% confidence intervals 0.59–1.04], p = 0.04).
    Conclusion: The critical mass hypothesis can be used to predict sustained AF in a tachypaced ovine model. These findings can be used to optimise the design of future studies involving large animals.
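The fibrillation number described above is a simple ratio; a minimal sketch of the calculation, with illustrative (assumed) ovine values rather than data from the study:

```python
def fibrillation_number(la_diameter_mm, conduction_velocity_mm_per_ms,
                        effective_refractory_period_ms):
    """Left atrial diameter divided by the wavelength, i.e. the product
    of atrial conduction velocity and effective refractory period."""
    wavelength_mm = conduction_velocity_mm_per_ms * effective_refractory_period_ms
    return la_diameter_mm / wavelength_mm

# Example with plausible values assumed for illustration:
# LA diameter 40 mm, CV 1.0 mm/ms, ERP 200 ms -> wavelength 200 mm
print(fibrillation_number(40.0, 1.0, 200.0))  # 0.2
```

Because the millimetres cancel, the result is dimensionless: larger atria, slower conduction or shorter refractory periods all raise the number, consistent with the critical mass hypothesis.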

    The population genomic legacy of the second plague pandemic

    Get PDF
    Human populations have been shaped by catastrophes that may have left long-lasting signatures in their genomes. One notable example is the second plague pandemic that entered Europe in ca. 1,347 CE and repeatedly returned for over 300 years, with typical village and town mortality estimated at 10%–40%.1 It is assumed that this high mortality affected the gene pools of these populations. First, local population crashes reduced genetic diversity. Second, a change in frequency is expected for sequence variants that may have affected survival or susceptibility to the etiologic agent (Yersinia pestis).2 Third, mass mortality might alter the local gene pools through its impact on subsequent migration patterns. We explored these factors using the Norwegian city of Trondheim as a model, by sequencing 54 genomes spanning three time periods: (1) prior to the plague striking Trondheim in 1,349 CE, (2) the 17th–19th century, and (3) the present. We find that the pandemic period shaped the gene pool by reducing long distance immigration, in particular from the British Isles, and inducing a bottleneck that reduced genetic diversity. Although we also observe an excess of large FST values at multiple loci in the genome, these are shaped by reference biases introduced by mapping our relatively low genome coverage degraded DNA to the reference genome. This implies that attempts to detect selection using ancient DNA (aDNA) datasets that vary by read length and depth of sequencing coverage may be particularly challenging until methods have been developed to account for the impact of differential reference bias on test statistics.
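The FST scan mentioned above compares allele frequencies between temporal samples. A minimal sketch of a per-locus estimate, using Hudson's estimator (my choice for illustration; the study's exact estimator and values are not given here), with assumed allele frequencies and sample sizes:

```python
def hudson_fst(p1, p2, n1, n2):
    """Hudson's FST estimator for one biallelic locus.

    p1, p2: alternate-allele frequencies in the two samples
    n1, n2: number of sampled chromosomes in each sample
    """
    # Numerator: squared frequency difference, corrected for sampling noise
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    # Denominator: between-population heterozygosity
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Strongly differentiated locus (illustrative values): FST is large
print(hudson_fst(0.9, 0.1, 101, 101))
```

The abstract's point is that apparent outliers from such a scan can be inflated by differential reference bias, so a large per-locus value alone is not evidence of selection in low-coverage aDNA.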

    Exercise and diabetes: relevance and causes for response variability

    Get PDF

    Prospective, multicentre study of screening, investigation and management of hyponatraemia after subarachnoid haemorrhage in the UK and Ireland

    Get PDF
    Background: Hyponatraemia often occurs after subarachnoid haemorrhage (SAH). However, its clinical significance and optimal management are uncertain. We audited the screening, investigation and management of hyponatraemia after SAH. Methods: We prospectively identified consecutive patients with spontaneous SAH admitted to neurosurgical units in the United Kingdom or Ireland. We reviewed medical records daily from admission to discharge, 21 days or death and extracted all measurements of serum sodium to identify hyponatraemia (<135 mmol/L). Main outcomes were death/dependency at discharge or 21 days and admission duration >10 days. Associations of hyponatraemia with outcome were assessed using logistic regression with adjustment for predictors of outcome after SAH and admission duration. We assessed hyponatraemia-free survival using multivariable Cox regression. Results: 175/407 (43%) patients admitted to 24 neurosurgical units developed hyponatraemia. 5976 serum sodium measurements were made. Serum osmolality, urine osmolality and urine sodium were measured in 30/166 (18%) hyponatraemic patients with complete data. The most frequent target daily fluid intake was >3 L and this did not differ during hyponatraemic or non-hyponatraemic episodes. 26% (42/164) of patients with hyponatraemia received sodium supplementation. 133 (35%) patients were dead or dependent within the study period and 240 (68%) patients had hospital admission for over 10 days. In the multivariable analyses, hyponatraemia was associated with less dependency (adjusted OR (aOR)=0.35 (95% CI 0.17 to 0.69)) but longer admissions (aOR=3.2 (1.8 to 5.7)). World Federation of Neurosurgical Societies grade I–III, modified Fisher 2–4 and posterior circulation aneurysms were associated with greater hazards of hyponatraemia.
Conclusions: In this comprehensive, multicentre, prospective study with adjusted analyses of patients with SAH, hyponatraemia was investigated inconsistently and, for most patients, was not associated with changes in management or clinical outcome. This work establishes a basis for the development of evidence-based SAH-specific guidance for targeted screening, investigation and management of high-risk patients to minimise the impact of hyponatraemia on admission duration and to improve consistency of patient care.
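The screening step above reduces to flagging serum sodium values below the study's 135 mmol/L threshold. A minimal sketch, with example measurements assumed for illustration (not data from the audit):

```python
HYPONATRAEMIA_THRESHOLD = 135  # mmol/L, per the study definition

def hyponatraemic_indices(sodium_series):
    """Return the indices of serum sodium measurements below threshold."""
    return [i for i, na in enumerate(sodium_series)
            if na < HYPONATRAEMIA_THRESHOLD]

# Example serial measurements for one patient (assumed values, mmol/L)
measurements = [138, 136, 134, 131, 137]
print(hyponatraemic_indices(measurements))  # [2, 3]
```

In the study, a patient with any such flagged measurement counted as having developed hyponatraemia; grouping consecutive flagged indices would give the episodes over which management (fluid targets, sodium supplementation) was compared.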

    Synapses and Alzheimer's disease: effect of immunotherapy?

    No full text

    Calcium in the Pathophysiology of Atrial Fibrillation and Heart Failure

    Get PDF
    Atrial fibrillation (AF) is commonly associated with heart failure. A bidirectional relationship exists between the two—AF exacerbates heart failure causing a significant increase in heart failure symptoms, admissions to hospital and cardiovascular death, while pathological remodeling of the atria as a result of heart failure increases the risk of AF. A comprehensive understanding of the pathophysiology of AF is essential if we are to break this vicious circle. In this review, the latest evidence will be presented showing a fundamental role for calcium in both the induction and maintenance of AF. After outlining atrial electrophysiology and calcium handling, the role of calcium-dependent afterdepolarizations and atrial repolarization alternans in triggering AF will be considered. The atrial response to rapid stimulation will be discussed, including the short-term protection from calcium overload in the form of calcium signaling silencing and the eventual progression to diastolic calcium leak causing afterdepolarizations and the development of an electrical substrate that perpetuates AF. The role of calcium in the bidirectional relationship between heart failure and AF will then be covered. The effects of heart failure on atrial calcium handling that promote AF will be reviewed, including effects on both atrial myocytes and the pulmonary veins, before the aspects of AF which exacerbate heart failure are discussed. Finally, the limitations of human and animal studies will be explored, allowing contextualization of what are sometimes discordant results.