Spatiotemporal characteristics of atrial fibrillation electrograms: a novel marker for arrhythmia stability and termination
Background: Sequentially mapped complex fractionated atrial electrograms (CFAE) and dominant frequency (DF) sites have been targeted during catheter ablation for atrial fibrillation (AF). However, these strategies have yielded variable success and have not been shown to correlate consistently with AF dynamics. Here, we evaluated whether the spatiotemporal stability of CFAE and DF may be a better marker of AF sustenance and termination.
Methods: Eighteen sheep with 12 weeks of "one-kidney, one-clip" hypertension underwent open-chest studies. A total of 42 self-terminating (28–100 s) and 6 sustained (>15 min) AF episodes were mapped using a custom epicardial plaque and analyzed in 4-s epochs for CFAE, using the NavX CFE-m algorithm, and DF, using a Fast Fourier Transform. The spatiotemporal stability index (STSI) was calculated using the intraclass correlation coefficient of consecutive AF epochs.
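For readers who want a concrete picture of the signal-processing steps, the sketch below shows one plausible implementation: a dominant frequency for a 4-s epoch via FFT, and a spatiotemporal stability index computed as a one-way intraclass correlation coefficient, ICC(1,1), between per-electrode maps from consecutive epochs. The NavX CFE-m algorithm is proprietary and is not reproduced here; the sampling rate, the 3–15 Hz band, and the choice of ICC form are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np

def dominant_frequency(epoch, fs, band=(3.0, 15.0)):
    """Dominant frequency (Hz) of one 4-s electrogram epoch via FFT.

    band: physiological AF search band (an assumption; not given
    in the abstract). Returns the frequency of the largest peak.
    """
    spectrum = np.abs(np.fft.rfft(epoch * np.hanning(len(epoch))))
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spectrum[mask])]

def stsi(map_a, map_b):
    """Spatiotemporal stability index between two consecutive epochs.

    map_a, map_b: per-electrode values (DF or CFE-m) from consecutive
    4-s epochs. Implemented as a one-way ICC(1,1): electrodes are
    "subjects", the two epochs are "raters".
    """
    x = np.column_stack([map_a, map_b])           # electrodes x 2 epochs
    n, k = x.shape
    grand = x.mean()
    ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

An STSI near 1 indicates that the spatial pattern of fractionation or frequency is reproduced from one epoch to the next; values near 0 indicate spatiotemporal instability.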
Results: A total of 67,733 AF epochs were analyzed. During AF initiation, mean CFE-m and the STSI of CFE-m/DF were similar between sustained and self-terminating episodes, although median DF was higher in sustained AF (p=0.001). During sustained AF, the STSI of CFE-m increased significantly (p=0.02), whereas mean CFE-m (p=0.5), median DF (p=0.07), and the STSI of DF remained unchanged (p=0.5). Prior to AF termination, the STSI of CFE-m was significantly lower (p<0.001), with a physiologically non-significant decrease in median DF (−0.3 Hz, p=0.006) and no significant changes in mean CFE-m (p=0.14) or the STSI of DF (p=0.06).
Conclusions: Spatiotemporal stabilization of CFAE favors AF sustenance, and its destabilization heralds AF termination. The STSI of CFE-m is more representative of AF dynamics than the STSI of DF, sequential mean CFE-m, or median DF.
Genomic signatures of population decline in the malaria mosquito Anopheles gambiae
Population genomic features such as nucleotide diversity and linkage disequilibrium are expected to be strongly shaped by changes in population size, and might therefore be useful for monitoring the success of a control campaign. In the Kilifi district of Kenya, there has been a marked decline in the abundance of the malaria vector Anopheles gambiae following the rollout of insecticide-treated bed nets. To investigate whether this decline left a detectable population genomic signature, simulations were performed to compare the effect of population crashes on nucleotide diversity, Tajima's D, and linkage disequilibrium (as measured by the population recombination parameter ρ). Linkage disequilibrium and ρ were then estimated for An. gambiae from Kilifi and compared with values for Anopheles arabiensis and Anopheles merus at the same location, and for An. gambiae at a location 200 km from Kilifi. In the simulations, ρ changed more rapidly after a population crash than the other statistics did, and is therefore a more sensitive indicator of recent population decline. In the empirical data, linkage disequilibrium extends 100–1000 times further, and ρ is 100–1000 times smaller, for the Kilifi population of An. gambiae than for any of the other populations. Many of the individual An. gambiae mosquitoes from Kilifi also carried significant runs of homozygosity. These results support the hypothesis that the recent decline in An. gambiae was driven by the rollout of bed nets. Measuring population genomic parameters in a small sample of individuals before, during, and after vector or pest control may be a valuable method of tracking the effectiveness of interventions.
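As a rough illustration of two of the statistics compared in the simulations, the sketch below computes nucleotide diversity (π, as a sum of mean pairwise differences over segregating sites) and Tajima's D from per-site derived-allele counts, following Tajima (1989). Estimating ρ requires specialised composite-likelihood software (eg, LDhat-style methods) and is not attempted here; the input format is an assumption for illustration.

```python
import numpy as np

def pi_and_tajimas_d(derived_counts, n):
    """Nucleotide diversity and Tajima's D from derived-allele counts.

    derived_counts: derived-allele count at each segregating site
    (each between 1 and n-1); n: number of sampled chromosomes.
    Returns (pi, D), with pi summed over the region's sites.
    """
    counts = np.asarray(derived_counts, dtype=float)
    S = len(counts)                                # segregating sites
    # Mean pairwise differences at a site with count c: 2c(n-c)/(n(n-1)).
    pi = np.sum(2.0 * counts * (n - counts) / (n * (n - 1)))
    # Watterson's constants and Tajima's variance terms.
    i = np.arange(1, n)
    a1, a2 = np.sum(1.0 / i), np.sum(1.0 / i ** 2)
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1, e2 = c1 / a1, c2 / (a1 ** 2 + a2)
    d = (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))
    return pi, d
```

A strongly negative D following a bottleneck-then-recovery, or reduced π alone, can be ambiguous; the paper's point is that ρ responds faster to a crash than either statistic.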
Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
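The abstract does not describe the trial's computer-based allocation algorithm. The following is a generic permuted-block sketch of how a 1:1:1 assignment can be generated; the arm labels, block size, and seed are chosen purely for illustration and are not taken from the trial.

```python
import random

def permuted_block_randomisation(n_participants, arms, block_size=6, seed=1):
    """Generic permuted-block 1:1:1 allocation (illustrative only).

    arms: e.g. ["12-week", "10-week", "8-week"] for men.
    block_size must be a multiple of len(arms) so each block
    preserves the allocation ratio exactly.
    """
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = arms * (block_size // len(arms))   # one balanced block
        rng.shuffle(block)
        allocations.extend(block)
    return allocations[:n_participants]

# Example: allocate the first 12 male donors.
print(permuted_block_randomisation(12, ["12-week", "10-week", "8-week"]))
```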
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, each especially common among men), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than in the standard-frequency groups.
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
Phencyclidine (PCP)-Induced Disruption in Cognitive Performance is Gender-Specific and Associated with a Reduction in Brain-Derived Neurotrophic Factor (BDNF) in Specific Regions of the Female Rat Brain
Phencyclidine (PCP), used to mimic certain aspects of schizophrenia, induces sexually dimorphic cognitive deficits in rats. In this study, the effects of sub-chronic PCP on expression of brain-derived neurotrophic factor (BDNF), a neurotrophic factor implicated in the pathogenesis of schizophrenia, were evaluated in male and female rats. Male and female hooded-Lister rats received vehicle or PCP (n = 8 per group; 2 mg/kg i.p. twice daily for 7 days) and were tested in the attentional set-shifting task prior to being sacrificed (6 weeks post-treatment). Levels of BDNF mRNA were measured in specific brain regions using in situ hybridisation. Male rats were less sensitive than female rats to PCP-induced deficits in the extra-dimensional shift stage of the attentional set-shifting task. Quantitative analysis demonstrated reduced BDNF levels in the medial prefrontal cortex (p < 0.05), motor cortex (p < 0.01), orbital cortex (p < 0.01), olfactory bulb (p < 0.05), retrosplenial cortex (p < 0.001), frontal cortex (p < 0.01), parietal cortex (p < 0.01), the CA1 (p < 0.05) and polymorphic layer of the dentate gyrus (p < 0.05) of the hippocampus, and the central (p < 0.01), lateral (p < 0.05), and basolateral (p < 0.05) regions of the amygdaloid nucleus in female PCP-treated rats compared with controls. In contrast, BDNF was significantly reduced only in the orbital cortex and central amygdaloid region of male rats (p < 0.05). These results suggest that blockade of NMDA receptors by sub-chronic PCP administration has a long-lasting down-regulatory effect on BDNF mRNA expression in the female rat brain, which may underlie some of the behavioural deficits observed post-PCP administration.
Rare variant burden analysis within enhancers identifies CAV1 as an ALS risk gene
Amyotrophic lateral sclerosis (ALS) is an incurable neurodegenerative disease. CAV1 and CAV2 organize membrane lipid rafts (MLRs) important for cell signaling and neuronal survival, and overexpression of CAV1 ameliorates ALS phenotypes in vivo. Genome-wide association studies localize a large proportion of ALS risk variants within the non-coding genome, but further characterization has been limited by lack of appropriate tools. By designing and applying a pipeline to identify pathogenic genetic variation within enhancer elements responsible for regulating gene expression, we identify disease-associated variation within CAV1/CAV2 enhancers, which replicates in an independent cohort. Discovered enhancer mutations reduce CAV1/CAV2 expression and disrupt MLRs in patient-derived cells, and CRISPR-Cas9 perturbation proximate to a patient mutation is sufficient to reduce CAV1/CAV2 expression in neurons. Additional enrichment of ALS-associated mutations within CAV1 exons positions CAV1 as an ALS risk gene. We propose CAV1/CAV2 overexpression as a personalized medicine target for ALS.
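The enhancer burden pipeline itself is not detailed in the abstract. As a hedged illustration of the general idea behind a rare variant burden analysis, the sketch below collapses rare variants within one enhancer into per-individual carrier status and compares cases with controls by Fisher's exact test; the function name and input format are assumptions, and the study's actual method is more elaborate (variant filtering, enhancer annotation, replication).

```python
import numpy as np
from scipy.stats import fisher_exact

def enhancer_burden_test(genotypes_cases, genotypes_controls):
    """Collapsing burden test for one enhancer region (illustrative).

    genotypes_*: arrays of shape (individuals, rare_variant_sites)
    with 0/1/2 alternate-allele counts, already restricted to rare
    variants falling within the enhancer.
    """
    # Collapse: an individual is a carrier if they harbour any
    # rare allele anywhere in the region.
    carriers_case = int(np.sum(genotypes_cases.sum(axis=1) > 0))
    carriers_ctrl = int(np.sum(genotypes_controls.sum(axis=1) > 0))
    table = [
        [carriers_case, len(genotypes_cases) - carriers_case],
        [carriers_ctrl, len(genotypes_controls) - carriers_ctrl],
    ]
    odds_ratio, p_value = fisher_exact(table)
    return odds_ratio, p_value
```

Collapsing trades per-variant power for aggregate power, which is what makes individually rare enhancer variants testable at the gene or element level.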
Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors
Background:
The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments.
Methods:
The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed.
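The "per week shorter inter-donation interval" estimates reported in the Findings are linear trends across the three arms. As a simplified illustration of that coding, the sketch below fits an unadjusted ordinary least squares trend with arms coded as weeks shorter than the standard interval; the trial's prespecified analysis will have differed in adjustment and in handling of follow-up time.

```python
import numpy as np

def units_per_week_shorter(weeks_shorter, units_per_year):
    """Unadjusted linear trend in blood collected per extra week of
    interval shortening (illustrative OLS only).

    weeks_shorter: per-donor arm coding, e.g. 0, 2, 4 for the male
    12-, 10-, and 8-week arms relative to the 12-week standard.
    units_per_year: observed units collected per donor per year.
    """
    slope, _intercept = np.polyfit(weeks_shorter, units_per_year, deg=1)
    return slope  # units per year, per week shorter interval
```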
Findings:
Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] men, 9914 [51%] women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), and lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [−0·59 to −0·31] in women) and ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [−6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals) other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001).
Interpretation:
During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores.
Funding:
NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
The burden of unintentional drowning: global, regional and national estimates of mortality from the Global Burden of Disease 2017 Study
Background: Drowning is a leading cause of injury-related mortality globally. Unintentional drowning (International Classification of Diseases (ICD) 10 codes W65-74 and ICD9 E910) is one of the 30 mutually exclusive and collectively exhaustive causes of injury-related mortality in the Global Burden of Disease (GBD) study. This study's objective is to describe unintentional drowning using GBD estimates from 1990 to 2017.
Methods: Cause-specific mortality and years of life lost (YLLs) from unintentional drowning were estimated in GBD 2017 by age, sex, country, region, and Socio-demographic Index (SDI) quintile, with trends from 1990 to 2017. GBD 2017 used standard GBD methods for estimating mortality from drowning.
Results: Globally, unintentional drowning mortality decreased by 44.5% between 1990 and 2017, from 531 956 (uncertainty interval (UI): 484 107 to 572 854) to 295 210 (284 493 to 306 187) deaths. Global age-standardised mortality rates decreased by 57.4%, from 9.3 (8.5 to 10.0) per 100 000 per annum in 1990 to 4.0 (3.8 to 4.1) in 2017. Unintentional drowning-associated mortality was generally higher in children, males, and low-SDI to middle-SDI countries. China, India, Pakistan, and Bangladesh accounted for 51.2% of all drowning deaths in 2017. Oceania was the region with the highest rate of age-standardised YLLs in 2017, with 45 434 (40 850 to 50 539) YLLs per 100 000 across both sexes.
Conclusions: Global drowning rates have declined, but this study shows that the decline was not consistent across countries. The results reinforce the need for continued and improved policy, prevention, and research efforts, with a focus on low- and middle-income countries.
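As a simplified sketch of the idea behind the age-standardised rates quoted above, the function below applies direct standardisation: age-specific rates are weighted by a standard population. The GBD estimation pipeline is far more involved (modelled cause-specific mortality with propagated uncertainty); the input arrays and weights here are illustrative.

```python
import numpy as np

def age_standardised_rate(deaths, person_years, std_weights):
    """Directly age-standardised mortality rate per 100 000 per annum.

    deaths, person_years: deaths and person-years per age group.
    std_weights: standard-population weights per age group (eg, the
    GBD world standard population); normalised here to sum to 1.
    """
    age_specific = np.asarray(deaths, float) / np.asarray(person_years, float)
    weights = np.asarray(std_weights, float)
    weights = weights / weights.sum()
    return 100_000 * np.sum(age_specific * weights)
```

Standardising to a common age structure is what makes the 1990 and 2017 rates, and rates across countries with very different demographics, comparable.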