
    Variant location is a novel risk factor for individuals with arrhythmogenic cardiomyopathy due to a desmoplakin (DSP) truncating variant.

    BACKGROUND: Truncating variants in desmoplakin (DSPtv) are an important cause of arrhythmogenic cardiomyopathy; however, the genetic architecture and genotype-specific risk factors are incompletely understood. We evaluated the phenotype, risk factors for ventricular arrhythmias, and underlying genetics of DSPtv cardiomyopathy. METHODS: Individuals with DSPtv and any cardiac phenotype, and their gene-positive family members, were included from multiple international centers. Clinical data and family history information were collected. Event-free survival from ventricular arrhythmia was assessed. Variant location was compared between cases and controls, and a literature review of reported DSPtv was performed. RESULTS: There were 98 probands and 72 family members (mean age at diagnosis 43±8 years; 59% women) with a DSPtv, of whom 146 were considered clinically affected. Ventricular arrhythmia (sudden cardiac arrest, sustained ventricular tachycardia, or appropriate implantable cardioverter-defibrillator therapy) occurred in 56 (33%) individuals. DSPtv location and proband status were independent risk factors for ventricular arrhythmia. Furthermore, gene region was important: variants in cases (cohort n=98; ClinVar n=167) were more likely than the n=124 Genome Aggregation Database control variants to occur in the regions resulting in nonsense-mediated decay of both major DSP isoforms (148 [83.6%] versus 29 [16.4%]; P<0.0001). CONCLUSIONS: In the largest series of individuals with DSPtv to date, we demonstrate that variant location is a novel risk factor for ventricular arrhythmia, can inform variant interpretation, and provides critical insights to allow for precision-based clinical management.

    Rebound Discharge in Deep Cerebellar Nuclear Neurons In Vitro

    Neurons of the deep cerebellar nuclei (DCN) play a critical role in defining the output of the cerebellum in the course of encoding Purkinje cell inhibitory inputs. The earliest work performed with in vitro preparations established that DCN cells have the capacity to translate membrane hyperpolarizations into a rebound increase in firing frequency. The primary means of distinguishing between DCN neurons has been according to cell size and transmitter phenotype, but in some cases, differences in the firing properties of DCN cells maintained in vitro have been reported. In particular, it was shown that large-diameter cells in the rat DCN exhibit two phenotypes of rebound discharge in vitro that may eventually help define their functional roles in cerebellar output. A transient burst and a weak burst phenotype can be distinguished based on the frequency and pattern of rebound discharge immediately following a hyperpolarizing stimulus. Work to date indicates that the difference in excitability arises at least in part from the degree of activation of T-type Ca2+ current during the immediate phase of rebound firing and of the Ca2+-dependent K+ channels that underlie afterhyperpolarizations. Both phenotypes can be detected following stimulation of Purkinje cell inhibitory inputs under conditions that preserve resting membrane potential and natural ionic gradients. In this paper, we review the evidence supporting the existence of different rebound phenotypes in DCN cells and the ion channel expression patterns that underlie their generation.

    Longitudinal changes in telomere length and associated genetic parameters in dairy cattle analysed using random regression models

    Telomeres cap the ends of linear chromosomes and shorten with age in many organisms. In humans, short telomeres have been linked to morbidity and mortality. With the accumulation of longitudinal datasets, the focus shifts from investigating telomere length (TL) to exploring TL change within individuals over time. Some studies indicate that the speed of telomere attrition is predictive of future disease. The objectives of the present study were to 1) characterize the change in bovine relative leukocyte TL (RLTL) across the lifetime in Holstein Friesian dairy cattle, 2) estimate genetic parameters of RLTL over time, and 3) investigate the association of differences in individual RLTL profiles with productive lifespan. RLTL measurements were analysed using Legendre polynomials in a random regression model to describe TL profiles and genetic variance over age. The analyses were based on 1,328 repeated RLTL measurements of 308 female Holstein Friesian dairy cattle. A quadratic Legendre polynomial was fitted to the fixed effect of age in months and to the random effect of the animal identity. Changes in RLTL, heritability and within-trait genetic correlation along the age trajectory were calculated and illustrated. At the population level, the relationship between RLTL and age was described by a positive quadratic function. Individuals varied significantly in the direction and amount of RLTL change over life. The heritability of RLTL ranged from 0.36 to 0.47 (SE = 0.05–0.08) and remained statistically unchanged over time. The genetic correlation of RLTL at birth with measurements later in life decreased with the time interval between samplings from near unity to 0.69, indicating that TL later in life might be regulated by different genes than TL early in life. Even though animals differed significantly in their RLTL profiles, those differences were not correlated with productive lifespan (p = 0.954).
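    The modelling idea above — evaluating a quadratic Legendre polynomial over a standardized age scale — can be sketched as follows. This is a minimal illustration of the fixed-curve part only, not the authors' full random regression model, and the coefficient values and age range are hypothetical.

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    def standardize_age(age, age_min, age_max):
        """Map age onto [-1, 1], the domain on which Legendre polynomials are defined."""
        return 2.0 * (age - age_min) / (age_max - age_min) - 1.0

    def trajectory(age, coefs, age_min=0.0, age_max=120.0):
        """Evaluate a quadratic Legendre curve: c0*P0 + c1*P1 + c2*P2 over standardized age.

        In a random regression model, both the fixed age effect and each
        animal's random deviation take this same functional form.
        """
        return legendre.legval(standardize_age(age, age_min, age_max), coefs)

    # Hypothetical coefficients for the intercept, linear and quadratic terms.
    fixed = np.array([1.0, 0.05, 0.08])
    ages = np.array([0.0, 60.0, 120.0])  # months
    print(trajectory(ages, fixed))
    ```

    A positive quadratic coefficient, as here, gives the upward-opening population curve described in the abstract; per-animal coefficient vectors would shift and tilt this curve to capture individual TL-change profiles.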

    Detection of Critical Events in Renewable Energy Production Time Series

    The introduction of more renewable energy sources into the energy system increases the variability and weather dependence of electricity generation. Power system simulations are used to assess the adequacy and reliability of the electricity grid over decades, but often become computationally intractable for such long simulation periods with high technical detail. To alleviate this computational burden, we investigate the use of outlier detection algorithms to find periods of extreme renewable energy generation, enabling detailed modelling of the performance of power systems under these circumstances. Specifically, we apply the Maximum Divergent Intervals (MDI) algorithm to power generation time series derived from the ERA5 historical climate reanalysis covering the period from 1950 through 2019. By applying the MDI algorithm to these time series, we identified intervals of extremely low and high energy production. To determine the outlierness of an interval, different divergence measures can be used: whereas the cross-entropy measure results in shorter, strongly peaking outliers, the unbiased Kullback-Leibler divergence tends to detect longer, more persistent intervals. These intervals are regarded as potential risks for the electricity grid by domain experts, showcasing the capability of the MDI algorithm to detect critical events in these time series. For the historical period analysed, we found no trend in outlier intensity, nor any shift or lengthening of the outliers, that could be attributed to climate change. By applying MDI to climate model output, power system modellers can investigate the adequacy of, and possible changes of risk for, the current and future electricity grid under a wider range of scenarios.
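    The interval-scoring idea can be sketched as follows: score each candidate interval by how much the distribution of values inside it diverges from the distribution of the whole series, then keep the highest-scoring intervals. This is a minimal toy illustration of divergence-based interval scoring, not the authors' MDI implementation; the function names, smoothing choice, and synthetic series are all hypothetical.

    ```python
    import numpy as np

    def interval_divergence(series, start, length, bins=20):
        """KL divergence between the value histogram inside an interval
        and the histogram of the whole series (shared bin edges)."""
        edges = np.histogram_bin_edges(series, bins=bins)
        p, _ = np.histogram(series[start:start + length], bins=edges)
        q, _ = np.histogram(series, bins=edges)
        # Add-one smoothing so both histograms have full support.
        p = (p + 1) / (p + 1).sum()
        q = (q + 1) / (q + 1).sum()
        return float(np.sum(p * np.log(p / q)))

    def best_interval(series, length, bins=20):
        """Start index of the most divergent interval of a given length."""
        scores = [interval_divergence(series, s, length, bins)
                  for s in range(len(series) - length + 1)]
        return int(np.argmax(scores))

    # Toy example: a year of roughly flat generation with one depressed window.
    rng = np.random.default_rng(0)
    x = rng.normal(1.0, 0.1, 365)
    x[100:110] -= 0.8  # ten days of unusually low production
    print(best_interval(x, 10))  # start of the most divergent 10-day window
    ```

    Swapping the divergence function (e.g. cross-entropy instead of KL) changes which intervals rank highest, which is the trade-off the abstract describes between short, peaked outliers and longer, persistent ones.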

    Polyelectrolyte Restorative Materials
