
    A prospective, observational, multicenter cohort study

    Background Graft-derived cell-free DNA (GcfDNA), which is released into the bloodstream by necrotic and apoptotic cells, is a promising noninvasive organ integrity biomarker. In liver transplantation (LTx), neither conventional liver function tests (LFTs) nor immunosuppressive drug monitoring is very effective for rejection monitoring. We therefore hypothesized that the quantitative measurement of donor-derived cell-free DNA (cfDNA) would have independent value for the assessment of graft integrity, including damage from acute rejection.
    Methods and findings Traditional LFTs were performed and plasma GcfDNA was monitored in 115 adults post-LTx at three German transplant centers as part of a prospective, observational, multicenter cohort trial. GcfDNA percentage (graft cfDNA/total cfDNA) was measured using droplet digital PCR (ddPCR), based on a limited number of predefined single nucleotide polymorphisms, enabling same-day turnaround. The same method was used to quantify blood microchimerism. GcfDNA was increased >50% on day 1 post-LTx, presumably from ischemia/reperfusion damage, but in patients without graft injury it rapidly declined within 7 to 10 d to a median <10%, where it remained for the 1-y observation period. Of 115 patients, 107 provided samples that met preestablished criteria. In 31 samples taken from 17 patients during biopsy-proven acute rejection episodes, the percentage of GcfDNA was elevated substantially (median 29.6%, 95% CI 23.6%–41.0%) compared with that in 282 samples from 88 patients during stable periods (median 3.3%, 95% CI 2.9%–3.7%; p < 0.001). Only slightly higher values (median 5.9%, 95% CI 4.4%–10.3%) were found in 68 samples from 17 hepatitis C virus (HCV)–positive, rejection-free patients. LFTs had low overall correlations with GcfDNA (r = 0.28–0.62) and showed greater overlap between patient subgroups, especially between acute rejection and HCV+ patients. Multivariable logistic regression modeling demonstrated that GcfDNA provided additional, LFT-independent information on graft integrity. Diagnostic sensitivity and specificity were 90.3% (95% CI 74.2%–98.0%) and 92.9% (95% CI 89.3%–95.6%), respectively, for GcfDNA at a threshold value of 10%. The area under the receiver operating characteristic curve was higher for GcfDNA (97.1%, 95% CI 93.4%–100%) than for same-day conventional LFTs (AST: 95.7%; ALT: 95.2%; γ-GT: 94.5%; bilirubin: 82.6%). An evaluation of microchimerism revealed that the maximum donor DNA in circulating white blood cells was only 0.068%. GcfDNA percentage can be influenced by major changes in host cfDNA (e.g., due to leukopenia or leukocytosis). One limitation of our study is that exactly time-matched GcfDNA and LFT samples were not available for all patient visits.
    Conclusions In this study, determination of GcfDNA in plasma by ddPCR allowed earlier and more sensitive discrimination of acute rejection in LTx patients than conventional LFTs. Potential blood microchimerism was quantitatively low and had no significant influence on GcfDNA values. Further research, which should ideally include protocol biopsies, will be needed to establish the practical value of GcfDNA measurements in the management of LTx patients.
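    The core readout of this study is a simple ratio and threshold rule, which can be sketched as follows. This is a minimal illustration using the abstract's reported 10% cutoff; the function names and the copy numbers in the examples are hypothetical, not from the study data.

```python
# Sketch of the GcfDNA readout: graft-derived fraction of total cfDNA in
# percent, flagged against the 10% rejection threshold reported above.
# Function names and example copy numbers are illustrative only.

def gcfdna_percent(graft_copies: float, total_copies: float) -> float:
    """Graft-derived fraction of total cell-free DNA, in percent."""
    if total_copies <= 0:
        raise ValueError("total cfDNA copies must be positive")
    return 100.0 * graft_copies / total_copies

def flag_rejection(gcfdna_pct: float, threshold: float = 10.0) -> bool:
    """True if the GcfDNA percentage exceeds the rejection threshold."""
    return gcfdna_pct > threshold

# A value near the stable-period median of 3.3% stays below threshold:
print(flag_rejection(gcfdna_percent(33, 1000)))   # False
# A value near the acute-rejection median of 29.6% is flagged:
print(flag_rejection(gcfdna_percent(296, 1000)))  # True
```

    As the abstract notes, the percentage depends on both numerator and denominator, so large shifts in host cfDNA (leukopenia, leukocytosis) move the readout even when graft release is unchanged.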

    Lack of Prognostic Value of T-Wave Alternans for Implantable Cardioverter-Defibrillator Benefit in Primary Prevention

    BACKGROUND: New methods to identify patients who benefit from a primary prophylactic implantable cardioverter-defibrillator (ICD) are needed. T-wave alternans (TWA) has been shown to be associated with arrhythmogenesis of the heart and sudden cardiac death. We hypothesized that TWA might be associated with benefit from ICD implantation in primary prevention. METHODS AND RESULTS: In the EU-CERT-ICD (European Comparative Effectiveness Research to Assess the Use of Primary Prophylactic Implantable Cardioverter-Defibrillators) study, we prospectively enrolled 2327 candidates for primary prophylactic ICD. A 24-hour Holter monitor reading was taken from all recruited patients at enrollment. TWA was assessed from Holter monitoring using the modified moving average method. Study outcomes were all-cause death, appropriate shock, and survival benefit. TWA was assessed both as a continuous variable and as a dichotomized variable with cutoff points <47 μV and <60 μV. The final cohort included 1734 valid T-wave alternans samples, 1211 patients with ICD, and 523 control patients with conservative treatment, with a mean follow-up time of 2.3 years. TWA ≥60 μV was a predictor of higher all-cause death in patients with an ICD on the basis of a univariate Cox regression model (hazard ratio, 1.484 [95% CI, 1.024–2.151]; P=0.0374; concordance statistic, 0.51). In multivariable models, TWA was not prognostic of death or appropriate shocks in patients with an ICD. In addition, TWA was not prognostic of death in control patients. In a propensity score–adjusted Cox regression model, TWA was not a predictor of ICD benefit. CONCLUSIONS: T-wave alternans is poorly prognostic in patients with a primary prophylactic ICD. Although it may be prognostic of life-threatening arrhythmias and sudden cardiac death in several patient populations, it does not seem to be useful in assessing benefit from ICD therapy in primary prevention among patients with an ejection fraction of ≤35%.

    Loss of PTB or Negative Regulation of Notch mRNA Reveals Distinct Zones of Notch and Actin Protein Accumulation in Drosophila Embryo

    Polypyrimidine Tract Binding (PTB) protein is a regulator of mRNA processing and translation. Genetic screens and studies of wing and bristle development during the post-embryonic stages of Drosophila suggest that it is a negative regulator of the Notch pathway. How PTB regulates the Notch pathway is unknown. Our studies of Drosophila embryogenesis indicate that (1) the Notch mRNA is a potential target of PTB, (2) PTB and Notch functions in the dorso-lateral regions of the Drosophila embryo are linked to actin regulation, whereas their functions in the ventral region are not, and (3) the actin-related Notch activity in the dorso-lateral regions might require a Notch activity at or near the cell surface that is different from the nuclear Notch activity involved in cell fate specification in the ventral region. These data raise the possibility that the Drosophila embryo is divided into zones of different PTB and Notch activities based on whether or not they are linked to actin regulation. They also provide clues to the almost forgotten role of Notch in cell adhesion and reveal a role for the Notch pathway in cell fusions.

    Sample size calculation in multi-centre clinical trials

    Abstract Background Multi-centre randomized controlled clinical trials play an important role in modern evidence-based medicine. The advantages of collecting data from more than one site are numerous, including accelerated recruitment and increased generalisability of results. Mixed models can be applied to account for potential clustering in the data, in particular when many small centres contribute patients to the study. Previously proposed methods for sample size calculation in mixed models considered only balanced treatment allocations, which is an unlikely outcome in practice if block randomisation with reasonable choices of block length is used. Methods We propose a sample size determination procedure for multi-centre trials comparing two treatment groups for a continuous outcome, modelling centre differences using random effects and allowing for arbitrary sample sizes. It is assumed that block randomisation with fixed block length is used at each study site for subject allocation. Simulations are used to assess operating characteristics, such as power, of the sample size approach. The proposed method is illustrated by an example in disease management systems. Results A sample size formula as well as a lower and upper boundary for the required overall sample size are given. We demonstrate the superiority of the new sample size formula over the conventional approach of ignoring the multi-centre structure and show the influence of parameters such as block length or centre heterogeneity. The application of the procedure to the example data shows that large blocks require larger sample sizes if centre heterogeneity is present. Conclusion Unbalanced treatment allocation can result in substantial power loss when centre heterogeneity is present but not considered at the planning stage. When only a few patients per centre will be recruited, one has to weigh the risk of imbalance between treatment groups due to large blocks against the risk of unblinding due to small blocks. The proposed approach should be considered when planning multi-centre trials.
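    For context, the "conventional approach of ignoring the multi-centre structure" that the abstract uses as its baseline is the standard two-sample formula, n per group = 2(z_{1-α/2} + z_{1-β})² σ²/δ². A minimal sketch (the abstract does not give its own formula, so this illustrates only the conventional comparator; alpha, power, and effect-size values are illustrative):

```python
# Conventional per-group sample size for comparing two means, ignoring
# centre effects: n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2.
# The mixed-model procedure in the paper would inflate this when centre
# heterogeneity is present; that adjustment is not reproduced here.
import math
from statistics import NormalDist

def n_per_group(delta: float, sigma: float,
                alpha: float = 0.05, power: float = 0.8) -> int:
    """Per-group n for a two-sided two-sample z-approximation."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    n = 2 * (sigma / delta) ** 2 * (z_a + z_b) ** 2
    return math.ceil(n)

# Standardised effect size of 0.5 at 80% power, two-sided alpha = 0.05:
print(n_per_group(delta=0.5, sigma=1.0))  # 63 per group
```

    The paper's point is precisely that this figure is an underestimate once random centre effects and block-randomisation imbalance are taken into account.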

    Detailed analysis of the baseline dose levels and localized radiation spikes in the arc sections of the Large Hadron Collider during Run 2

    The Large Hadron Collider (LHC) has eight insertion regions (IRs) which house the large experiments or accelerator equipment. These IRs are interconnected with the arc sections, which consist of a periodic magnet structure. During the operation of the LHC, small amounts of the beam particles are lost, creating prompt radiation fields in the accelerator tunnels and the adjacent caverns. One of the main loss mechanisms in the LHC arc sections is the interaction of the beam particles with the residual gas molecules. The analysis of the dose levels based on the beam loss measurement data shows that the majority of the measurements have similar levels, which allows baseline values to be defined for each arc section. The baseline levels decreased during the years 2015 and 2016 and stabilised in 2017 and 2018 at annual dose levels below 50 mGy, which can be correlated with the residual gas densities in the LHC arcs. In some locations of the arcs, radiation spikes exceed the baseline by more than two orders of magnitude. In addition to the analysis of these dose levels, a novel approach to identifying local dose maxima and the main mechanisms driving these radiation spikes is presented.
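    The baseline-versus-spike idea described above can be sketched in a few lines: take a robust baseline over the monitors of an arc section and flag any monitor exceeding it by more than two orders of magnitude. Using the median as the baseline estimator is an assumption of this sketch (the paper's exact definition may differ), and the dose values are illustrative, not LHC data.

```python
# Sketch: define the arc baseline as the median annual dose over the
# monitors of a section, then flag "spikes" that exceed the baseline by
# more than two orders of magnitude (factor 100), as described above.
from statistics import median

def find_spikes(doses_mgy, factor=100.0):
    """Return (index, dose) pairs exceeding factor * median baseline."""
    base = median(doses_mgy)
    return [(i, d) for i, d in enumerate(doses_mgy) if d > factor * base]

# Illustrative annual doses in mGy, one value per monitor position:
annual_doses = [42.0, 38.0, 45.0, 6100.0, 40.0, 39.0]
print(find_spikes(annual_doses))  # [(3, 6100.0)]
```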

    Run 2 prompt dose distribution and evolution at the Large Hadron Collider and implications for future accelerator operation

    During the operation of the Large Hadron Collider (LHC), small fractions of the beam particles are lost, creating prompt radiation fields in the accelerator tunnels. Exposed electronics and accelerator components show lifetime degradation and stochastic Single Event Effects (SEEs), which can lead to faults and downtime of the LHC. Close to the experiments, the radiation levels scale well with the integrated luminosity, since luminosity debris is the major contributor to the radiation fields in this area of the LHC. In the collimation regions it was expected that the radiation fields scale with the integrated beam intensities, since the beams are continuously cleaned of particles which exceed the accelerator's acceptance. The analysis of radiation data shows that the dose measurements in the collimation regions normalised with the integrated beam intensities for 2016 and 2017 are comparable. Contrary to expectations, the intensity-normalised radiation datasets of 2018 in these regions differ significantly from the previous years. Especially in the betatron collimation region, the radiation levels are up to a factor of 3 higher. The radiation levels in the collimation regions correlate with the levelling of beta-star and the crossing angle in the high-luminosity experiments ATLAS and CMS. These increased normalised doses have direct implications for the expected dose levels during future LHC operation, including the High-Luminosity LHC (HL-LHC) upgrade.
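    The normalisation being compared across run years is simply dose divided by the year's integrated beam intensity. A minimal sketch of that comparison, with wholly illustrative numbers chosen only to reproduce the reported "up to a factor of 3" effect (none of these values are measured LHC data):

```python
# Sketch: compare intensity-normalised doses across run years, as done
# for the collimation regions above. All numeric values are illustrative.

def normalised_dose(dose_gy: float, integrated_intensity: float) -> float:
    """Dose per unit of integrated beam intensity (Gy per particle)."""
    return dose_gy / integrated_intensity

doses = {"2016": 2.0, "2017": 2.4, "2018": 7.5}                # Gy, illustrative
intensities = {"2016": 1.0e22, "2017": 1.2e22, "2018": 1.25e22}  # particles

norm = {year: normalised_dose(doses[year], intensities[year]) for year in doses}
print(norm["2018"] / norm["2016"])  # 3.0: 2018 is a factor of 3 higher
```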

    Electrocardiogram as a predictor of survival without appropriate shocks in primary prophylactic ICD patients: A retrospective multi-center study

    BACKGROUND: An abnormal 12-lead electrocardiogram (ECG) can predict cardiovascular events, including sudden cardiac death. We tested the hypothesis that the ECG provides useful information for guiding implantable cardioverter-defibrillator (ICD) therapy in individuals with impaired left ventricular ejection fraction (LVEF). METHODS: Retrospective data on primary prevention ICD implantations from 14 European centers were gathered. The registry included 5111 subjects, of whom 1687 patients had an interpretable pre-implantation ECG available (80.0% male, 63.3 ± 11.4 years). The primary outcome was survival without appropriate ICD shocks or heart transplantation. A low-risk ECG was defined as a combination of ECG variables that were associated with the primary outcome. RESULTS: A total of 1224 (72.6%) patients survived the follow-up (2.9 ± 1.7 years) without an ICD shock, 224 (13.3%) received an appropriate shock, and 260 (15.4%) died. The low-risk ECG criteria, defined as QRS duration <120 ms, QTc interval <450 ms for men and <470 ms for women, and sinus rhythm, were met by 515 patients (30.5%). Multivariable Cox regression showed that the hazard ratio (HR) for death, heart transplantation, or appropriate shock was reduced by 42.5% in the low-risk group (HR 0.575; 95% CI 0.45-0.74; p < 0.001) compared to the high-risk group. The HR for the first appropriate shock was 42.1% lower (HR 0.58; 95% CI 0.41-0.82; p = 0.002) and the HR for death was 48.0% lower (HR 0.52; 95% CI 0.386-0.72; p < 0.001) in the low-risk group. CONCLUSION: Sinus rhythm, QRS <120 ms, and a normal QTc in the standard 12-lead ECG provide information about survival without appropriate ICD shocks and might improve patient selection for primary prevention ICD therapy.
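    The low-risk definition above is a conjunction of three criteria and translates directly into a small rule. The function and field names below are illustrative, not from the study database; only the thresholds come from the abstract.

```python
# Sketch of the low-risk ECG rule from the abstract: sinus rhythm, QRS
# duration <120 ms, and QTc <450 ms for men or <470 ms for women.

def is_low_risk_ecg(sinus_rhythm: bool, qrs_ms: float,
                    qtc_ms: float, male: bool) -> bool:
    """True only if all three low-risk criteria are met."""
    qtc_limit_ms = 450.0 if male else 470.0
    return sinus_rhythm and qrs_ms < 120.0 and qtc_ms < qtc_limit_ms

print(is_low_risk_ecg(True, 100, 440, male=True))   # True
print(is_low_risk_ecg(True, 100, 460, male=True))   # False (QTc over 450)
print(is_low_risk_ecg(True, 100, 460, male=False))  # True (female limit 470)
```

    Note the sex-specific QTc limit: the same 460 ms reading is high-risk for a man but low-risk for a woman under these criteria.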