
    Incorporating ex-vivo lung perfusion into the UK adult lung transplant service: an economic evaluation and decision analytic model

    Background: An estimated 20–30% of end-stage lung disease patients awaiting lung transplant die whilst on the waiting list due to a shortage of suitable donor lungs. Ex-Vivo Lung Perfusion (EVLP) is a technique that reconditions donor lungs initially not deemed usable in order to make them suitable for transplantation, thereby increasing the donor pool. In this study, an economic evaluation was conducted as part of DEVELOP-UK, a multi-centre study assessing the clinical and cost-effectiveness of the EVLP technique in the United Kingdom. Methods: We estimated the cost-effectiveness of a UK adult lung transplant service combining both standard and EVLP transplants compared to a service including only standard lung transplants. A Markov model was developed, populated with a combination of DEVELOP-UK, published and routine clinical data, and extrapolated to a lifetime horizon. Probabilistic sensitivity and scenario analyses were used to explore uncertainty in the final outcomes. Results: Base-case model results estimated life years gained of 0.040, quality-adjusted life-years (QALYs) gained of 0.045 and an incremental cost per QALY of £90,000 for EVLP. Scenario analyses suggest that an improved rate of converting unusable donor lungs using EVLP, similar post-transplant resource use for standard and EVLP transplants, and increased waiting-list costs would each reduce the incremental cost-effectiveness ratio (ICER) to approximately £30,000 or below. Conclusion: DEVELOP-UK base-case results suggest that incorporating EVLP into the UK adult lung transplant service is more effective, increasing the number of donor lungs available for transplant, but would not currently be considered cost-effective in the UK at the present NICE threshold. However, results were sensitive to changes in some model parameters, and several plausible scenario analyses indicate that a service incorporating EVLP would be considered cost-effective. Trial registration: ISRCTN registry number: ISRCTN44922411. Date of registration: 06/02/2012. Retrospectively registered.
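    The decision rule behind the cost-effectiveness conclusion reduces to simple arithmetic: the incremental cost-effectiveness ratio (ICER) is the extra cost divided by the extra QALYs gained, compared against a willingness-to-pay threshold. A minimal sketch using the base-case figures above (the ~£4,050 incremental cost is back-calculated here from the reported ICER and QALY gain, not reported directly in the abstract):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    by the new service relative to the comparator."""
    return delta_cost / delta_qaly

def cost_effective(ratio, threshold=30_000):
    """Decision rule against a willingness-to-pay threshold; the NICE
    range commonly cited is roughly 20,000-30,000 GBP per QALY."""
    return ratio <= threshold

# Base case: 0.045 QALYs gained at 90,000 GBP/QALY implies
# roughly 4,050 GBP of incremental cost per patient.
print(icer(4_050, 0.045))      # 90000.0
print(cost_effective(90_000))  # False
```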

    A fast and comprehensive microdisc laser model applied to all-optical wavelength conversion

    Abstract: Microdisc lasers (MDLs) are an attractive option for on-chip laser sources, wavelength converters and even all-optical memory. We have developed a comprehensive model for wavelength conversion in MDLs, which is compared with measurements.

    Impact of CD4 and CD8 dynamics and viral rebounds on loss of virological control in HIV controllers.

    HIV controllers (HICs) spontaneously maintain HIV viral replication at low levels without antiretroviral therapy (ART); a small number of them eventually lose this ability to control HIV viremia. The objective was to identify factors associated with loss of virological control. HICs were identified in COHERE on the basis of ≥5 consecutive viral loads (VL) ≤500 copies/mL over ≥1 year whilst ART-naive, with the last VL ≤500 copies/mL measured ≥5 years after HIV diagnosis. Loss of virological control was defined as 2 consecutive VL >2000 copies/mL. Duration of HIV control was described using the cumulative incidence method, considering loss of virological control, ART initiation and death during virological control as competing outcomes. Factors associated with loss of virological control were identified using Cox models. CD4 and CD8 dynamics were described using mixed-effect linear models. We identified 1067 HICs; 86 lost virological control, 293 initiated ART, and 13 died during virological control. Six years after confirmation of HIC status, the probabilities of losing virological control, initiating ART and dying were 13%, 37%, and 2%, respectively. A lower current CD4/CD8 ratio and a history of transient viral rebounds were associated with an increased risk of losing virological control. CD4 declined and CD8 increased before loss of virological control, and before viral rebounds. Expansion of CD8 and decline of CD4 during HIV control may result from repeated low-level viremia. Our findings suggest that, in addition to superinfection, other mechanisms, such as low-grade viral replication, can lead to loss of virological control in HICs.
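    The competing-risks description of HIV-control duration (loss of control vs. ART initiation vs. death) is typically estimated with a cumulative incidence function such as the Aalen-Johansen estimator. The abstract does not name the exact estimator, so the following is an illustrative sketch on toy data, not the study's code:

```python
def cumulative_incidence(data, cause, horizon):
    """Aalen-Johansen cumulative incidence of one event type in the
    presence of competing events. `data` holds (time, event) pairs,
    where event 0 = censored and 1, 2, ... are competing event types
    (e.g. loss of control, ART initiation, death)."""
    event_times = sorted({t for t, e in data if e != 0 and t <= horizon})
    surv = 1.0  # probability of being free of any event just before t
    cif = 0.0
    for t in event_times:
        n_risk = sum(1 for ti, _ in data if ti >= t)            # still at risk
        d_cause = sum(1 for ti, e in data if ti == t and e == cause)
        d_any = sum(1 for ti, e in data if ti == t and e != 0)  # any event
        cif += surv * d_cause / n_risk
        surv *= 1 - d_any / n_risk
    return cif

# Toy follow-up data: two losses of control (1), one ART start (2),
# one censored observation (0).
data = [(1, 1), (2, 2), (3, 0), (4, 1)]
print(cumulative_incidence(data, cause=1, horizon=5))  # 0.75
```

Unlike 1 − Kaplan-Meier, this estimator does not treat competing events as censoring, so the cause-specific incidences sum to at most the overall event probability.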

    Does rapid HIV disease progression prior to combination antiretroviral therapy hinder optimal CD4 + T-cell recovery once HIV-1 suppression is achieved?

    Objective: This article compares trends in CD4+ T-cell recovery and the proportions achieving optimal restoration (≥500 cells/µl) after viral suppression following combination antiretroviral therapy (cART) initiation between rapid and nonrapid progressors. Methods: We included HIV-1 seroconverters achieving viral suppression within 6 months of cART. Rapid progressors were individuals experiencing at least one CD4+ count less than 200 cells/µl within 12 months of seroconversion, before cART. We used piecewise linear mixed models for CD4+ T-cell trends and logistic regression for optimal restoration. Results: Of 4024 individuals, 294 (7.3%) were classified as rapid progressors. At the same CD4+ T-cell count at cART start (baseline), rapid progressors experienced faster CD4+ T-cell increases than nonrapid progressors in the first month [difference (95% confidence interval) in mean increase/month (square root scale): 1.82 (1.61; 2.04)], which reversed to slightly slower increases in months 1–18 [-0.05 (-0.06; -0.03)] and no significant differences in months 18–60 [-0.003 (-0.01; 0.01)]. The percentage achieving optimal restoration was significantly lower for rapid progressors than nonrapid progressors at months 12 (29.2 vs. 62.5%) and 36 (47.1 vs. 72.4%) but not at month 60 (70.4 vs. 71.8%). These differences disappeared after adjusting for baseline CD4+ T-cell count: odds ratio (95% confidence interval) 0.86 (0.61; 1.20), 0.90 (0.38; 2.17) and 1.56 (0.55; 4.46) at months 12, 36 and 60, respectively. Conclusion: Among people on suppressive antiretroviral therapy, rapid progressors experience faster initial increases of CD4+ T-cell counts than nonrapid progressors, but are less likely to achieve optimal restoration during the first 36 months after cART, mainly because of lower CD4+ T-cell counts at cART initiation.

    Comparative effectiveness and safety of non-vitamin K antagonists for atrial fibrillation in clinical practice: GLORIA-AF Registry

    Background and purpose: Prospectively collected data comparing the safety and effectiveness of individual non-vitamin K antagonist oral anticoagulants (NOACs) are lacking. Our objective was to directly compare the effectiveness and safety of NOACs in patients with newly diagnosed atrial fibrillation (AF). Methods: In GLORIA-AF, a large, prospective, global registry program, consecutive patients with newly diagnosed AF were followed for 3 years. The comparative analyses for (1) dabigatran vs rivaroxaban or apixaban and (2) rivaroxaban vs apixaban were performed on propensity score (PS)-matched patient sets. Proportional hazards regression was used to estimate hazard ratios (HRs) for outcomes of interest. Results: The GLORIA-AF Phase III registry enrolled 21,300 patients between January 2014 and December 2016. Of these, 3839 were prescribed dabigatran, 4015 rivaroxaban and 4505 apixaban, with median ages of 71.0, 71.0, and 73.0 years, respectively. In the PS-matched set, the adjusted HRs and 95% confidence intervals (CIs) for dabigatran vs rivaroxaban were, for stroke: 1.27 (0.79–2.03), major bleeding 0.59 (0.40–0.88), myocardial infarction 0.68 (0.40–1.16), and all-cause death 0.86 (0.67–1.10). For the comparison of dabigatran vs apixaban, in the PS-matched set, the adjusted HRs were, for stroke 1.16 (0.76–1.78), myocardial infarction 0.84 (0.48–1.46), major bleeding 0.98 (0.63–1.52) and all-cause death 1.01 (0.79–1.29). For the comparison of rivaroxaban vs apixaban, in the PS-matched set, the adjusted HRs were, for stroke 0.78 (0.52–1.19), myocardial infarction 0.96 (0.63–1.45), major bleeding 1.54 (1.14–2.08), and all-cause death 0.97 (0.80–1.19). Conclusions: Patients treated with dabigatran had a 41% lower risk of major bleeding compared with rivaroxaban, but similar risks of stroke, MI, and death. Relative to apixaban, patients treated with dabigatran had similar risks of stroke, major bleeding, MI, and death.
Rivaroxaban relative to apixaban had increased risk for major bleeding, but similar risks for stroke, MI, and death. Registration: URL: https://www.clinicaltrials.gov. Unique identifiers: NCT01468701, NCT01671007. Date of registration: September 2013
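    The propensity-score-matched comparisons above rest on pairing each patient on one drug with a similar patient on the comparator. GLORIA-AF's actual matching procedure is not described in this abstract; a minimal 1:1 greedy nearest-neighbour sketch on precomputed propensity scores, with a caliper to reject poor matches, looks like:

```python
def greedy_match(treated, control, caliper=0.05):
    """1:1 greedy nearest-neighbour matching on propensity scores.
    `treated` and `control` map patient id -> propensity score (e.g.
    the fitted probability of receiving the treatment from a logistic
    model of treatment assignment on baseline covariates)."""
    available = dict(control)
    pairs = []
    # Process treated patients from highest score down, a common heuristic.
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:  # reject distant matches
            pairs.append((t_id, c_id))
            del available[c_id]                     # match without replacement
    return pairs

treated = {'t1': 0.30, 't2': 0.70}
control = {'c1': 0.72, 'c2': 0.29, 'c3': 0.50}
print(greedy_match(treated, control))  # [('t2', 'c1'), ('t1', 'c2')]
```

Hazard ratios are then estimated on the matched pairs; greedy matching is order-dependent, which is why published analyses often prefer optimal or calipered matching with sensitivity checks.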

    Development and internal validation of a clinical prediction model for serious complications after emergency laparotomy

    Purpose: Emergency laparotomy (EL) is a common operation with a high risk of postoperative complications, requiring accurate risk stratification to manage vulnerable patients optimally. We developed and internally validated a predictive model of serious complications after EL. Methods: Data for eleven carefully selected candidate predictors of 30-day postoperative complications (Clavien-Dindo grade ≥ 3) were extracted from the HELAS cohort of EL patients in 11 centres in Greece and Cyprus. Logistic regression with the Least Absolute Shrinkage and Selection Operator (LASSO) was applied for model development. Discrimination and calibration measures were estimated, and clinical utility was explored with decision curve analysis (DCA). Reproducibility and heterogeneity were examined with bootstrap-based internal validation and internal–external cross-validation. The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) model was applied to the same cohort to establish a benchmark for the new model. Results: From data on 633 eligible patients (175 complication events), the SErious complications After Laparotomy (SEAL) model was developed with 6 predictors (preoperative albumin, blood urea nitrogen, American Society of Anesthesiologists score, sepsis or septic shock, dependent functional status, and ascites). SEAL had good discriminative ability (optimism-corrected c-statistic: 0.80, 95% confidence interval [CI] 0.79–0.81), calibration (optimism-corrected calibration slope: 1.01, 95% CI 0.99–1.03) and overall fit (scaled Brier score: 25.1%, 95% CI 24.1–26.1%). SEAL compared favourably with ACS-NSQIP in all metrics, including DCA across multiple risk thresholds. Conclusion: SEAL is a simple and promising model for individualized risk predictions of serious complications after EL. Future external validations should appraise SEAL's transportability across diverse settings.
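    The performance metrics reported for SEAL are easy to compute from predicted risks and observed outcomes: the c-statistic is the probability that a patient who had a complication received a higher predicted risk than one who did not, and the (unscaled) Brier score is the mean squared error of the predictions. A sketch on toy predictions, not SEAL's data:

```python
def c_statistic(pred, outcome):
    """Concordance (c-statistic / AUC): fraction of event/non-event
    pairs in which the event patient has the higher predicted risk;
    ties count as half."""
    pos = [p for p, y in zip(pred, outcome) if y == 1]
    neg = [p for p, y in zip(pred, outcome) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def brier(pred, outcome):
    """Unscaled Brier score: mean squared error between predicted
    risks and 0/1 outcomes (lower is better)."""
    return sum((p - y) ** 2 for p, y in zip(pred, outcome)) / len(pred)

pred, outcome = [0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0]
print(c_statistic(pred, outcome))  # 0.75
print(brier(pred, outcome))        # 0.295
```

The *scaled* Brier score quoted in the abstract additionally normalises against a null model that predicts the overall event rate for everyone.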

    Anticoagulant selection in relation to the SAMe-TT2R2 score in patients with atrial fibrillation: the GLORIA-AF registry

    Aim: The SAMe-TT2R2 score helps identify patients with atrial fibrillation (AF) likely to have poor anticoagulation control during anticoagulation with vitamin K antagonists (VKAs); those with scores >2 might be better managed with a non-vitamin K antagonist oral anticoagulant (NOAC). We hypothesized that in clinical practice, VKAs may be prescribed less frequently to patients with AF and SAMe-TT2R2 scores >2 than to patients with lower scores. Methods and results: We analyzed the Phase III dataset of the Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation (GLORIA-AF), a large, prospective, global registry of patients with newly diagnosed AF and ≥1 stroke risk factor. We compared baseline clinical characteristics and antithrombotic prescriptions to determine the probability of VKA prescription among anticoagulated patients with baseline SAMe-TT2R2 scores >2 and ≤2. Among 17,465 anticoagulated patients with AF, 4,828 (27.6%) were prescribed a VKA and 12,637 (72.4%) an NOAC; 11,884 (68.0%) patients had SAMe-TT2R2 scores 0-2 and 5,581 (32.0%) had scores >2. The proportion of patients prescribed a VKA was 28.0% among patients with SAMe-TT2R2 scores >2 and 27.5% in those with scores ≤2. Conclusions: The lack of a clear association between the SAMe-TT2R2 score and anticoagulant selection may be attributed to the relative efficacy and safety profiles of NOACs and VKAs, as well as to the absence of trial evidence that a SAMe-TT2R2-guided strategy for selecting the type of anticoagulation in patients with non-valvular AF has an impact on clinical outcomes of efficacy and safety. The latter hypothesis is currently being tested in a randomized controlled trial. Clinical trial registration: URL: https://www.clinicaltrials.gov. Unique identifiers: NCT01937377, NCT01468701, and NCT01671007.
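    The SAMe-TT2R2 score itself is a simple additive score. The weights below follow the commonly published version (1 point each for female sex, age < 60 years, medical history of two or more listed comorbidities, and treatment with interacting drugs; 2 points each for tobacco use within 2 years and non-Caucasian race) and should be checked against the original publication before clinical use:

```python
def same_tt2r2(female, age_lt_60, medical_history,
               interacting_drugs, tobacco_use, non_caucasian):
    """SAMe-TT2R2 score (commonly published weights; verify against
    the original definition). Scores > 2 flag patients likely to
    achieve poor anticoagulation control on a VKA."""
    return (int(female) + int(age_lt_60) + int(medical_history)
            + int(interacting_drugs) + 2 * int(tobacco_use)
            + 2 * int(non_caucasian))

# A current smoker with relevant medical history: 2 + 1 = 3 points (> 2).
print(same_tt2r2(False, False, True, False, True, False))  # 3
```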