Impact of a Novel Adaptive Optimization Algorithm on 30-Day Readmissions: Evidence From the Adaptive CRT Trial
Objectives: This study investigated the impact of the Medtronic AdaptivCRT (aCRT) algorithm (Medtronic, Mounds View, Minnesota) on 30-day readmissions after heart failure (HF) and all-cause index hospitalizations.
Background: The U.S. Hospital Readmission Reduction Program, which includes a focus on HF, reduces Medicare inpatient payments when readmissions within 30 days of discharge exceed a moving threshold based on national averages and hospital-specific risk adjustments. Internationally, readmissions within 30 days of any discharge may attract reduced or no payment. Recently, cardiac resynchronization therapy (CRT) devices equipped with the aCRT algorithm, which allows automated ambulatory device programming, were introduced. The Adaptive CRT trial demonstrated the algorithm's safety and comparable outcomes relative to a rigorous echocardiography-based optimization protocol.
Methods: We analyzed data from the Adaptive CRT trial, which randomized patients undergoing CRT defibrillation 2:1 to aCRT (n = 318) or to CRT with echocardiographic optimization (Echo; n = 160) and followed these patients for a mean of 20.2 months (range: 0.2 to 31.3 months). Logistic regression with generalized estimating equation methodology was used to compare the proportion of patients hospitalized for HF and for all causes who had a readmission within 30 days.
Results: For HF hospitalizations, the 30-day readmission rate was 19.1% (17 of 89) in the aCRT group and 35.7% (15 of 42) in the Echo group (odds ratio: 0.41; 95% confidence interval [CI]: 0.19 to 0.86; p = 0.02). For all-cause hospitalizations, the 30-day readmission rate was 14.8% (35 of 237) in the aCRT group compared with 24.8% (39 of 157) in the Echo group (odds ratio: 0.54; 95% CI: 0.31 to 0.94; p = 0.03). The risk of readmission after HF or all-cause index hospitalization with aCRT was also significantly reduced beyond 30 days.
Conclusions: Use of the aCRT algorithm is associated with a significant reduction in the probability of a 30-day readmission after both HF and all-cause hospitalizations. (Adaptive Cardiac Resynchronization Therapy Study [aCRT]; NCT00980057)
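As a rough illustration of the analysis described above, the sketch below fits a logistic regression with generalized estimating equations (GEE) to compare 30-day readmission between randomized groups. It is a minimal sketch only: the file name, the column names (patient_id, treatment, readmit_30d), and the omission of covariates are assumptions for illustration, not the trial's actual analysis code.

```python
# Minimal sketch of a GEE logistic regression for 30-day readmission.
# Assumed data layout: one row per index hospitalization, with repeat
# hospitalizations clustered within patients via patient_id.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("index_hospitalizations.csv")  # hypothetical input file

model = smf.gee(
    "readmit_30d ~ treatment",          # treatment coded 1 = aCRT, 0 = Echo
    groups="patient_id",                # accounts for repeat admissions per patient
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()

# The exponentiated treatment coefficient is the odds ratio for readmission.
odds_ratio = np.exp(result.params["treatment"])
ci_low, ci_high = np.exp(result.conf_int().loc["treatment"])
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```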
Predictors of left ventricular ejection fraction in high-risk percutaneous coronary interventions
Revascularization completeness after percutaneous coronary intervention (PCI) is associated with improved long-term outcomes. Mechanical circulatory support [intra-aortic balloon pump (IABP) or Impella] is used during high-risk PCI (HR-PCI) to enhance peri-procedural safety and achieve more complete revascularization. The relationship between revascularization completeness [post-PCI residual SYNTAX Score (rSS)] and left ventricular ejection fraction (LVEF) in HR-PCI has not been established. We investigated predictors of LVEF at 90 days post-PCI with Impella or IABP support. Individual patient data (IPD) were analyzed from PROTECT II (NCT00562016) in the base case. IPD from PROTECT II and RESTORE-EF (NCT04648306) were naïvely pooled in the sensitivity analysis. Using complete cases only, linear regression was used to explore the predictors of LVEF at 90 days post-PCI. Models were refined using stepwise selection based on the Akaike Information Criterion and included: treatment group (Impella, IABP), baseline characteristics [age, gender, race, New York Heart Association Functional Classification, LVEF, SYNTAX Score (SS)], and rSS. Impella treatment and higher baseline LVEF were significant predictors of LVEF improvement at 90 days post-PCI (p ≤ 0.05), and a lower rSS contributed to the model (p = 0.082). In the sensitivity analysis, Impella treatment, higher baseline LVEF, and lower rSS were significant predictors of LVEF improvement at 90 days (p ≤ 0.05), and SS pre-PCI contributed to the model (p = 0.070). Higher baseline LVEF, higher SS pre-PCI, lower rSS (i.e., more complete revascularization), and Impella treatment were predictors of post-PCI LVEF improvement. The findings suggest that potential mechanisms of Impella benefit include improving the extent and quality of revascularization and intraprocedural ventricular unloading.
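To make the model-selection step above concrete, here is a small sketch of backward stepwise selection by AIC around an ordinary least squares model of 90-day LVEF. Variable and file names are illustrative placeholders; this is not the trials' analysis program.

```python
# Backward stepwise selection by AIC for an OLS model of 90-day LVEF.
# Column and file names are illustrative placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def backward_aic(data, outcome, predictors):
    """Greedy backward elimination: repeatedly drop the predictor whose
    removal gives the largest AIC reduction, stopping when no drop helps."""
    current = list(predictors)

    def fit(preds):
        return smf.ols(f"{outcome} ~ " + " + ".join(preds), data=data).fit()

    best = fit(current)
    while len(current) > 1:
        trials = [(fit([x for x in current if x != p]), p) for p in current]
        candidate, dropped = min(trials, key=lambda t: t[0].aic)
        if candidate.aic >= best.aic:
            break
        best = candidate
        current = [x for x in current if x != dropped]
    return best

df = pd.read_csv("hr_pci_ipd.csv").dropna()   # complete cases only
candidates = ["treatment", "age", "gender", "race", "nyha_class",
              "lvef_baseline", "syntax_baseline", "rss_post_pci"]
final_model = backward_aic(df, "lvef_90d", candidates)
print(final_model.summary())
```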
Cost-effectiveness of an insertable cardiac monitor in a high-risk population in the US
Background
Insertable cardiac monitors (ICMs) are a clinically effective means of detecting atrial fibrillation (AF) in high-risk patients, and guiding the initiation of non-vitamin K oral anticoagulants (NOACs). Their cost-effectiveness from a US clinical payer perspective is not yet known. The objective of this study was to evaluate the cost-effectiveness of ICMs compared to standard of care (SoC) for detecting AF in patients at high risk of stroke (CHADS2 ≥ 2), in the US.
Methods
Using patient data from the REVEAL AF trial (n = 393, average CHADS2 score = 2.9), a Markov model estimated the lifetime costs and benefits of detecting AF with an ICM or with SoC (specifically, intermittent use of electrocardiograms and 24-h Holter monitors). Ischemic and hemorrhagic strokes, intra- and extra-cranial hemorrhages, and minor bleeds were modelled. Diagnostic and device costs, costs of treating stroke and bleeding events, and costs of medical therapy (specifically NOACs) were included. Costs and health outcomes, measured as quality-adjusted life years (QALYs), were discounted at 3% per annum, in line with standard practice in the US setting. One-way deterministic and probabilistic sensitivity analyses (PSA) were undertaken.
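A short sketch of the 3% per annum discounting applied to costs and QALYs in the model described above; the yearly values are illustrative only, and whether year 0 is discounted is a modelling convention that may differ from the published model.

```python
# Discount a stream of yearly values at 3% per annum (year 0 undiscounted here;
# the convention used in the published model may differ).
def discounted_total(values_per_year, rate=0.03):
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(values_per_year))

costs = [1000.0] * 5                       # illustrative: $1,000 per year for 5 years
print(round(discounted_total(costs), 2))   # 4717.1, versus 5000 undiscounted
```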
Results
Lifetime per-patient cost with SoC was $25,330. ICMs generated a total of 7.75 QALYs versus 7.59 for SoC, with 34 fewer strokes projected per 1000 patients. The model estimates a number needed to treat of 29 per stroke avoided. Cost-effectiveness was assessed against willingness-to-pay (WTP) thresholds of $50,000 and $150,000 per QALY; ICM had a 100% probability of being cost-effective at the $150,000 per QALY threshold.
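The number needed to treat quoted above can be reproduced from the projected stroke reduction, as in this short worked check (figures taken from the results above).

```python
# Worked check: 34 fewer strokes per 1,000 patients implies an NNT of about 29.
strokes_avoided_per_1000 = 34
absolute_risk_reduction = strokes_avoided_per_1000 / 1000   # 0.034
nnt = 1 / absolute_risk_reduction
print(round(nnt))   # 29 patients monitored per stroke avoided
```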
Conclusions
The use of ICMs to identify AF in a high-risk population is likely to be cost-effective in the US healthcare setting.
Economic value of insertable cardiac monitors in unexplained syncope in the United States
Introduction: Early use of an insertable cardiac monitor (ICM) is recommended for patients with unexplained syncope following initial clinical workup, due to its superior ability to establish symptom-rhythm correlation compared with conventional testing (CONV). However, ICMs incur higher upfront costs, and the impact of additional diagnoses and resulting treatment on downstream costs and outcomes is unclear. We aimed to evaluate the cost-effectiveness of ICM compared with CONV for the diagnosis of arrhythmia in patients with unexplained syncope, from a US payer perspective.
Methods: A Markov model was developed to estimate lifetime costs and benefits of arrhythmia diagnosis with ICM versus CONV, considering all related diagnostic and arrhythmia-related treatment costs and consequences. Cohort characteristics and costs were informed by original claims database analyses. Risks of mortality, syncopal recurrence, injury due to syncope, and quality-of-life consequences of syncopal events were identified from the literature.
Results: ICM was less costly and more effective than CONV. Most of the observed US$4,532 cost savings were attributed to reduced downstream diagnostic testing. For every 1000 patients, ICM was projected to yield an additional 253 arrhythmia diagnoses and lead to treatment in an additional 168 patients. The ICM strategy resulted in overall improved outcomes (0.30 quality-adjusted life years gained), due to a reduction in syncope recurrence and injury resulting from arrhythmia treatment. The results were robust to changes in the base-case parameters but sensitive to the model time horizon, the underlying probability of syncope recurrence, and the prevalence of arrhythmias.
Conclusions: Our model projected that early ICM use for the diagnosis of unexplained syncope reduced long-term costs and led to an improvement in overall clinical outcomes by shortening the time to arrhythmia treatment. The cost of ICM was outweighed by savings arising from fewer downstream diagnostic episodes, and the increased cost of treatment was counterbalanced by fewer syncope-related event costs.
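The head-to-head result above is a case of dominance: ICM is both less costly and more effective, so no ICER is quoted. The snippet below summarizes that decision rule using the per-patient figures reported above.

```python
# Summarize an incremental comparison; per-patient values from the results above.
delta_cost = -4532.0   # USD: negative means ICM saves money versus CONV
delta_qaly = 0.30      # QALYs gained with ICM versus CONV

if delta_cost <= 0 and delta_qaly > 0:
    print("ICM dominates CONV: cost-saving and more effective")
else:
    print(f"ICER = {delta_cost / delta_qaly:,.0f} USD per QALY")
```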
Financial impact of adopting implantable loop recorder diagnostic for unexplained syncope compared with conventional diagnostic pathway in Portugal
Background: To estimate the short- and long-term financial impact of early referral for implantable loop recorder (ILR) diagnostics versus the conventional diagnostic pathway (CDP) in the management of unexplained syncope (US) in the Portuguese National Health Service (PNHS).
Methods: A Markov model was developed to estimate the expected number of hospital admissions due to US and the respective financial impact in patients implanted with an ILR versus CDP. The average cost of a syncope episode admission was estimated based on Portuguese cost data and landmark papers. The financial impact of ILR adoption was estimated for a total of 197 patients with US, based on the number of syncope admissions per year in the PNHS. Sensitivity analysis was performed to account for uncertainty in the input parameters (hazard ratio of death; number of syncope events per year; probabilities and unit costs of each diagnostic test; probability of trauma; and yield of diagnosis) over three-year and lifetime horizons.
Results: The average cost of a syncope event was estimated to be between 1,760€ and 2,800€. Over a lifetime horizon, the total discounted costs of hospital admissions and syncope diagnosis for the entire cohort were 23% lower amongst patients in the ILR group compared with the CDP group (1,204,621€ for ILR versus 1,571,332€ for CDP).
Conclusion: The utilization of ILR leads to an earlier diagnosis and a lower number of syncope hospital admissions and investigations, thus allowing significant cost offsets in the Portuguese setting. The result is robust to changes in the input parameter values, and cost savings become more pronounced over time.
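The 23% figure in the results above can be checked directly from the reported lifetime cohort costs, as in this short worked example.

```python
# Worked check of the lifetime cohort costs reported above (197 patients).
cost_ilr = 1_204_621   # EUR, ILR group
cost_cdp = 1_571_332   # EUR, CDP group

savings = cost_cdp - cost_ilr
print(f"Savings: {savings:,} EUR ({savings / cost_cdp:.0%} lower with ILR)")
# -> Savings: 366,711 EUR (23% lower with ILR)
```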
Economic implications of adding a novel algorithm to optimize cardiac resynchronization therapy: rationale and design of economic analysis for the AdaptResponse trial
Aims: Although cardiac resynchronization therapy (CRT) has proven beneficial in several randomized trials, a subset of patients have limited clinical improvement. The AdaptivCRT algorithm provides automated selection between synchronized left ventricular or biventricular pacing with optimization of atrioventricular delays. The rationale and design of the economic analysis of the AdaptResponse clinical trial are described.
Rationale: The costs associated with heart failure (HF) hospitalization are substantial and are compounded by a high rate of readmission. HF hospitalization payments vary by payer and are approximately $12,235 for US private insurance. When examining the breakdown of HF-related costs, approximately 55% of hospitalization costs are directly attributable to length of stay. Notably, the mean cost of an HF-related hospitalization for a CRT patient is currently estimated at $10,679.
Methods: The economic analysis of the AdaptResponse trial has two main objectives. The hospital provider objective seeks to test the hypothesis that AdaptivCRT reduces the incidence of all-cause readmissions within 30 days of an index heart failure admission. A negative binomial regression model will be used to estimate and compare the number of readmissions after an index HF hospitalization. The payer economic objective will assess the cost-effectiveness of CRT devices with the AdaptivCRT algorithm relative to traditional CRT programming. This analysis will be conducted from a U.S. payer perspective. A decision analytic model comprising a 6-month decision tree and a Markov model for long-term extrapolation will be used to evaluate lifetime costs and benefits.
Conclusion: AdaptivCRT may offer improvements over traditional device programming in patient outcomes. How the data from AdaptResponse will be used to determine whether these clinical benefits translate into substantial economic gains is described herein.
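For the provider objective described above, a negative binomial regression of readmission counts could be set up as in the sketch below. The data file and column names (treatment, n_readmit_30d) are assumptions for illustration; the trial's statistical analysis plan defines the actual covariates and model specification.

```python
# Sketch: negative binomial regression for the count of all-cause readmissions
# within 30 days of an index HF hospitalization. Column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("adaptresponse_index_admissions.csv")

# treatment coded 1 = CRT with AdaptivCRT, 0 = traditional CRT programming
model = smf.negativebinomial("n_readmit_30d ~ treatment", data=df)
result = model.fit()
print(result.summary())
# exp(coefficient on treatment) estimates the readmission rate ratio.
```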
Cost-effectiveness of left ventricular assist devices as destination therapy in the United Kingdom
AIMS: Continuous-flow left ventricular assist devices (LVADs) used as destination therapy (DT) are a treatment recommended by the National Institute for Health and Care Excellence (NICE) in England for end-stage heart failure patients ineligible for cardiac transplantation. Although DT is frequently used as an LVAD indication across other major European countries and the United States, with consistent improvements in quality of life and longevity, National Health Service (NHS) England does not currently fund DT, mainly owing to concerns over cost-effectiveness. On the basis of the recently published ENDURANCE Supplemental Trial of DT patients, we assessed for the first time the cost-effectiveness of DT LVADs compared with medical management (MM) in NHS England.
METHODS AND RESULTS: We developed a Markov multiple-state economic model using NHS cost data. LVAD survival and adverse event rates were derived from the ENDURANCE Supplemental Trial. MM survival was based on Seattle Heart Failure Model estimates, in the absence of contemporary clinical trials for this population. Incremental cost-effectiveness ratios (ICERs) were calculated over a lifetime horizon. A discount rate of 3.5% per year was applied to costs and benefits. The deterministic ICER was £46,207 per quality-adjusted life year (QALY). Costs and utilities were £204,022 and 3.27 QALYs for the LVAD arm versus £77,790 and 0.54 QALYs for the MM arm. Sensitivity analyses confirmed the robustness of the primary analysis.
CONCLUSIONS: The implantation of the HeartWare™ HVAD™ System as DT in patients ineligible for cardiac transplantation is a cost-effective therapy in the NHS England healthcare system under the end-of-life willingness-to-pay threshold of £50,000/QALY that applies to VAD patients.
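The deterministic ICER above can be approximately reproduced from the rounded per-arm costs and QALYs, as in this worked check; the small difference from £46,207 reflects rounding in the reported inputs.

```python
# Worked check of the ICER from the rounded per-patient figures above.
cost_lvad, qaly_lvad = 204_022.0, 3.27
cost_mm,   qaly_mm   = 77_790.0, 0.54

icer = (cost_lvad - cost_mm) / (qaly_lvad - qaly_mm)
print(f"ICER = £{icer:,.0f} per QALY")   # £46,239 from rounded inputs; reported £46,207
```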