
    Dutch Outcome in Implantable Cardioverter-Defibrillator Therapy: Implantable Cardioverter-Defibrillator-Related Complications in a Contemporary Primary Prevention Cohort

    Background: One third of primary prevention implantable cardioverter-defibrillator (ICD) patients receive appropriate therapy, but all remain at risk of defibrillator complications. Information on these complications in contemporary cohorts is limited. This study assessed complications and their risk factors after defibrillator implantation in a Dutch nationwide prospective registry cohort and forecast the potential reduction in complications under distinct scenarios of updated indication criteria.

    Methods and Results: Complications in a prospective multicenter registry cohort of 1442 primary prevention ICD implant patients were classified as major or minor. The potential for reducing complications was derived from a newly developed prediction model of appropriate therapy, used to identify patients with a low probability of benefiting from the ICD. During a follow-up of 2.2 years (interquartile range, 2.0-2.6 years), 228 complications occurred in 195 patients (13.6%), with 113 patients (7.8%) experiencing at least one major complication. The most common major complications were lead related (n=93) and infection (n=18). Minor complications occurred in 6.8% of patients, with lead-related (n=47) and pocket-related (n=40) complications the most prevalent. A surgical reintervention was required in 53% of complications and an additional hospitalization in 61%. Complications were strongly associated with device type. Application of stricter implant indication criteria resulted in a comparable proportional reduction of (major) complications.

    Conclusions: One in 13 patients experiences at least one major ICD-related complication, and many patients undergo a surgical reintervention. These risks are inherent to defibrillator implantation and should be discussed with the patient. Stricter implant indication criteria and careful selection of the device type implanted may have significant clinical and financial benefits.

    Five-year safety and efficacy of leadless pacemakers in a Dutch cohort

    BACKGROUND: Adequate real-world safety and efficacy of leadless pacemakers (LPs) have been demonstrated up to 3 years after implantation. Longer-term data are warranted to assess the net clinical benefit of leadless pacing.

    OBJECTIVE: The purpose of this study was to evaluate the long-term safety and efficacy of LP therapy in a real-world cohort.

    METHODS: In this retrospective cohort study, all consecutive patients with a first LP implantation from December 21, 2012, to December 13, 2016, in 6 Dutch high-volume centers were included. The primary safety endpoint was the rate of major procedure- or device-related complications (ie, requiring surgery) at 5-year follow-up. Analyses were performed with and without Nanostim battery advisory-related complications. The primary efficacy endpoint was the percentage of patients with a pacing capture threshold ≤2.0 V at implantation and without a ≥1.5-V increase at the last follow-up visit.

    RESULTS: A total of 179 patients were included (mean age 79 ± 9 years), 93 (52%) with a Nanostim and 86 (48%) with a Micra VR LP. Mean follow-up duration was 44 ± 26 months. Forty-one major complications occurred, of which 7 were not advisory related. The 5-year major complication rate was 4% excluding advisory-related complications and 27% including them. Non-advisory-related major complications occurred at a median of 10 days (range 0-88 days) postimplantation. The pacing capture threshold was low in 163 of 167 patients (98%) and stable in 157 of 160 (98%).

    CONCLUSION: The long-term major complication rate excluding advisory-related complications was low with LPs. No complications occurred after the acute phase and no infections occurred, which may be a specific benefit of LPs. Performance was adequate, with a stable pacing capture threshold.
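    The 5-year complication rates above are time-to-event figures; in cohorts with variable follow-up they are typically derived with a Kaplan-Meier-style estimator rather than a raw proportion. A minimal sketch of that idea, using hypothetical follow-up data (not study data):

```python
# Minimal Kaplan-Meier estimator for a complication-free survival curve.
# All times and event indicators below are hypothetical illustrations.

def kaplan_meier(times, events):
    """Return (time, survival) pairs; events: 1 = complication, 0 = censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk  # step down at each observed event
            curve.append((times[i], surv))
        at_risk -= 1  # censored patients leave the risk set without a step
    return curve

# Hypothetical follow-up times (months) and complication indicators
times  = [2, 14, 20, 33, 45, 60, 60, 60]
events = [1,  0,  1,  0,  1,  0,  0,  0]
curve = kaplan_meier(times, events)
print(curve)  # complication-free probability after each event time
```

    One minus the final survival value gives the cumulative complication rate at the end of follow-up; censored patients (event = 0) contribute person-time without being counted as complication-free survivors at 5 years.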

    Mapping and Surgical Ablation of Focal Epicardial Left Ventricular Tachycardia

    We describe a technical challenge in a 17-year-old patient with incessant epicardial focal ventricular arrhythmia and diminished left ventricular function. Failure of ablation at the earliest activated endocardial site during ectopy suggested an epicardial origin, which was supported by specific electrocardiographic criteria. Catheter-based epicardial ablation was not possible because the origin of the ventricular tachycardia lay adjacent to the phrenic nerve. Minimally invasive surgical multielectrode high-density epicardial mapping was performed to localize the arrhythmia focus, and epicardial surgical radiofrequency ablation terminated the ventricular ectopy. After 2 years, the patient remains free from arrhythmias.

    Dominant frequency of atrial fibrillation correlates poorly with atrial fibrillation cycle length

    Localized sites of high frequency during atrial fibrillation (AF) are used as target sites to eliminate AF, and spectral analysis is used experimentally to determine these sites. The purpose of this study was to compare dominant frequencies (DFs) with the AF cycle length (AFCL) of unipolar and bipolar recordings. Left and right atrial endocardial electrograms were recorded during AF in 40 patients with lone AF (mean age 53 ± 9.9 years), using two 20-polar catheters. Unipolar and bipolar electrograms were recorded simultaneously for 16 seconds at 2 right and 4 left atrial sites, and AFCLs and DFs were determined. QRS subtraction was performed on the unipolar signals. DFs were compared with the mean, median, and mode of the AFCLs; 4800 unipolar and 2400 bipolar electrograms were analyzed. Intraclass correlation was poor for all spectral analysis protocols; the best correlation was achieved with DFs from unipolar electrograms compared with the median AFCL (intraclass correlation coefficient, 0.67). A gradient in median AFCL of >25% was detected in 16 of 40 patients. In 13 of these 16 patients (81%), the site with the highest frequency was located in the left atrium (the posterior left atrium in 8 patients). The site with the shortest median AFCL and the highest DF corresponded in 25% of cases when unipolar and in 31% when bipolar electrograms were analyzed. DFs from unipolar and bipolar electrograms recorded during AF correlated poorly with the mean, median, and mode AFCL, and where a frequency gradient of >25% existed, the site with the highest DF corresponded to the site of the shortest median AFCL in only 25% of patients. Because spectral analysis is being used to identify ablation sites, these data may have important clinical implications.
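    The comparison above rests on two quantities computed from the same recording: the dominant frequency, taken from the largest spectral peak, and the cycle length, measured between activations. A toy sketch of the spectral side, using a pure sinusoid as a hypothetical stand-in for a 16-second electrogram (real analyses also involve filtering and, for unipolar signals, QRS subtraction as noted above):

```python
import numpy as np

# Hypothetical synthetic signal with a regular 180-ms cycle length,
# illustrating how a dominant frequency (DF) relates to AFCL.
fs = 1000                      # sampling rate, Hz
afcl_ms = 180                  # simulated cycle length, ms
t = np.arange(0, 16, 1 / fs)   # 16-s recording, as in the study
signal = np.sin(2 * np.pi * (1000 / afcl_ms) * t)  # fundamental at 1/AFCL

# DF = frequency of the largest spectral peak; 3-15 Hz is a typical AF band
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
band = (freqs >= 3) & (freqs <= 15)
df_hz = freqs[band][np.argmax(spectrum[band])]

print(df_hz, 1000 / df_hz)  # DF (Hz) and implied cycle length (ms)
```

    For this idealized periodic signal, 1000/DF recovers the cycle length almost exactly; the study's point is that real fibrillatory electrograms are far less well behaved, so the DF-to-AFCL correspondence breaks down.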

    Development and external validation of prediction models to predict implantable cardioverter-defibrillator efficacy in primary prevention of sudden cardiac death

    Aims: This study was performed to develop and externally validate prediction models for appropriate implantable cardioverter-defibrillator (ICD) shock and mortality, to identify subgroups with insufficient benefit from ICD implantation.

    Methods and results: We recruited patients with reduced left ventricular function scheduled for primary prevention ICD implantation. Bootstrapping-based Cox proportional hazards and Fine and Gray competing risk models with likely candidate predictors were developed for all-cause mortality and appropriate ICD shock, respectively. Between 2014 and 2018, we included 1441 consecutive patients in the development cohort and 1450 patients in the validation cohort. During a median follow-up of 2.4 (IQR 2.1–2.8) years, 109 (7.6%) patients received an appropriate ICD shock and 193 (13.4%) died in the development cohort. During a median follow-up of 2.7 (IQR 2.0–3.4) years, 105 (7.2%) received an appropriate ICD shock and 223 (15.4%) died in the validation cohort. Selected predictors of appropriate ICD shock were gender, NSVT, ACE/ARB use, history of atrial fibrillation, aldosterone-antagonist use, digoxin use, eGFR, (N)OAC use, and peripheral vascular disease. Selected predictors of all-cause mortality were age, diuretic use, sodium, NT-pro-BNP, and ACE/ARB use. The C-statistic was 0.61 and 0.60 at internal and external validation, respectively, for appropriate ICD shock, and 0.74 at both internal and external validation for mortality.

    Conclusion: Although this cohort study was specifically designed to develop prediction models, risk stratification remains challenging and no large group with insufficient benefit from ICD implantation was found. However, the prediction models have some clinical utility, as we present several scenarios in which ICD implantation might be postponed.
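    The C-statistics reported above measure discrimination: the probability that, of two comparable patients, the model assigns the higher risk score to the one who has the event first (0.5 is chance, 1.0 perfect). A minimal pure-Python sketch of Harrell's C for censored survival data, with hypothetical risk scores and outcomes:

```python
# Harrell's concordance statistic (C-statistic) for a survival model.
# The times, events, and risk scores below are hypothetical illustrations.

def c_statistic(times, events, risks):
    """Fraction of usable pairs where the higher-risk patient fails first."""
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is usable if patient i has an observed event before time j
            if events[i] and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1  # tied predictions count half
    return (concordant + 0.5 * ties) / usable

times  = [1, 2, 3, 4, 5]            # follow-up (years)
events = [1, 1, 0, 1, 0]            # 1 = event observed, 0 = censored
risks  = [0.9, 0.6, 0.4, 0.7, 0.2]  # model-predicted risk scores
print(c_statistic(times, events, risks))
```

    Censored patients (events = 0) only ever appear as the later member of a pair, which is why heavy censoring limits how many pairs inform the estimate.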

    Renal sympathetic denervation induces changes in heart rate variability and is associated with a lower sympathetic tone

    Background: Renal nerve stimulation (RNS) is used to localize sympathetic nerve tissue for selective renal sympathetic denervation (RDN). Examination of heart rate variability (HRV) provides a way to assess the state of the autonomic nervous system. The current study aimed to examine the acute changes in HRV caused by RNS before and after RDN.

    Methods and results: 30 patients with hypertension referred for RDN were included. RNS was performed under general anesthesia before and after RDN. Heart rate (HR) and blood pressure (BP) were continuously monitored, and HRV characteristics were assessed 1 minute before and after RNS and RDN. RNS before RDN elicited a maximum increase in systolic BP of 45 (± 22) mmHg, which was attenuated to 13 (± 12) mmHg after RDN.
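    HRV characteristics such as those assessed above are commonly summarized with time-domain indices computed from consecutive RR intervals; the abstract does not specify which measures were used, so the two shown here (SDNN and RMSSD) and the RR series are illustrative assumptions:

```python
import math

# Time-domain HRV indices from RR intervals (ms): SDNN and RMSSD.
# The RR series below is hypothetical; it only illustrates the computation.

def sdnn(rr):
    """Sample standard deviation of RR intervals (overall variability)."""
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """Root mean square of successive differences (beat-to-beat, vagally mediated)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 830, 805, 798, 820, 810]  # hypothetical RR intervals, ms
print(sdnn(rr), rmssd(rr))
```

    A drop in such indices after sympathetic activation, or a blunted response after denervation, is the kind of change the study looks for; a lower sympathetic tone after RDN would show up as altered HRV around stimulation.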

    Renal Nerve Stimulation-Induced Blood Pressure Changes Predict Ambulatory Blood Pressure Response After Renal Denervation.

    Blood pressure (BP) response to renal denervation (RDN) is highly variable, and its effectiveness is debated. A procedural end point for RDN may improve the consistency of response. The objective of the current analysis was to assess the association between renal nerve stimulation (RNS)-induced BP increases before and after RDN and changes in ambulatory BP monitoring (ABPM) after RDN. Fourteen patients with drug-resistant hypertension referred for RDN were included. RNS was performed under general anesthesia at 4 sites in the right and left renal arteries, both before and immediately after RDN. RNS-induced BP changes were monitored and correlated with changes in ambulatory BP at a follow-up of 3 to 6 months after RDN. RNS resulted in a systolic BP increase of 50±27 mm Hg before RDN and of 13±16 mm Hg after RDN (P<0.001). Average systolic ABPM was 153±11 mm Hg before RDN and decreased to 137±10 mm Hg at 3- to 6-month follow-up (P=0.003). The change in RNS-induced BP increase before versus immediately after RDN and the change in ABPM before versus 3 to 6 months after RDN were correlated, both for systolic BP (R=0.77, P=0.001) and for diastolic BP (R=0.79, P=0.001). The RNS-induced maximum BP increase before RDN correlated with systolic (R=0.61, P=0.020) and diastolic (R=0.71, P=0.004) ABPM changes. Because RNS-induced BP changes before versus after RDN were correlated with changes in 24-hour ABPM 3 to 6 months after RDN, RNS should be tested as an acute end point to assess the efficacy of RDN and predict BP response.