
    Cardiac magnetic resonance imaging derived quantification of myocardial ischemia and scar improves risk stratification and patient management in stable coronary artery disease

    Background: Quantification of myocardial ischemia and necrosis might improve prognostic models and lead to better patient management. However, no standardized consensus on how to assess and quantify these parameters has been established. The aim of this study was to quantify these variables by cardiac magnetic resonance imaging (CMR) and to establish possible incremental implications in cardiovascular risk prediction. Methods: This study is a retrospective analysis of patients with known or suspected coronary artery disease (CAD) referred for adenosine perfusion CMR. Myocardial ischemia and necrosis were assessed and quantified using an algorithm based on standard first-pass perfusion imaging and late gadolinium enhancement (LGE). The combined primary endpoint was defined as cardiac death, non-fatal myocardial infarction, and stroke. Results: 845 consecutive patients were enrolled in the study. During the median follow-up of 3.64 [1.03; 10.46] years, 61 primary endpoints occurred. Patients with a primary endpoint showed a larger extent of ischemia (10.7 ± 12.25% vs. 3.73 ± 8.29%, p < 0.0001) and LGE (21.09 ± 15.11% vs. 17.73 ± 10.72%, p < 0.0001). A risk prediction model containing the extent of ischemia and LGE proved superior to all other models (χ² increase: from 39.678 to 56.676, integrated discrimination index: 0.3851, p = 0.0033, net reclassification index: 0.11516, p = 0.0071). The beneficial effect of revascularization tended to be higher in patients with greater extents of ischemia, though statistical significance was not reached. Conclusions: Quantification of myocardial ischemia and LGE was shown to significantly improve existing risk prediction models and might thus lead to an improvement in patient management.
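
    The incremental value reported above is quantified with reclassification metrics. As a rough illustration of how such a metric is computed (not the authors' actual analysis code), the sketch below calculates a categorical net reclassification index (NRI) from baseline and extended model risk predictions; the risk-category cut-offs, variable names, and toy data are hypothetical.

        import numpy as np

        def categorical_nri(risk_base, risk_ext, events, cutoffs=(0.05, 0.20)):
            """Categorical net reclassification index (NRI).

            risk_base, risk_ext : predicted risks from the baseline and extended models
            events              : 1 = primary endpoint occurred, 0 = event-free
            cutoffs             : hypothetical risk-category boundaries
            """
            base_cat = np.digitize(risk_base, cutoffs)
            ext_cat = np.digitize(risk_ext, cutoffs)
            up, down = ext_cat > base_cat, ext_cat < base_cat
            ev, ne = events == 1, events == 0
            nri_events = up[ev].mean() - down[ev].mean()       # net upward moves among events
            nri_nonevents = down[ne].mean() - up[ne].mean()    # net downward moves among non-events
            return nri_events + nri_nonevents

        # toy example with made-up risks: the extended model separates groups slightly better
        rng = np.random.default_rng(0)
        y = rng.integers(0, 2, 200)
        base = np.clip(0.1 + 0.1 * y + rng.normal(0, 0.05, 200), 0, 1)
        ext = np.clip(base + 0.05 * (2 * y - 1), 0, 1)
        print(f"NRI: {categorical_nri(base, ext, y):.3f}")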

    Myocardial strain characteristics and outcomes after transcatheter aortic valve replacement

    Background: The objective of this study was to assess standard functional and deformation parameters (strain) in patients after transcatheter aortic valve replacement (TAVR) by cardiac magnetic resonance imaging (CMR) and to evaluate their prognostic impact. Methods: Patients undergoing TAVR received CMR on a 1.5 T whole-body scanner 3 months after the procedure. Deformation parameters (strain, strain rate, velocity, displacement) were assessed in longitudinal, circumferential and radial orientation using a feature tracking approach. The primary outcome measure was defined according to Valve Academic Research Consortium-2 (VARC-2) criteria. Results: Eighty-three patients formed the study population. Deformation parameters were significantly reduced in all three orientations for strain (longitudinal: –12.1 ± 5.4% vs. –15.9 ± 1.96%, p < 0.0001; radial: 34.4 ± 15.3% vs. 47.2 ± 11.4%, p < 0.0001; circumferential: –16.8 ± 4.3% vs. –21.1 ± 2.5%, p < 0.0001) and strain rate (longitudinal: –0.79 ± 0.33%/s vs. –0.91 ± 0.23%/s, p = 0.043; radial: 2.5 ± 1.2%/s vs. 2.9 ± 0.9%/s, p = 0.067; circumferential: –1.1 ± 0.6%/s vs. –1.3 ± 0.3%/s, p = 0.006) in comparison to a healthy control population. Median follow-up was 614 days. During this period, 13 endpoints occurred (cumulative event rate of 10.7%). Patients with an event tended to exhibit poorer strain and strain rate in longitudinal and radial orientation, without reaching statistical significance (longitudinal strain: –11.2 ± 5.4% vs. –12.3 ± 5.4%, p = 0.52; longitudinal strain rate: –0.73 ± 0.23%/s vs. –0.80 ± 0.35%/s, p = 0.53; radial strain: 29.5 ± 19.6% vs. 35.2 ± 14.5%, p = 0.24; radial strain rate: 2.2 ± 1.6%/s vs. 2.6 ± 1.2%/s, p = 0.31). Conclusions: Assessment of left ventricular deformation parameters by CMR revealed functional abnormalities in comparison to healthy controls. Their prognostic significance remains to be further investigated.
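
    For readers unfamiliar with feature-tracking metrics, the sketch below illustrates the underlying Lagrangian strain and strain-rate definitions applied to tracked myocardial contour lengths; the contour data and frame timing are hypothetical, and the actual feature-tracking software used in the study is not reproduced here.

        import numpy as np

        def lagrangian_strain(lengths):
            """Percent strain per frame: (L - L0) / L0 * 100, with L0 the end-diastolic length."""
            lengths = np.asarray(lengths, dtype=float)
            return (lengths - lengths[0]) / lengths[0] * 100.0

        def strain_rate(strain_percent, dt):
            """Frame-to-frame strain rate in %/s for a fixed frame interval dt (seconds)."""
            return np.gradient(strain_percent, dt)

        # hypothetical longitudinal contour lengths (mm) over one cardiac cycle, 25 frames
        lengths = 95 - 12 * np.sin(np.linspace(0, np.pi, 25))
        strain = lagrangian_strain(lengths)        # negative values indicate longitudinal shortening
        rate = strain_rate(strain, dt=0.033)       # ~30 frames/s, hypothetical
        print(f"peak longitudinal strain: {strain.min():.1f} %")
        print(f"peak systolic strain rate: {rate.min():.2f} %/s")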

    Effectiveness of radiation protection systems in the cardiac catheterization laboratory: a comparative study

    BACKGROUND As the number and complexity of percutaneous coronary interventions are constantly increasing, optimal radiation protection is required to ensure operator safety. Suspended radiation protection systems (SRPS) and protective scatter-radiation absorbing drapes (PAD) are novel methods to mitigate fluoroscopic scattered radiation exposure. The aim of the study was to investigate the effectiveness regarding radiation protection of a SRPS and a PAD in comparison with conventional protection. METHODS A total of 229 cardiac catheterization procedures with SRPS (N = 73), PAD (N = 82) and standard radiation protection (N = 74) were prospectively included. Real-time dosimeter data were collected from the first operator and the assistant. Endpoints were the cumulative operator exposure relative to the dose area product [standardized operator exposure (SOE)] for the first operator and the assistant. RESULTS For the first operator, the SRPS and the PAD significantly decreased the overall SOE compared to conventional shielding by 93.9% and 66.4%, respectively (P < 0.001). The protective effect of the SRPS was significantly higher compared to the PAD (P < 0.001). For the assistant, the SRPS and the PAD provided a statistically non-significant reduction in the overall SOE compared to conventional shielding of 38.0% and 30.6%, respectively. CONCLUSIONS The SRPS and the PAD enhance radiation protection significantly compared to conventional protection. In most clinical scenarios, the protective effect of the SRPS is significantly higher than the additional protection provided by the PAD.
    (Figure: Comparison of the additional radiation protection provided by protective scatter-radiation absorbing drapes (PAD) and the suspended radiation protection system (SRPS) over standard protection with lead aprons.)
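
    The standardized operator exposure used as endpoint is a simple ratio, and the percentage reductions above follow from comparing group values against conventional shielding. A minimal sketch of that arithmetic, with hypothetical dose readings and units, might look as follows:

        def soe(operator_dose_uSv, dose_area_product_Gycm2):
            """Standardized operator exposure: cumulative operator dose normalized to the
            dose area product of the procedure (values and units here are hypothetical)."""
            return operator_dose_uSv / dose_area_product_Gycm2

        def relative_reduction(soe_protected, soe_conventional):
            """Percent reduction of SOE versus conventional shielding."""
            return (1 - soe_protected / soe_conventional) * 100.0

        # hypothetical per-procedure readings
        conventional = soe(operator_dose_uSv=49.0, dose_area_product_Gycm2=3500.0)
        with_srps = soe(operator_dose_uSv=3.0, dose_area_product_Gycm2=3500.0)
        print(f"SOE reduction with SRPS: {relative_reduction(with_srps, conventional):.1f} %")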

    Coronary microvascular dysfunction in Takotsubo syndrome: an analysis using angiography-derived index of microcirculatory resistance

    BACKGROUND Coronary microvascular dysfunction (CMD) has been proposed as a crucial factor in the pathophysiology of Takotsubo syndrome (TTS). The angiography-derived index of microcirculatory resistance (caIMR) offers an alternative to conventional hyperemic wire-based IMR to assess CMD. We aimed to evaluate the prevalence, transience, and impact of CMD on in-hospital outcomes in TTS. METHODS All three coronary arteries of 96 patients with TTS were assessed for the coronary angiography-derived index of microcirculatory resistance (caIMR) and compared to non-obstructed vessels of matched patients with ST-elevation myocardial infarction. Further, the association between caIMR and the TTS-specific combined in-hospital endpoint of death, cardiac arrest, ventricular arrhythmogenic events and cardiogenic shock was investigated. RESULTS Elevated caIMR was present in all TTS patients, with significantly elevated caIMR values in all coronary arteries compared to controls. CaIMR did not differ between apical and midventricular TTS types. CaIMR normalized in TTS patients with follow-up angiographies performed at a median of 28 months (median caIMR at event vs follow-up: LAD 34.8 [29.9-41.1] vs 20.3 [16.0-25.3], p < 0.001; LCX: 38.7 [32.9-50.1] vs 23.7 [19.4-30.5], p < 0.001; RCA: 31.7 [25.0-39.1] vs 19.6 [17.1-24.0], p < 0.001). The extent of caIMR elevation significantly correlated with the combined in-hospital endpoint (p = 0.036). CONCLUSION TTS patients had evidence of elevated caIMR in at least one coronary artery, with a trend towards higher LAD caIMR in apical type TTS and normalization after recovery. Furthermore, the extent of caIMR elevation was associated with an increased risk of in-hospital MACE in TTS patients.
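
    caIMR is intended as an adenosine-free, wire-free surrogate of the classical thermodilution-derived IMR. As background only (this is not the computational flow model used in the study), the conventional wire-based index is distal coronary pressure multiplied by mean hyperemic transit time; a minimal sketch with hypothetical measurements:

        def wire_based_imr(p_distal_mmHg, mean_transit_time_s):
            """Classical index of microcirculatory resistance: IMR = Pd x Tmn during hyperemia.
            caIMR approximates this quantity from angiographic flow modelling instead of a
            pressure wire; that proprietary model is not reproduced here."""
            return p_distal_mmHg * mean_transit_time_s

        # hypothetical hyperemic measurements in the LAD
        print(f"IMR: {wire_based_imr(p_distal_mmHg=72.0, mean_transit_time_s=0.48):.1f}")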

    Treatment of Heavily Calcified Coronary Lesions (Behandlung schwer verkalkter KoronarlÀsionen)

    Abstract. In Switzerland and other industrialized nations, coronary heart disease (CHD) is the most common cause of death in adulthood. CHD is a chronic disease in which stenoses of the epicardial coronary arteries usually cause a deficit in blood supply to the heart muscle tissue, which can lead to chest pain, myocardial infarction, heart failure or cardiac arrhythmia and ultimately to significant morbidity and mortality. Since the first percutaneous coronary intervention (PCI) on 16th September 1977 at the University Hospital of Zurich by Andreas GrĂŒntzig, the field of interventional cardiology has seen remarkable progress in the treatment of coronary artery disease, especially with the development and evolution of coronary stents. Nonetheless, calcified coronary stenoses pose a challenge in everyday interventional practice because they can prevent stent implantation or proper stent expansion and are associated with a higher rate of complications. Unfortunately, to date, there are no established interventions to prevent calcification of the coronary arteries. However, there are some therapeutic approaches that allow PCI in calcified vessels, and these are the focus of this work.

    Magnetic resonance Adenosine perfusion imaging as Gatekeeper of invasive coronary intervention (MAGnet): study protocol for a randomized controlled trial

    Background: Current guidelines for the diagnosis and management of patients with stable coronary artery disease (CAD) recommend functional stress testing for risk stratification prior to revascularization procedures. Cardiac magnetic resonance imaging (CMR) is a modality of choice for stress testing because of its capability to detect myocardial ischemia sensitively and specifically. Nevertheless, evidence from randomized trials evaluating CMR-based management of stable CAD patients in comparison to the more common angiography-based approach is still limited. Methods/design: Patients presenting with symptoms suggestive of stable CAD and a class I or IIa indication for diagnostic coronary angiography are prospectively screened and enrolled in the study. All subjects receive a basic cardiological work-up and guideline-directed medical therapy. Patients are randomized 1:1 into two groups. Patients in group 1 undergo diagnostic coronary angiography and subsequent revascularization according to current guidelines. Subjects in group 2 undergo adenosine stress CMR and, in case of myocardial ischemia, are referred for coronary angiography. Follow-up is planned for 3 years. During this time, the number of primary endpoints (defined as cardiac death and non-fatal myocardial infarction) and unplanned invasive procedures will be documented. Furthermore, symptom burden and quality of life will be assessed using the Seattle Angina Questionnaire. The sample size is calculated to demonstrate non-inferiority of the CMR-based approach. Discussion: If this study accomplishes its aim of demonstrating non-inferiority of CMR-based management in patients with stable CAD, the importance of this emerging modality may further increase. Trial registration: ClinicalTrials.gov, identifier: NCT02580851. Registered on 14 October 2015. Unique Protocol ID: 237/1
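
    The protocol's sample-size statement rests on a standard non-inferiority calculation for two event rates. Since the abstract does not give the assumed rates, margin, or error levels, the sketch below uses hypothetical values purely to illustrate the usual formula n per group = (z_(1-alpha) + z_(1-beta))^2 * [p1(1-p1) + p2(1-p2)] / (p1 - p2 + margin)^2:

        from math import ceil
        from scipy.stats import norm

        def n_per_group_noninferiority(p_ctrl, p_exp, margin, alpha=0.025, power=0.80):
            """Sample size per group for a non-inferiority comparison of two proportions.
            margin: largest acceptable excess event rate in the experimental arm.
            alpha is one-sided. All parameter values used below are hypothetical."""
            z_a = norm.ppf(1 - alpha)
            z_b = norm.ppf(power)
            variance = p_ctrl * (1 - p_ctrl) + p_exp * (1 - p_exp)
            effect = p_ctrl - p_exp + margin   # distance from the non-inferiority boundary
            return ceil((z_a + z_b) ** 2 * variance / effect ** 2)

        # hypothetical assumptions: 8% 3-year event rate in both arms, 5% absolute margin
        print(n_per_group_noninferiority(p_ctrl=0.08, p_exp=0.08, margin=0.05))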

    Reliability and Utility of Various Methods for Evaluation of Bone Union after Anterior Cervical Discectomy and Fusion

    Most surgical procedures performed on account of degenerative disease of the cervical spine involve a discectomy and interbody fixation. Bone fusion at the implant placement site is evaluated post-operatively. Computed tomography (CT) is generally agreed to be the best modality for assessing bone union. We evaluated the results obtained with various methods based solely on conventional radiographs in the same group of patients and compared them with results obtained using a combination of CT and conventional radiography, which we considered the most precise approach and used as the reference method. We operated on a total of 170 disc spaces in a group of 104 patients. Fusion was evaluated at 12 months after surgery with five different and popular classifications based on conventional radiographs and then compared with the reference method. Statistical analyses of test accuracy produced the following ranking of fusion assessment methods with regard to the degree of consistency with the reference method, in descending order: (1) bone bridging is visible on the anterior and/or posterior edge of the operated disc space on a lateral radiograph; (2) change in the value of Cobb’s angle for a motion segment on flexion vs. extension radiographs (threshold for fusion vs. pseudoarthrosis is 2°); (3) change in the interspinous distance between process tips on flexion vs. extension radiographs (threshold of 2 mm); (4) change in the value of Cobb’s angle of a motion segment (threshold of 4°); (5) change in the interspinous distance between process bases on flexion vs. extension radiographs (threshold of 2 mm). When bone union is evaluated on the basis of radiographs, without CT evidence, we suggest using the “bone bridging” criterion as the most reliable commonly used approach to assessing bone union.
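
    As an illustration of how such threshold-based criteria are applied in practice (the thresholds follow the abstract, while the function names and measurements are hypothetical):

        def classify_fusion_cobb(cobb_flexion_deg, cobb_extension_deg, threshold_deg=2.0):
            """Criterion (2) above: a motion segment whose Cobb angle changes by less than the
            threshold between flexion and extension radiographs is classified as fused."""
            change = abs(cobb_flexion_deg - cobb_extension_deg)
            return "fusion" if change < threshold_deg else "pseudoarthrosis"

        def classify_fusion_interspinous(dist_flexion_mm, dist_extension_mm, threshold_mm=2.0):
            """Criteria (3)/(5): a change in interspinous distance below the threshold indicates fusion."""
            change = abs(dist_flexion_mm - dist_extension_mm)
            return "fusion" if change < threshold_mm else "pseudoarthrosis"

        # hypothetical measurements at one operated level, 12 months post-operatively
        print(classify_fusion_cobb(cobb_flexion_deg=11.5, cobb_extension_deg=12.8))
        print(classify_fusion_interspinous(dist_flexion_mm=31.0, dist_extension_mm=34.2))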

    Potential utility of urinary chemokine CCL2 to creatinine ratio in prognosis of 5‐year graft failure and mortality post 1‐year protocol biopsy in kidney transplant recipients

    Background: Chemokines (chemotactic cytokines) are small proteins which are engaged in many pathophysiological processes, including inflammation and homeostasis. In recent years, the application of chemokines in transplant medicine has been intensively studied. The aim of this study was to determine the utility of the urinary chemokines CCL2 (C-C motif ligand 2) and CXCL10 (C-X-C motif chemokine ligand 10) in the prognosis of 5-year graft failure and mortality after a 1-year protocol biopsy in renal transplant recipients. Methods: Forty patients who had a protocol biopsy 1 year after renal transplantation were included. Urinary concentrations of CCL2 and CXCL10 were measured and normalized to urine creatinine. All patients were under the supervision of one transplant center. Long-term outcomes within 5 years after the 1-year posttransplant biopsy were analyzed. Results: Urinary CCL2:Cr at the time of biopsy was significantly increased in patients who died or had graft failure. CCL2:Cr was shown to be a significant predictor of 5-year graft failure and mortality (odds ratio [OR]: 1.09, 95% confidence interval [CI]: 1.02–1.19, p = .02; OR: 1.08, 95% CI: 1.02–1.16, p = .04, respectively). Conclusion: Chemokines are easily detected by current methods. In the era of personalized medicine, urinary CCL2:Cr can be considered a factor providing complementary information regarding the risk of graft failure or increased mortality.
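
    The odds ratios above are per-unit estimates from a logistic model of the urinary chemokine-to-creatinine ratio. A minimal sketch of that kind of analysis (the data, column names, and outcome model are hypothetical; this is not the authors' code or data):

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # hypothetical cohort: urinary CCL2 (pg/mL), urine creatinine (mg/dL), 5-year outcome
        rng = np.random.default_rng(1)
        df = pd.DataFrame({
            "ccl2_pg_ml": rng.gamma(shape=2.0, scale=60.0, size=40),
            "creatinine_mg_dl": rng.normal(loc=100.0, scale=20.0, size=40),
        })
        df["ccl2_cr_ratio"] = df["ccl2_pg_ml"] / df["creatinine_mg_dl"]
        # outcome loosely linked to the ratio, for illustration only
        p_event = 1 / (1 + np.exp(2.0 - df["ccl2_cr_ratio"]))
        df["graft_failure_or_death"] = (rng.random(40) < p_event).astype(int)

        # logistic regression: odds ratio per unit increase in CCL2:Cr
        model = sm.Logit(df["graft_failure_or_death"],
                         sm.add_constant(df[["ccl2_cr_ratio"]])).fit(disp=0)
        odds_ratio = np.exp(model.params["ccl2_cr_ratio"])
        ci_low, ci_high = np.exp(model.conf_int().loc["ccl2_cr_ratio"])
        print(f"OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")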