36 research outputs found

    Program

    Scientific program of the new journal published on the Sorbonne Université portal, edited by L. Cugny, V. Caporaletti, and F. Araújo Costa: the "Revue d'études du Jazz et des Musiques Audiotactiles"

    Noninvasive detection of graft injury after heart transplant using donor-derived cell-free DNA: A prospective multicenter study

    Standardized donor-derived cell-free DNA (dd-cfDNA) testing has been introduced into clinical use to monitor kidney transplant recipients for rejection. This report describes the performance of this dd-cfDNA assay to detect allograft rejection in samples from heart transplant (HT) recipients undergoing surveillance monitoring across the United States. Venous blood was longitudinally sampled from 740 HT recipients from 26 centers and in a single-center cohort of 33 patients at high risk for antibody-mediated rejection (AMR). Plasma dd-cfDNA was quantified by targeted amplification and sequencing of a single nucleotide polymorphism panel. The dd-cfDNA levels were correlated to paired events of biopsy-based diagnosis of rejection. The median dd-cfDNA was 0.07% in reference HT recipients (2,164 samples) and 0.17% in samples classified as acute rejection (35 samples; P = .005). At a 0.2% threshold, dd-cfDNA had 44% sensitivity to detect rejection and a 97% negative predictive value. In the cohort at risk for AMR (11 samples), dd-cfDNA levels were elevated 3-fold in AMR compared with patients without AMR (99 samples; P = .004). The standardized dd-cfDNA test identified acute rejection in samples from a broad population of HT recipients. The reported test performance characteristics will guide the next stage of clinical utility studies of the dd-cfDNA assay.
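The operating characteristics reported above (44% sensitivity and a 97% negative predictive value at a 0.2% dd-cfDNA threshold) are plain confusion-matrix arithmetic. A minimal sketch of that calculation, with all counts hypothetical rather than taken from the study:

```python
def classify(dd_cfdna_pct, threshold=0.2):
    """Flag a sample as suspected rejection when dd-cfDNA exceeds the threshold."""
    return dd_cfdna_pct > threshold

def sensitivity(tp, fn):
    """Fraction of biopsy-proven rejection samples the assay flags."""
    return tp / (tp + fn)

def negative_predictive_value(tn, fn):
    """Fraction of below-threshold samples that truly have no rejection."""
    return tn / (tn + fn)
```

With hypothetical counts of 44 true positives and 56 false negatives, `sensitivity` returns 0.44; the high NPV reported above also reflects the low prevalence of rejection among surveillance samples.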

    Impact of left ventricular assist device implantation on mitral regurgitation: An analysis from the MOMENTUM 3 trial

    BACKGROUND: Mitral regurgitation (MR) determines pathophysiology and outcome in advanced heart failure. The impact of left ventricular assist device (LVAD) placement on clinically significant MR and its contribution to long-term outcomes has been sparsely evaluated. METHODS: We evaluated the effect of clinically significant MR on patients implanted in the MOMENTUM 3 trial with either the HeartMate II (HMII) or the HeartMate 3 (HM3) at 2 years. Clinical significance was defined as moderate or severe grade MR determined by site-based echocardiograms. RESULTS: Of 927 patients with LVAD implants without a prior or concomitant mitral valve procedure, 403 (43.5%) had clinically significant MR at baseline. At 1 month of support, residual MR was present in 6.2% of patients with HM3 and 14.3% of patients with HMII (relative risk = 0.43; 95% CI, 0.22-0.84; p = 0.01), with a low rate of worsening at 2 years. Residual MR at 1 month post-implant did not impact 2-year mortality for either the HM3 (hazard ratio [HR], 1.41; 95% CI, 0.52-3.89; p = 0.50) or HMII (HR, 0.91; 95% CI, 0.37-2.26; p = 0.84) LVAD. The presence or absence of baseline MR did not influence mortality (HM3 HR, 0.86; 95% CI, 0.56-1.33; p = 0.50; HMII HR, 0.81; 95% CI, 0.54-1.22; p = 0.32), major adverse events, or functional capacity. In multivariate analysis, severe baseline MR (p = 0.001), larger left ventricular dimension (p = 0.002), and implantation with the HMII instead of the HM3 LVAD (p = 0.05) were independently associated with an increased likelihood of persistent MR post-implant. CONCLUSIONS: Hemodynamic unloading after LVAD implantation improves clinically significant MR early, sustainably, and to a greater extent with the HM3 LVAD. Neither baseline nor residual MR influences outcomes after LVAD implantation.
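The residual-MR comparison above is a relative risk with a log-scale confidence interval. A sketch of the standard Wald calculation; the counts passed in below are hypothetical, not the trial's:

```python
import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of an event in group A vs. group B, with a 95% Wald CI
    computed on the log scale (the usual large-sample approximation)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper
```

An interval that excludes 1.0, as reported above (0.22-0.84), is what makes the halved residual-MR rate with the HM3 statistically significant.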

    Complete Hemodynamic Profiling With Pulmonary Artery Catheters in Cardiogenic Shock Is Associated With Lower In-Hospital Mortality

    OBJECTIVES: The purpose of this study was to investigate the association between obtaining hemodynamic data from early pulmonary artery catheter (PAC) placement and outcomes in cardiogenic shock (CS). BACKGROUND: Although PACs are used to guide CS management decisions, evidence supporting their optimal use in CS is lacking. METHODS: The Cardiogenic Shock Working Group (CSWG) collected retrospective data in CS patients from 8 tertiary care institutions from 2016 to 2019. Patients were divided by Society for Cardiovascular Angiography and Interventions (SCAI) stages, and outcomes were analyzed by PAC-use group (no PAC data, incomplete PAC data, complete PAC data) prior to initiating mechanical circulatory support (MCS). RESULTS: Of 1,414 patients with CS analyzed, 1,025 (72.5%) were male, and 494 (34.9%) presented with myocardial infarction; 758 (53.6%) were in SCAI Stage D shock, and 263 (18.6%) were in Stage C shock. Temporary MCS devices were used in 1,190 (84%) of those in advanced CS stages. PAC data were not obtained in 216 patients (18%) prior to MCS, whereas 598 patients (42%) had complete hemodynamic data. Mortality differed significantly between PAC-use groups within the overall cohort (p < 0.001) and within each SCAI stage subcohort (Stage C: p = 0.03; Stage D: p = 0.05; Stage E: p = 0.02). The complete PAC assessment group had the lowest in-hospital mortality of the three groups across all SCAI stages. Having no PAC assessment was associated with higher in-hospital mortality than complete PAC assessment in the overall cohort (adjusted odds ratio: 1.57; 95% confidence interval: 1.06 to 2.33). CONCLUSIONS: The CSWG is a large multicenter registry representing real-world patients with CS in the contemporary MCS era. Use of complete PAC-derived hemodynamic data prior to MCS initiation is associated with improved survival from CS.
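The adjusted odds ratio above comes from a multivariable model fit on the registry data. Its unadjusted analogue from a 2x2 table uses Woolf's log-scale interval, sketched here with hypothetical counts only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] (e.g., no-PAC vs.
    complete-PAC crossed with died vs. survived), with Woolf's 95% CI
    on the log scale."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se)
    upper = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lower, upper
```

The adjusted estimate reported above (1.57; 95% CI, 1.06-2.33) additionally conditions on baseline covariates, which the raw 2x2 calculation does not.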

    Development of Predictive Models for Continuous Flow Left Ventricular Assist Device Patients Using Bayesian Networks

    Background: Existing prognostic tools for patient selection for ventricular assist devices (VADs), such as the Destination Therapy Risk Score (DTRS) and the newly published HeartMate II Risk Score (HMRS), have limited predictive ability, especially with the current generation of continuous flow VADs (cfVADs). This study aims to use a modern machine learning approach, employing Bayesian Networks (BNs), which overcomes some of the limitations of traditional statistical methods. Methods: Retrospective data from 144 patients at Allegheny General Hospital and Integris Health System from 2007 to 2011 were analyzed. A total of 43 data elements were grouped into four sets: demographics, laboratory tests, hemodynamics, and medications. Patients were stratified by survival at 90 days post-LVAD. Results: The independent variables were ranked by predictive power and reduced to an optimal set of 10: hematocrit, aspartate aminotransferase, age, heart rate, transpulmonary gradient, mean pulmonary artery pressure, use of diuretics, platelet count, blood urea nitrogen, and hemoglobin. Two BNs, Naïve Bayes (NB) and Tree-Augmented Naïve Bayes (TAN), outperformed the DTRS in identifying low-risk patients (specificity: 91% and 93% vs. 78%) and outperformed HMRS predictions of high-risk patients (sensitivity: 80% and 60% vs. 25%). Both models also surpassed the DTRS and HMRS in accuracy (90% vs. 73% and 84%), kappa (NB: 0.56, TAN: 0.48 vs. DTRS: 0.14, HMRS: 0.22), and AUC (NB: 80%, TAN: 84% vs. DTRS: 59%, HMRS: 59%). Conclusion: The Bayesian Network models developed in this study consistently outperformed the DTRS and HMRS on all metrics. An added advantage is their intuitive graphical structure, which closely mimics natural reasoning patterns. This warrants further investigation with an expanded patient cohort and inclusion of adverse event outcomes.
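The kappa statistics compared above measure agreement beyond chance between predicted and observed 90-day survival. For a binary confusion matrix, Cohen's kappa reduces to a few lines (counts hypothetical):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for binary predictions vs. observed outcomes.

    tp/fp/fn/tn are confusion-matrix counts; kappa rescales observed
    agreement by the agreement expected from marginal frequencies alone."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    return (observed - expected) / (1 - expected)
```

Kappa is 0 at chance-level agreement and 1 at perfect agreement; on the commonly cited Landis-Koch scale, the NB model's 0.56 is "moderate" agreement while the DTRS's 0.14 is only "slight."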

    Do low-risk pulmonary arterial hypertension patients really benefit from upfront combination therapy? Insight from the AMBITION trial

    Background: Based on results of the Ambrisentan and Tadalafil in Patients with Pulmonary Arterial Hypertension (AMBITION) trial, upfront combination therapy is recommended for "low-risk" treatment-naive patients with pulmonary arterial hypertension (PAH). However, conflicting data exist on whether adopting this treatment strategy in this risk group is beneficial or well tolerated. Research question: Do patients with low-risk PAH really benefit from upfront combination therapy? Study design and methods: Using the data from the original AMBITION trial, patients with PAH were classified as low, intermediate, or high risk using the Registry to Evaluate Early and Long-term PAH Disease Management 2.0 (REVEAL 2.0) score and the Pulmonary Hypertension Outcomes and Risk Assessment (PHORA) tool. The primary end point was time to clinical worsening (TTCW; including death, hospitalization for PAH worsening, and disease progression) censored at 1 and 3 years post-enrollment. Side effects that led to withdrawal of treatment were also considered. Results: Patients with low-risk PAH categorized by REVEAL 2.0 and PHORA did not see a statistically significant benefit of upfront combination therapy vs monotherapy for TTCW at 1 and 3 years post-enrollment using Cox proportional hazards analysis (3-year hazard ratio [HR] of 0.40 [95% CI, 0.15-1.06; P = .07] and 0.55 [95% CI, 0.26-1.18; P = .12] for REVEAL 2.0 and PHORA, respectively) or considering TTCW or side effects (3-year HR of 0.75 [95% CI, 0.39-1.47; P = .4] and 0.87 [95% CI, 0.49-1.54; P = .63] for REVEAL 2.0 and PHORA). Patients with low-risk PAH on upfront combination therapy experienced a higher but not statistically significant incidence of side effects using REVEAL 2.0 and PHORA. In contrast, intermediate- or high-risk patients saw a statistically significant benefit of upfront combination therapy considering each of the end points regardless of side effects.
    Interpretation: This analysis suggests that some patients with low-risk PAH should perhaps be further stratified using other modalities before committing to upfront combination therapy, especially when the occurrence of side effects is considered. Further prospective data are needed to validate this hypothesis before changes to current guideline-directed therapy are contemplated.

    Impact of Female Sex on Cardiogenic Shock Outcomes: A Cardiogenic Shock Working Group Report

    BACKGROUND: Studies reporting cardiogenic shock (CS) outcomes in women are scarce. OBJECTIVES: The authors compared survival at discharge among women vs men with CS complicating acute myocardial infarction (AMI-CS) and heart failure (HF-CS). METHODS: The authors analyzed 5,083 CS patients in the Cardiogenic Shock Working Group. Propensity score matching (PSM) was performed with the use of baseline characteristics. Logistic regression was performed for log odds of survival. RESULTS: Among 5,083 patients, 1,522 were women (30%), whose mean age was 61.8 ± 15.8 years. AMI-CS was present in 30% of women and 29.1% of men (P = 0.03). More women presented with de novo HF-CS compared with men (26.2% vs 19.3%; P < 0.001). Before PSM, differences in baseline characteristics and sex-specific outcomes were seen in the HF-CS cohort, with worse survival at discharge (69.9% vs 74.4%; P = 0.009) and a higher rate of maximum Society for Cardiovascular Angiography and Interventions stage E (26% vs 21%; P = 0.04) in women than in men. Women were less likely to receive pulmonary artery catheterization (52.9% vs 54.6%; P < 0.001), heart transplantation (6.5% vs 10.3%; P < 0.001), or left ventricular assist device implantation (7.8% vs 10%; P = 0.01). Regardless of CS etiology, women had more vascular complications (8.8% vs 5.7%; P < 0.001), bleeding (7.1% vs 5.2%; P = 0.01), and limb ischemia (6.8% vs 4.5%; P = 0.001). More vascular complications persisted in women after PSM (10.4% women vs 7.4% men; P = 0.06). CONCLUSIONS: Women with HF-CS had worse outcomes and more vascular complications than men with HF-CS. More studies are needed to identify barriers to advanced therapies, decrease complications, and improve outcomes of women with CS.
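The propensity score matching step above pairs each woman with a man of similar estimated propensity before comparing outcomes. A greedy 1:1 nearest-neighbor matcher within a caliper is one common variant; the registry's exact algorithm is not stated, so treat this sketch as an illustrative assumption:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, controls: dicts mapping patient id -> propensity score.
    Returns (treated_id, control_id) pairs whose scores differ by at most
    `caliper`; each control is matched at most once."""
    available = dict(controls)
    pairs = []
    for tid, score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # nearest remaining control by absolute score distance
        cid = min(available, key=lambda c: abs(available[c] - score))
        if abs(available[cid] - score) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs
```

Matching on the propensity score balances the measured baseline covariates between the sexes, so the residual excess of vascular complications after PSM is less attributable to those covariates.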

    Long-Term Survival on LVAD Support: Limitations Driven by Development of Device Complications and End-Organ Dysfunction

    Introduction: Survival is nearly 50% after 5 years of LVAD support. While preop variables can predict short-term (ST) survival, correlates of long-term (LT) survival remain poorly characterized. Hypothesis: We hypothesize that preop risk stratification will be limited to predicting ST survival and not LT success. Method: Patients (n=16474) undergoing LVAD implant (2012-18) in Intermacs-STS were grouped according to time on support: ST (<1 year, n=4468), mid-term (MT, 1-3 years, n=8991), and LT (≥3 years, n=3015). Separate multiphase hazard analyses were performed to identify correlates of LT survival in those alive and on LVAD support at 1 and 3 years (Ys). Results: Of those alive on LVAD support at 1 Y, the 3, 5, and 6 Y survivals were 75%, 53%, and 45%, respectively. Patients who were alive on LVAD support at 3 Ys had a survival of 60% at 6 Ys. The table shows adjusted associations between clinical variables and mortality for the MT and LT survival groups starting at 1 and 3 Ys, respectively. For the MT group, older, obese, and Caucasian patients and those with preop RV dysfunction, active smoking, unmarried status, or comorbidities had higher mortality after 1 Y of support. The occurrence of postop malnutrition and renal and hepatic dysfunction also increased mortality. Finally, each episode of stroke, device infection, or device malfunction increased mortality by 13-42%. For the LT group, postop organ dysfunction and malnutrition impacted extended LVAD survival, and mortality increased by 10-46% per adverse event. The only preop correlates of survival beyond 3 Ys were older age, Caucasian race, and history of CABG. Conclusion: The success of LVAD support hinges on achieving LT survival. In operative survivors, LT LVAD survival is heavily constrained by the occurrence of events after LVAD placement. This also limits our ability to provide individualized LT survival estimates from preop data alone.

    Long-term survival on LVAD support: Device complications and end-organ dysfunction limit long-term success

    BACKGROUND: Preoperative variables can predict short-term left ventricular assist device (LVAD) survival, but predictors of extended survival remain insufficiently characterized. METHOD: Patients undergoing LVAD implant (2012-2018) in the Intermacs registry were grouped according to time on support: short-term (<1 year, n = 7,483), mid-term (MT, 1-3 years, n = 5,976), and long-term (LT, ≥3 years, n = 3,015). Landmarked hazard analyses (adjusted hazard ratio, HR) were performed to identify correlates of survival after 1 and 3 years of support. RESULTS: After surviving 1 year of support, additional LVAD survival was less likely in older (HR 1.15 per decade), Caucasian (HR 1.22), and unmarried (HR 1.16) patients (p < 0.05). After 3 years of support, only 3 preoperative characteristics (age, race, and history of bypass surgery, p < 0.05) correlated with extended survival. Postoperative events most negatively influenced achieving LT survival. In those alive at 1 year or 3 years, the occurrence of postoperative renal dysfunction (creatinine MT HR = 1.09; LT HR = 1.10 per mg/dl), hepatic dysfunction (AST MT HR = 1.29; LT HR = 1.34 per 100 IU), stroke (MT HR = 1.24; LT HR = 1.42), infection (MT HR = 1.13; LT HR = 1.10), and/or device malfunction (MT HR = 1.22; LT HR = 1.46) reduced extended survival (all p ≤ 0.03). CONCLUSIONS: Success with LVAD therapy hinges on achieving long-term survival in more recipients. After 1 year, extended survival is heavily constrained by the occurrence of adverse events and postoperative end-organ dysfunction. The growth of destination therapy intent mandates that future LVAD studies be designed with follow-up sufficient for capturing outcomes beyond 24 months.
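Per-unit hazard ratios such as 1.15 per decade of age compound multiplicatively under the proportional-hazards assumption, so a multi-unit covariate change scales the hazard exponentially. A one-function sketch of that conversion:

```python
import math

def scaled_hr(hr_per_unit, units):
    """Hazard ratio implied by a covariate change of `units`, assuming
    proportional hazards (equivalent to hr_per_unit ** units)."""
    return math.exp(units * math.log(hr_per_unit))
```

An HR of 1.15 per decade thus implies roughly 1.32 over a two-decade age difference.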