1,158 research outputs found

    Depression as another possible explanation for worse outcomes in myocardial infarction during off-hours


    Ten Year Real World Experience with Ultrafiltration for the Management of Acute Decompensated Heart Failure

    Background: Randomized controlled trials (RCTs) of ultrafiltration (UF) have demonstrated conflicting results regarding its efficacy and safety. Objective: We reviewed 10 years of data for adjustable UF during heart failure (HF) hospitalizations in a real-world cohort. Methods: We performed a retrospective, single-center analysis of 335 consecutive patients treated with adjustable-rate UF using the CHF Solutions Aquadex Flex Flo System from 2009 to 2019. Results: Compared to previous RCTs investigating UF, our cohort was older, with worse renal impairment and more antecedent HF hospitalizations in the year preceding therapy. Mean fluid removal with UF was 14.6 L. Mean weight loss with UF was 15.6 lbs (range 0.2-57 lbs) and was sustained at 1-2 week follow-up. Mean creatinine change upon stopping UF, at discharge, and at follow-up (mean 30 days) was +0.11 mg/dl, +0.07 mg/dl, and +0.11 mg/dl, respectively. HF rehospitalizations at 30 days, 90 days, and 1 year were 12.4%, 14.9%, and 27.3%, respectively. On average, patients had 1.74 fewer hospitalizations for HF in the year following UF compared with the 12 months preceding UF. Major bleeding, defined as requiring discontinuation of anticoagulation, occurred in 3.6% of patients. Conclusions: Compared with previous UF trials, our study demonstrates that UF compares favorably for HF rehospitalizations, renal function response, and weight/volume loss. Importantly, our real-world experience allowed for adjustment of the UF rate during therapy, and we believe this is a major contributor to our favorable outcomes. In clinical practice, UF can be a safe and effective strategy for decongestion.

    Risk Assessment and Comparative Effectiveness of Left Ventricular Assist Device and Medical Management in Ambulatory Heart Failure Patients: The ROADMAP Study 2-Year Results

    OBJECTIVES The authors sought to provide the pre-specified primary endpoint of the ROADMAP (Risk Assessment and Comparative Effectiveness of Left Ventricular Assist Device and Medical Management in Ambulatory Heart Failure Patients) trial at 2 years. BACKGROUND The ROADMAP trial was a prospective, nonrandomized observational study of 200 patients (97 with a left ventricular assist device [LVAD], 103 on optimal medical management [OMM]) that showed that survival with improved functional status at 1 year was better with LVADs compared with OMM in a population of ambulatory New York Heart Association functional class IIIb/IV patients. METHODS The primary composite endpoint was survival on original therapy with improvement in 6-min walk distance ≥ 75 m. RESULTS Patients receiving LVAD versus OMM had lower baseline health-related quality of life, reduced Seattle Heart Failure Model 1-year survival (78% vs. 84%; p = 0.012), and were predominantly INTERMACS (Interagency Registry for Mechanically Assisted Circulatory Support) profile 4 (65% vs. 34%; p < 0.001) versus profiles 5 to 7. More LVAD patients met the primary endpoint at 2 years: 30% LVAD versus 12% OMM (odds ratio: 3.2 [95% confidence interval: 1.3 to 7.7]; p = 0.012). Survival as treated on original therapy at 2 years was greater for LVAD versus OMM (70 ± 5% vs. 41 ± 5%; p < 0.001), but there was no difference in intent-to-treat survival (70 ± 5% vs. 63 ± 5%; p = 0.307). In the OMM arm, 23 of 103 patients (22%) received delayed LVADs (18 within 12 months; 5 from 12 to 24 months). LVAD adverse events declined after year 1 for bleeding (primarily gastrointestinal) and arrhythmias. CONCLUSIONS Survival on original therapy with improvement in 6-min walk distance was superior with LVAD compared with OMM at 2 years. Reduction in key adverse events beyond 1 year was observed in the LVAD group. The ROADMAP trial provides risk-benefit information to guide patient- and physician-shared decision making for elective LVAD therapy as a treatment for heart failure. (Risk Assessment and Comparative Effectiveness of Left Ventricular Assist Device and Medical Management in Ambulatory Heart Failure Patients [ROADMAP]; NCT01452802)
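A minimal sketch of how the primary-endpoint odds ratio above can be approximately reproduced from the reported percentages. The responder counts are assumptions, back-calculated by rounding 30% of 97 LVAD patients and 12% of 103 OMM patients; the crude Wald interval computed here need not match the published (possibly adjusted) 1.3 to 7.7 interval.

```python
import math

def odds_ratio_with_ci(a, b, c, d, z=1.96):
    """2x2 table: a/b = events/non-events in group 1, c/d = events/non-events
    in group 2. Returns (OR, lower, upper) with a Wald confidence interval."""
    or_ = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Assumed counts: ~29 of 97 LVAD patients and ~12 of 103 OMM patients
# met the primary endpoint at 2 years.
or_, lower, upper = odds_ratio_with_ci(29, 97 - 29, 12, 103 - 12)
print(f"OR = {or_:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```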

    Outcomes of open mitral valve replacement versus transcatheter mitral valve repair: insights from the National Inpatient Sample Database

    Background: Transcatheter mitral valve repair and replacement (TMVR) is a minimally invasive alternative to conventional open-heart mitral valve replacement (OMVR). The present study aims to compare the burden, demographics, cost, and complications of TMVR and OMVR. Methods: The United States National Inpatient Sample (US-NIS) for the year 2017 was queried to identify all cases of TMVR and OMVR. Categorical and continuous data were analyzed using Pearson chi-square and independent t-test analysis, respectively. An adjusted odds ratio (aOR) based on an ordinal logistic regression (OLR) model was calculated to determine the association between outcome variables. Results: Of 19,580 patients, 18,460 (94%) underwent OMVR and 1,120 (6%) TMVR. Mean ages were 63 ± 14 years (OMVR) and 67 ± 13 years (TMVR). Both cohorts were predominantly Caucasian (73% OMVR vs. 74% TMVR). Patients who underwent TMVR were more likely to belong to a household with an income in the highest quartile (26.1% vs. 22.0% for OMVR) versus the lowest quartile (22.1% vs. 27.8%). The average number of days from admission to procedure was lower for TMVR than for OMVR (2.63 vs. 3.02 days, p = 0.015). In-hospital length of stay (LOS) was significantly lower for TMVR compared to OMVR (11.56 vs. 14.01 days, p < 0.0001). Adjusted in-hospital mortality, taking comorbidities into account, showed no significant difference between the two groups (OR 1.2, 95% CI 0.93-1.68, p = 0.15). Conclusion: Patients undergoing TMVR were older and more financially affluent. TMVR was more costly but was associated with a shorter hospital stay and mortality similar to OMVR.

    Lymphocyte-to-C-Reactive Protein Ratio: A Novel Predictor of Adverse Outcomes in COVID-19.

    Background: Systemic inflammation elicited by a cytokine storm is considered a hallmark of coronavirus disease 2019 (COVID-19). This study aims to assess the validity and clinical utility of the lymphocyte-to-C-reactive protein (CRP) ratio (LCR), typically used for gastric carcinoma prognostication, versus the neutrophil-to-lymphocyte ratio (NLR) for predicting in-hospital outcomes in COVID-19. Methods: A retrospective cohort study was performed to determine the association of LCR and NLR with the need for invasive mechanical ventilation (IMV), dialysis, upgrade to an intensive care unit (ICU), and mortality. Results: The mean age was 63.6 versus 61.6 years for the NLR groups and 62.6 versus 63.7 years for the LCR groups, respectively. Baseline comorbidities were comparable across all groups, except that the higher-LCR group had a female predominance. The mean NLR was significantly higher for patients who died during hospitalization (19 vs. 7, P ≤ 0.001) and for those requiring IMV (12 vs. 7, P = 0.01). Compared with survivors, a significantly lower mean LCR was observed in patients who did not survive hospitalization (1,011 vs. 632, P = 0.04). For patients with a higher NLR (> 10), the unadjusted odds of mortality (odds ratio (OR) 11.0, 95% CI 3.6 - 33.0, P < 0.0001) and need for IMV (OR 3.3, 95% CI 1.4 - 7.7, P = 0.008) were significantly higher compared to patients with a lower NLR. By contrast, for patients with a lower LCR (< 100), the odds of in-hospital all-cause mortality were significantly higher compared to patients with a higher LCR (OR 0.2, 95% CI 0.06 - 0.47, P = 0.001). The aORs, controlled for baseline comorbidities and medications, mirrored the overall results, indicating a genuinely significant correlation between these biomarkers and outcomes. Conclusions: A high NLR and a decreased LCR predict higher odds of in-hospital mortality. A high NLR at presentation might indicate impending clinical deterioration and the need for IMV.
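A minimal sketch of the two biomarkers compared above. The risk thresholds (NLR > 10, LCR < 100) are taken from the abstract; units for LCR vary across studies, so this sketch assumes lymphocytes in cells/µL and CRP in mg/L, and the function names are illustrative, not from the paper.

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio (both counts in the same units)."""
    return neutrophils / lymphocytes

def lcr(lymphocytes_per_ul, crp_mg_l):
    """Lymphocyte-to-C-reactive-protein ratio (assumed units: cells/uL over mg/L)."""
    return lymphocytes_per_ul / crp_mg_l

def high_risk(neut, lymph, lymph_per_ul, crp):
    """Flag patients crossing either adverse-outcome threshold from the abstract."""
    return nlr(neut, lymph) > 10 or lcr(lymph_per_ul, crp) < 100

# Example: severe neutrophilia with lymphopenia and elevated CRP
print(high_risk(8.0, 0.5, 500, 80))  # NLR 16, LCR 6.25 -> True
```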

    Movement of Walleyes in Lakes Erie and St. Clair Inferred from Tag Return and Fisheries Data

    Lake Erie walleyes Sander vitreus support important fisheries and have been managed as one stock, although preliminary tag return and genetic analyses suggest the presence of multiple stocks that migrate among basins within Lake Erie and into other portions of the Great Lakes. We examined temporal and spatial movement and abundance patterns of walleye stocks in the three basins of Lake Erie and in Lake St. Clair with the use of tag return and sport and commercial catch-per-unit-effort (CPUE) data from 1990 to 2001. Based on summer tag returns, western basin walleyes migrated to the central and eastern basins of Lake Erie and to Lake St. Clair and southern Lake Huron, while fish in the central and eastern basins of Lake Erie and in Lake St. Clair were primarily caught within the basins where they were tagged. Seasonal changes in sport and commercial effort and CPUE in Lake Erie confirmed the walleye movements suggested by tag return data. Walleyes tagged in the western basin but recaptured in the central or eastern basin of Lake Erie were generally larger (or older) than those recaptured in the western basin of Lake Erie or in Lake St. Clair. Within spawning stocks, female walleyes had wider ranges of movement than males, and there was considerable variation in movement direction, minimum distance moved (mean distance between tagging sites and recapture locations), and mean length among individual spawning stocks. Summer temperatures in the western basin often exceeded the optimal temperature (20-23°C) for growth of large walleyes, and the migration of western basin walleyes might represent a size-dependent response to warm summer temperatures. Cooler temperatures and abundant soft-rayed fish probably contributed to an energetically favorable foraging habitat in the central and eastern basins that attracted large walleyes during summer.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/141620/1/tafs0539.pd

    Effects of left ventricular assist device on pulmonary functions and pulmonary hemodynamics: A meta-analysis

    BACKGROUND: Given current evidence, the effect of left ventricular assist device (LVAD) implantation on pulmonary function tests remains controversial. AIM: To better understand the factors contributing to the changes seen on pulmonary function testing and their correlation with pulmonary hemodynamics after LVAD implantation. METHODS: Electronic databases were queried to identify relevant articles. The summary effect size was estimated as a difference of overall means and standard deviation on a random-effects model. RESULTS: A total of four studies comprising 219 patients were included. The overall mean forced expiratory volume in one second (FEV1), forced vital capacity (FVC), and diffusing capacity of the lung for carbon monoxide (DLCO) after LVAD implantation were significantly lower by 0.23 L (95%CI: 0.11-0.34, P = 0.0002), 0.18 L (95%CI: 0.03-0.34, P = 0.02), and 3.16 mmol/min (95%CI: 2.17-4.14, P < 0.00001), respectively. The net post-LVAD mean value of the cardiac index was significantly higher by 0.49 L/min/m2 (95%CI: 0.31-0.66, P < 0.00001) compared to the pre-LVAD value. The pulmonary capillary wedge pressure and pulmonary vascular resistance were significantly reduced after LVAD implantation by 8.56 mmHg (95%CI: 3.78-13.35, P = 0.0004) and 0.83 Wood units (95%CI: 0.11-1.55, P = 0.02), respectively. There was no significant difference observed in the right atrial pressure after LVAD implantation (0.61 mmHg, 95%CI: -2.00 to 3.32, P = 0.65). Overall findings appear to be driven by studies using HeartMate II devices. CONCLUSION: LVAD implantation might be associated with a significant reduction of spirometric measures, including FEV1, FVC, and DLCO, and an overall improvement of pulmonary hemodynamics.
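A random-effects pooling of mean differences, as described above, is commonly done with the DerSimonian-Laird estimator. A minimal sketch under that assumption; the study inputs below are hypothetical (mean difference, standard error) pairs for illustration, not the meta-analysis data.

```python
import math

def dersimonian_laird(effects, ses, z=1.96):
    """Pool per-study mean differences with DerSimonian-Laird random effects.
    Returns (pooled estimate, CI lower, CI upper, tau^2)."""
    w = [1 / se**2 for se in ses]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))  # heterogeneity
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)      # between-study variance
    wr = [1 / (se**2 + tau2) for se in ses]            # random-effects weights
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    se_pooled = math.sqrt(1 / sum(wr))
    return pooled, pooled - z * se_pooled, pooled + z * se_pooled, tau2

# Hypothetical FEV1 mean differences (L) and standard errors from four studies
pooled, ci_lo, ci_hi, tau2 = dersimonian_laird(
    [0.20, 0.30, 0.15, 0.28], [0.06, 0.08, 0.10, 0.07]
)
print(f"pooled = {pooled:.2f} L (95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```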