
    The utility of 6-minute walk distance in predicting waitlist mortality for lung transplant candidates.

    BACKGROUND The lung allocation score (LAS) has led to improved organ allocation for transplant candidates. At present, the 6-minute walk distance (6MWD) is treated as a binary categorical variable of whether or not a candidate can walk more than 150 feet in 6 minutes. In this study, we tested the hypothesis that the 6MWD is presently under-utilized with respect to discriminatory power and that, as a continuous variable, it could better prognosticate the risk of waitlist mortality. METHODS A retrospective cohort analysis was performed using the Organ Procurement and Transplantation Network/United Network for Organ Sharing (OPTN/UNOS) transplant database. Candidates listed for isolated lung transplant between May 2005 and December 2011 were included. The population was stratified by 6MWD quartiles, and unadjusted survival rates were estimated. Multivariable Cox proportional hazards modeling was used to assess the effect of 6MWD on risk of death. The Scientific Registry of Transplant Recipients (SRTR) Waitlist Risk Model was used to adjust for confounders. The optimal 6MWD for discriminative accuracy in predicting waitlist mortality was assessed by receiver-operating characteristic (ROC) curves. RESULTS Analysis was performed on 12,298 candidates, who were segregated into quartiles by distance walked. Waitlist mortality decreased as 6MWD increased. In the multivariable model, significant variables included 6MWD, male gender, non-white ethnicity, and restrictive lung diseases. ROC curve analysis showed that discrimination of 6-month mortality was best at a cut-off of 655 feet. CONCLUSIONS The 6MWD is a significant predictor of waitlist mortality. A cut-off of 150 feet sub-optimally identifies candidates with increased risk of mortality; a cut-off between 550 and 655 feet is more appropriate if the 6MWD is to be treated as a dichotomous variable. Utilization of the 6MWD as a continuous variable could further enhance predictive capabilities.
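
    A minimal sketch, assuming simulated data, of the ROC-based cut-off search the abstract describes: score candidates by walk distance and pick the threshold that maximizes Youden's J for 6-month waitlist mortality. The column names (six_mwd_ft, death_6mo) and the synthetic cohort are illustrative assumptions, not OPTN/UNOS fields.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({"six_mwd_ft": rng.uniform(0, 1400, n)})   # walk distance in feet
# Simulate a 6-month death risk that falls as walk distance rises.
p_death = 0.3 / (1 + np.exp((df["six_mwd_ft"] - 500) / 150))
df["death_6mo"] = rng.binomial(1, p_death)

# Higher 6MWD predicts survival, so score mortality with the negated distance.
fpr, tpr, thresholds = roc_curve(df["death_6mo"], -df["six_mwd_ft"])
best = np.argmax(tpr - fpr)                       # Youden's J statistic
optimal_cutoff_ft = -thresholds[best]             # undo the sign flip
auc = roc_auc_score(df["death_6mo"], -df["six_mwd_ft"])
print(f"AUC = {auc:.2f}, best dichotomizing cut-off ~ {optimal_cutoff_ft:.0f} ft")
```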

    Enhanced recovery protocols after surgery: A systematic review and meta-analysis of randomized trials in cardiac surgery

    Background: Previous meta-analyses combining randomized and observational evidence in cardiac surgery have shown a positive impact of enhanced recovery after surgery (ERAS) protocols on postoperative outcomes. However, definitive data based on randomized studies are missing, and the entirety of the ERAS measures and pathway, as recently systematized in guidelines and consensus statements, has not been captured in the published studies; the available literature instead focuses on "ERAS-like" protocols or only a limited number of ERAS measures. This study aims to analyze all randomized studies applying ERAS-like protocols in cardiac surgery with respect to perioperative outcomes. Methods: A meta-analysis of randomized controlled trials (RCTs) comparing ERAS-like with standard protocols of perioperative care was performed (PROSPERO registration CRD42021283765). PRISMA guidelines were used for abstracting and assessing data. Results: Thirteen single-center RCTs (N = 1704; 850 in the ERAS-like protocol group and 854 in the standard care group) were selected. The most common procedures were surgical revascularization (66.3%) and valvular surgery (24.9%). No difference was found in the incidence of in-hospital mortality between the ERAS and standard treatment groups (risk ratio [RR] 0.61 [0.31; 1.20], p = 0.15). ERAS was associated with reduced intensive care unit stay (standardized mean difference [SMD] -0.57, p < 0.01) and hospital stay (SMD -0.23, p < 0.01) and with reduced rates of overall complications compared with the standard protocol (RR 0.60, p < 0.01), driven by a reduction in stroke (RR 0.29 [0.13; 0.62], p < 0.01). Significant heterogeneity in the elements of the ERAS protocol included in the studies was observed. Conclusions: ERAS-like protocols have no impact on short-term survival after cardiac surgery but allow for faster hospital discharge while potentially reducing surgical complications. However, this study highlights significant nonadherence to, and heterogeneity in, the entirety of ERAS protocols, warranting further RCTs in this field that include a greater number of elements of the framework.
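
    The pooled risk ratios above come from a random-effects meta-analysis; below is a minimal sketch of how such an estimate can be computed with the DerSimonian-Laird method. The per-trial event counts are hypothetical, not data from the review, and the actual analysis may have used different software and estimators.

```python
import numpy as np

# events / totals in ERAS-like and standard-care arms, one row per RCT (hypothetical)
e_trt = np.array([2, 1, 3, 0.5, 2])    # 0.5 = continuity correction for a zero cell
n_trt = np.array([60, 80, 120, 50, 90])
e_ctl = np.array([3, 2, 5, 1, 4])
n_ctl = np.array([62, 78, 118, 52, 88])

log_rr = np.log((e_trt / n_trt) / (e_ctl / n_ctl))
var_rr = 1 / e_trt - 1 / n_trt + 1 / e_ctl - 1 / n_ctl    # variance of log RR

# Fixed-effect weights, Cochran's Q, and the DerSimonian-Laird tau^2
w = 1 / var_rr
q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w)) ** 2)
tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

w_re = 1 / (var_rr + tau2)                    # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled RR = {np.exp(pooled):.2f} [{ci[0]:.2f}; {ci[1]:.2f}]")
```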

    Comparative effectiveness of antiplatelet therapies for saphenous venous graft occlusion and cardiovascular outcomes: A network meta-analysis

    Introduction: The ideal antiplatelet therapy to maintain graft patency after coronary artery bypass graft surgery (CABG) remains controversial. This review of randomized controlled trials (RCTs) aims to compare aspirin monotherapy, ticagrelor monotherapy, dual antiplatelet therapy (DAPT) with aspirin and ticagrelor (Asp+Tica), and DAPT with aspirin and clopidogrel (Asp+Clopi) to evaluate differences in post-CABG saphenous vein graft (SVG) occlusion, internal mammary artery (IMA) occlusion, myocardial infarction (MI), bleeding, and all-cause mortality (ACM) rates. Evidence acquisition: The literature review was conducted on several electronic databases, including Medline, Embase, and Cochrane Central, from inception to August 10, 2022. Data were extracted using a predefined proforma. A Bayesian random-effects model was used to calculate point effect estimates (odds ratio and standard deviation). Quality assessment was done using the Cochrane RoB-2 tool. Evidence synthesis: Ten RCTs comprising 2139 patients taking antiplatelets post-CABG were included. For preventing SVG occlusion, Asp+Tica showed the lowest mean AR of 0.144±0.068. Asp+Tica also showed a trend toward lower postoperative MI risk and lower ACM rates, with mean ARs of 0.040±0.053 and 0.018±0.029, respectively. For maintaining IMA graft patency, Asp+Clopi showed the lowest mean AR of 0.092±0.053. For postoperative major bleeding risk, ticagrelor had the lowest mean AR of 0.049±0.075, with Asp+Tica showing a similar mean AR of 0.049±0.045. Conclusions: Our analysis demonstrates that Asp+Tica may be the ideal therapy for patients undergoing CABG using SVG, as it decreases the risk of post-CABG SVG occlusion and is not associated with a significantly higher risk of major bleeding.
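
    The review pools trials with a Bayesian random-effects network model; the sketch below reduces that idea to a single pairwise comparison (hypothetically, Asp+Tica versus aspirin alone for SVG occlusion) to show the random-effects structure. The trial counts are invented, and a full network meta-analysis would additionally link all regimens through shared baselines and consistency assumptions.

```python
import numpy as np
import pymc as pm

events_trt = np.array([6, 4, 9])       # SVG occlusions on Asp+Tica (hypothetical)
n_trt      = np.array([90, 70, 120])
events_ctl = np.array([12, 9, 15])     # SVG occlusions on aspirin alone (hypothetical)
n_ctl      = np.array([92, 68, 118])

with pm.Model():
    mu = pm.Normal("mu", 0.0, sigma=2.0)                   # pooled log odds ratio
    tau = pm.HalfNormal("tau", sigma=1.0)                  # between-trial SD
    theta = pm.Normal("theta", mu, sigma=tau, shape=3)     # trial-specific log ORs
    alpha = pm.Normal("alpha", 0.0, sigma=2.0, shape=3)    # control-arm log odds
    pm.Binomial("y_ctl", n=n_ctl, p=pm.math.invlogit(alpha), observed=events_ctl)
    pm.Binomial("y_trt", n=n_trt, p=pm.math.invlogit(alpha + theta), observed=events_trt)
    trace = pm.sample(2000, tune=1000, target_accept=0.9)

print(np.exp(trace.posterior["mu"].mean().item()))         # posterior mean odds ratio
```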

    Medication Nonadherence After Lung Transplantation in Adult Recipients.

    BACKGROUND Our objective was to identify potential avenues for resource allocation and patient advocacy to improve outcomes by evaluating the association between recipient sociodemographic and patient characteristics and medication nonadherence after lung transplantation. METHODS US adult, lung-only transplantations per the United Network for Organ Sharing database were analyzed from October 1996 through December 2006, based on the period during which nonadherence information was recorded. Generalized linear models were used to determine the association of demographic, disease, and transplantation center characteristics with early nonadherence (defined as within the first year after transplantation) as well as late nonadherence (years 2 to 4 after transplantation). Outcomes comparing adherent and nonadherent patients were also evaluated. RESULTS Patients (n = 7,284) were included for analysis. Early and late nonadherence rates were 3.1% and 10.6%, respectively. Factors associated with early nonadherence were Medicaid insurance compared with private insurance (adjusted odds ratio [AOR] 2.45, 95% confidence interval [CI]: 1.16 to 5.15) and black race (AOR 2.38, 95% CI: 1.08 to 5.25). Medicaid insurance and black race were also associated with late nonadherence (AOR 2.38, 95% CI: 1.51 to 3.73 and AOR 1.73, 95% CI: 1.04 to 2.89, respectively), as were age 18 to 20 years (AOR 3.41, 95% CI: 1.29 to 8.99) and grade school or lower education (AOR 1.88, 95% CI: 1.05 to 3.35). Early and late nonadherence were both associated with significantly shorter unadjusted survival (p < 0.001). CONCLUSIONS Identifying patients at risk of nonadherence may enable resource allocation and patient advocacy to improve outcomes.
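
    The abstract reports adjusted odds ratios from generalized linear models; below is a minimal sketch of that kind of model, here a logistic regression of first-year nonadherence on payer and demographic covariates. The variable names and simulated data are illustrative assumptions, not UNOS registry fields.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "nonadherent_yr1": rng.binomial(1, 0.03, n),   # early nonadherence indicator
    "medicaid": rng.binomial(1, 0.15, n),          # Medicaid vs. private insurance
    "black_race": rng.binomial(1, 0.08, n),
    "age": rng.integers(18, 75, n),
})

model = smf.logit("nonadherent_yr1 ~ medicaid + black_race + age", data=df).fit(disp=0)
aor = np.exp(model.params)              # adjusted odds ratios
ci = np.exp(model.conf_int())           # 95% confidence intervals
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```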

    One Year Outcomes Following Transplantation with COVID-19-Positive Donor Hearts: A National Database Cohort Study

    The safety of heart transplantation from COVID-19-positive (COVID-19+) donors remains uncertain, although preliminary studies suggest that heart transplants from these donors may be feasible. We analyzed 1-year outcomes in COVID-19+ donor heart recipients using 1:3 propensity matching. The OPTN database was queried for adult heart transplant recipients between 1 January 2020 and 30 September 2022. COVID-19+ donors were defined as those who tested positive on nucleic acid tests (NATs) or antigen tests within 21 days prior to procurement. Multiorgan transplants, retransplants, donors without COVID-19 testing, and recipients allocated under the old heart allocation system were excluded. A total of 7211 heart transplant recipients met the inclusion criteria, including 316 COVID-19+ donor heart recipients. Of these, 290 COVID-19+ donor heart recipients were matched to 870 COVID-19− donor heart recipients. Survival was similar between the groups at 30 days (p = 0.46), 6 months (p = 0.17), and 1 year (p = 0.07). Recipients from COVID-19+ donors in the matched cohort were less likely to experience postoperative acute rejection prior to discharge (p = 0.01). National COVID-19+ donor heart usage varied by region: region 11 transplanted the most COVID-19+ hearts (15.8%), and region 6 transplanted the fewest (3.2%). Our findings indicate that COVID-19+ heart transplantation can be performed with safe early outcomes. Further analyses are needed to determine whether long-term outcomes are equivalent between groups.
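
    A minimal sketch of the 1:3 propensity matching described above: model each recipient's probability of receiving a COVID-19+ donor heart from baseline covariates, then greedily match every COVID-19+ donor recipient to the 3 nearest COVID-19− recipients without replacement. The covariates and data are invented; the study's actual matching variables and algorithm are not specified here.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "covid_pos_donor": rng.binomial(1, 0.05, n),
    "recipient_age": rng.integers(18, 75, n),
    "donor_age": rng.integers(12, 60, n),
    "ischemic_time_hr": rng.uniform(1, 6, n),
})

covariates = ["recipient_age", "donor_age", "ischemic_time_hr"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["covid_pos_donor"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]    # propensity score

treated = df[df["covid_pos_donor"] == 1]
controls = df[df["covid_pos_donor"] == 0].copy()

matched_ids = []
for _, row in treated.iterrows():
    # greedy 1:3 nearest-neighbour match on the propensity score, no replacement
    nearest = (controls["ps"] - row["ps"]).abs().nsmallest(3).index
    matched_ids.extend(nearest)
    controls = controls.drop(nearest)

matched_controls = df.loc[matched_ids]
print(len(treated), "treated matched to", len(matched_controls), "controls")
```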

    Transcontinental heart transplant using SherpaPak cold static storage system

    Organ preservation in heart transplantation is key to preventing primary graft dysfunction, the most common cause of early graft loss. Historically, the standard of care was preservation on ice, with no ability to monitor the organ and problems with even temperature distribution. Recently, the SherpaPak, a Food and Drug Administration-approved transport device, has emerged as a solution to these 2 issues, allowing for even temperature distribution, no organ contact with ice, and continuous monitoring during transport. This method of transport falls under static cold preservation but may allow for longer ischemic times beyond the recommended 4-hour limit. Here, we report a case of heart transplantation using the SherpaPak transport device for a distance of almost 3000 miles and a total ischemic time over 7 hours, the longest yet reported for a donor heart transported using the SherpaPak system. The patient had excellent functional outcomes with no evidence of primary graft dysfunction. This case suggests that, with careful donor and recipient selection, the SherpaPak may potentially be used for longer distances and ischemic times than initially recommended, as a safe and less expensive alternative to ex-vivo perfusion devices.

    The utility of preoperative six-minute-walk distance in lung transplantation

    RATIONALE The use of the 6-minute-walk distance (6MWD) as an indicator of exercise capacity to predict postoperative survival in lung transplantation has not previously been well studied. OBJECTIVES To evaluate the association between 6MWD and postoperative survival following lung transplantation. METHODS Adult, first-time, lung-only transplantations per the United Network for Organ Sharing database from May 2005 to December 2011 were analyzed. Kaplan-Meier methods and Cox proportional hazards modeling were used to determine the association between preoperative 6MWD and post-transplant survival after adjusting for potential confounders. A receiver operating characteristic curve was used to determine the 6MWD value that provided maximal separation in 1-year mortality. A subanalysis was performed to assess the association between 6MWD and post-transplant survival by disease category. MEASUREMENTS AND MAIN RESULTS A total of 9,526 patients were included for analysis. The median 6MWD was 787 ft (25th-75th percentiles = 450-1,082 ft). Increasing 6MWD was associated with a significantly lower overall hazard of death (P < 0.001). A continuous increase in walk distance through 1,200-1,400 ft conferred an incremental survival advantage. Although 6MWD strongly correlated with survival, the ability of any single dichotomous cut-off to predict outcomes was limited. All disease categories demonstrated significantly longer survival with increasing 6MWD (P ≀ 0.009) except pulmonary vascular disease (P = 0.74); however, the low volume in this category (n = 312; 3.3%) may limit the ability to detect an association. CONCLUSIONS 6MWD is significantly associated with post-transplant survival and is best incorporated into transplant evaluations on a continuous basis, given the limited ability of a single, dichotomous value to predict outcomes.
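
    As a sketch of the abstract's recommendation to treat 6MWD as a continuous predictor, the example below fits a Cox proportional hazards model of post-transplant survival on walk distance using simulated data; the dataset and column names are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
six_mwd = rng.uniform(100, 1400, n)                  # preoperative walk distance, ft
# Simulate survival times whose hazard falls as walk distance rises.
hazard = 0.0004 * np.exp(-six_mwd / 800)
time = rng.exponential(1 / hazard)
df = pd.DataFrame({
    "six_mwd_ft": six_mwd,
    "time_days": np.minimum(time, 1825),             # administrative censoring at 5 years
    "event": (time < 1825).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="event")
cph.print_summary()    # hazard ratio per additional foot of walk distance
```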

    The Public Health Service “Increased Risk” 2020 Policy Change Has not Improved Organ Utilization in the United States: A Nationwide Cohort Study

    Objective: To assess the effects of the 2020 United States Public Health Service (PHS) "Increased Risk" guideline update. Background: Donors labeled as "Increased Risk" for transmission of infectious diseases have been found to have decreased organ utilization rates despite no significant impact on recipient survival. Recently, the PHS provided an updated guideline focused on "Increased Risk" organ donors, which included the removal of the "Increased Risk" label and the elimination of the separate informed consent form, although the actual increased-risk status of donors is still ultimately transmitted to transplant physicians. We sought to analyze the effect of this update on organ utilization rates. Methods: This was a retrospective analysis of the Organ Procurement and Transplantation Network database comparing donor organ utilization in the 2 years before the June 2020 PHS guideline update for increased-risk donor organs (June 2018–May 2020) versus the 2 years after the update (August 2020–July 2022). The organ utilization rate for each donor was determined by dividing the number of organs transplanted by the total number of organs available for procurement. Student t tests and multivariable logistic regression models were used for analysis. Results: There were 17,272 donors in the preupdate cohort and 17,922 donors in the postupdate cohort; of these, 4,977 (28.8%) and 3,893 (21.7%) donors, respectively, were considered "Increased Risk". There was a 2% decrease in overall organ utilization rates after the update, driven by a 3% decrease in liver utilization rates and a 2% decrease in lung utilization rates. After multivariable adjustment, donors in the postupdate cohort had 10% lower odds of having all organs transplanted. Conclusions: The 2020 PHS "Increased Risk" donor guideline update was not associated with an increase in organ utilization rates in the first 2 years after its implementation, despite a decrease in the proportion of donors considered to be at higher risk. Further efforts to educate the community on the safe usage of higher-risk organs are needed and may increase organ utilization.
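
    A minimal sketch of the utilization-rate comparison described in the methods: per-donor utilization is the number of organs transplanted divided by the number available for procurement, compared between cohorts with a Student t test. The counts below are simulated, not OPTN data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Simulated organs available and transplanted per donor in each cohort
avail_pre = rng.integers(4, 9, 17272)
tx_pre = rng.binomial(avail_pre, 0.42)
avail_post = rng.integers(4, 9, 17922)
tx_post = rng.binomial(avail_post, 0.40)

util_pre = tx_pre / avail_pre      # per-donor organ utilization rate, preupdate
util_post = tx_post / avail_post   # per-donor organ utilization rate, postupdate

t_stat, p_value = stats.ttest_ind(util_pre, util_post)
print(f"pre {util_pre.mean():.3f} vs post {util_post.mean():.3f}, p = {p_value:.3g}")
```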