
    Short recipient warm ischemia time improves outcomes in deceased donor liver transplantation

    While the adverse effects of prolonged recipient warm ischemia time (rWIT) in liver transplantation (LT) have been well investigated, few studies have focused on possible positive prognostic effects of short rWIT. We aimed to investigate whether shortening rWIT can further improve outcomes in donation after brain death liver transplantation (DBD-LT). Primary DBD-LTs between 2000 and 2019 were retrospectively reviewed. Patients were divided according to rWIT (≤30, 31-40, 41-50, and >50 min). The requirement for intraoperative transfusion, early allograft dysfunction (EAD), and graft survival were compared between the rWIT groups. A total of 1,256 DBD-LT patients were eligible. rWIT was ≤30 min in 203 patients (15.7%), 31-40 min in 465 patients (37.3%), 41-50 min in 353 patients (28.1%), and >50 min in 240 patients (19.1%). With prolongation of rWIT, there were significant increasing trends in transfusion requirement (P < 0.001), estimated blood loss (EBL, P < 0.001), and lactate level (P < 0.001). Multivariable logistic regression demonstrated the lowest risk of EAD in the rWIT ≤30 min group. After risk adjustment, patients with rWIT ≤30 min showed a significantly lower risk of graft loss at 1 and 5 years compared to the other groups. The positive prognostic impact of rWIT ≤30 min was more prominent when cold ischemia time exceeded 6 h. In conclusion, shorter rWIT in DBD-LT provided significantly better post-transplant outcomes.
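    The four rWIT strata used in this study amount to a simple binning rule. The helper below is an illustrative sketch only, not the study's code; the function name and stratum labels are our own.

```python
def rwit_group(minutes):
    """Assign a recipient warm ischemia time (rWIT, in minutes) to the
    four study strata: <=30, 31-40, 41-50, and >50 min."""
    if minutes <= 30:
        return "<=30"
    if minutes <= 40:
        return "31-40"
    if minutes <= 50:
        return "41-50"
    return ">50"
```

A vector of recorded warm ischemia times could then be grouped with `[rwit_group(t) for t in times]` before comparing transfusion requirement, EAD, and graft survival across strata.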

    Improvements in liver transplant outcomes in patients with HCV/HIV coinfection after the introduction of direct-acting antiviral therapies

    BACKGROUND: In recipients with HCV/HIV coinfection, the impact that the wider use of direct-acting antivirals (DAAs) has had on post-liver transplant (LT) outcomes has not been evaluated. We investigated the impact of the introduction of DAAs on post-LT outcomes in patients with HCV/HIV coinfection. METHODS: Using Organ Procurement and Transplant Network/United Network for Organ Sharing data, we compared post-LT outcomes in patients with HCV and/or HIV before and after the introduction of DAAs, categorizing patients into two eras: pre-DAA (2008-2012) and post-DAA (2014-2019). To study the impact of the introduction of DAAs, inverse probability of treatment weighting was used to adjust patient characteristics. RESULTS: A total of 17,215 LT recipients were eligible for this study (HCV/HIV coinfection, n = 160; HIV mono-infection, n = 188; HCV mono-infection, n = 16,867). HCV/HIV coinfection and HCV mono-infection had significantly lower hazards of 1- and 3-year graft loss in the post-DAA era compared with the pre-DAA era (1-year: adjusted hazard ratio [aHR] 0.29, 95% confidence interval [CI] 0.16-0.53 in HCV/HIV and aHR 0.58, 95% CI 0.54-0.63, respectively; 3-year: aHR 0.30, 95% CI 0.14-0.61 and aHR 0.64, 95% CI 0.58-0.70, respectively). The hazards of 1- and 3-year graft loss in HIV mono-infection post-DAA were comparable to those pre-DAA. HCV/HIV coinfection also had significantly lower patient mortality post-DAA compared with pre-DAA (1-year: aHR 0.30, 95% CI 0.17-0.55; 3-year: aHR 0.31, 95% CI 0.15-0.63). CONCLUSIONS: Post-LT outcomes in patients with coinfection improved significantly and became comparable to those with HCV mono-infection after the introduction of DAA therapy, supporting the use of LT in the setting of HCV/HIV coinfection.
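    The inverse probability of treatment weighting used in Methods gives each recipient a weight of 1/p if they are in the "treated" era (post-DAA) and 1/(1-p) otherwise, where p is the estimated propensity of being in the post-DAA era. A minimal sketch of the weight computation (illustrative only; the study's propensity model itself is not described in the abstract):

```python
def iptw_weights(in_post_daa_era, propensity):
    """Inverse probability of treatment weights: 1/p for subjects in
    the post-DAA era, 1/(1-p) for subjects in the pre-DAA era."""
    return [1.0 / p if t else 1.0 / (1.0 - p)
            for t, p in zip(in_post_daa_era, propensity)]

# Toy example with assumed propensity scores for three recipients
weights = iptw_weights([True, False, True], [0.8, 0.2, 0.5])
# -> [1.25, 1.25, 2.0]
```

The resulting weights would then be passed to a weighted survival model (e.g. a weighted Cox regression) so that measured patient characteristics are balanced across the two eras.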

    Effects of major liver allocation policy changes on waitlist outcomes in multivisceral transplantation in the United States

    Background: Organ allocation in multivisceral transplantation (MVT; liver-intestine, liver-pancreas-intestine) is determined by the candidate's ranking on the liver transplant waitlist. MVT candidates do not usually have high laboratory MELDNa (MELD) scores, so exception points are granted per OPTN policy; currently, the exception point is determined as a 10% increase in mortality risk added to the MELD score. Since 2013, an exception point of 29 has been applied to MVT candidates with approval by a regional review board. Two major revisions of liver allocation, the Share 35 rule and the MELDNa score, were implemented in 2013 and 2016, respectively. The aim of this study was to evaluate the effects of these allocation policy updates on waitlist outcomes in MVT. Methods: Using the UNOS registry, we examined adult patients registered for liver alone (LTA), liver-kidney (L-K), and MVT between 2011 and 2018. Registration periods were grouped according to the major revisions of liver allocation: 1) pre-Share 35 period (1/1/2011-6/17/2013), 2) post-Share 35 period (6/18/2013-1/10/2016), and 3) MELDNa period (1/11/2016-3/31/2018). Ninety-day waitlist mortality in MVT candidates was evaluated in each period in comparison with that in LTA/L-K candidates whose MELD scores (categories of 20-28 and 29-34) were similar to the exception points for MVT candidates. Risks were adjusted using the Fine-Gray regression model. Results: In MVT candidates, while there was no difference between the pre- and post-Share 35 periods (HR, 0.96; P=0.29), 90-day mortality significantly increased in the MELDNa period compared with the post-Share 35 period (HR, 1.08; P=0.02). Mortality within 90 days in LTA/L-K candidates with MELD scores of 20-28 continued to decrease over the periods (hazard ratio [HR], 0.91 and 0.82; P=0.042 and <0.001 for pre- vs. post-Share 35 and post-Share 35 vs. MELDNa periods, respectively). Ninety-day mortality in LTA/L-K candidates with MELD scores of 29-34 significantly decreased in the MELDNa period compared with the post-Share 35 period (HR, 0.78; P<0.001), whereas there was no difference between the pre- and post-Share 35 periods (HR, 0.99; P=0.9). Conclusions: While the recent revisions of liver allocation improved waitlist outcomes in LTA/L-K candidates, MVT candidates did not benefit from them, and 90-day mortality significantly increased in the MELDNa period. The exception point for MVT candidates may need to be reconsidered, given the increased number of high-score patients.
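    The Fine-Gray model used here targets the cumulative incidence of waitlist death when transplant acts as a competing event. The unadjusted version of that quantity can be estimated nonparametrically with the Aalen-Johansen estimator. The sketch below is illustrative only (not the study's code) and assumes event codes 0=censored, 1=waitlist death, 2=competing event such as transplant.

```python
def cumulative_incidence(times, events, cause=1, horizon=90):
    """Aalen-Johansen cumulative incidence of `cause` by `horizon`
    in the presence of competing events (0=censored, 1=death,
    2=competing event such as transplant)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0  # probability of being event-free just before t
    cif = 0.0
    i = 0
    while i < len(data) and data[i][0] <= horizon:
        t = data[i][0]
        d_cause = d_all = removed = 0
        while i < len(data) and data[i][0] == t:  # handle tied times
            removed += 1
            if data[i][1] != 0:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / n_at_risk   # cause-specific increment
        surv *= 1.0 - d_all / n_at_risk     # overall survival update
        n_at_risk -= removed
    return cif
```

Unlike a naive Kaplan-Meier of waitlist death alone, this estimator does not treat transplanted candidates as censored, which would overstate 90-day mortality.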

    Risk Factors for Post-Transplant Outcomes in Living and Deceased Donor Liver Transplantation: An Analysis of UNOS Registry

    Purpose: The Model for End-Stage Liver Disease (MELD) score has been used to predict waitlist outcomes in patients with cirrhosis; however, its post-transplant prognostic value is considered limited. We hypothesized that there would be populations unsuitable for living donor liver transplantation (LDLT), and that comparing risk factors in living and deceased donor LT (LDLT and DDLT) would provide useful selection criteria to avoid futile LDLT. In this study, we aimed to identify unique risk factors for poor post-transplant outcomes in LDLT and DDLT and to assess the post-transplant prognostic value of the MELD score in LDLT. Methods: This study used data from the United Network for Organ Sharing (UNOS) STAR file and included all adult (>18 years old) recipients who received LT from 2011 to 2018 (LDLT, n=1,515; DDLT, n=39,802). We stratified recipients by MELD score at LT into the following groups: 6-11, 12-14, 15-19, 20-24, 25-29, 30-34, 35-39, and >40. Risk factors for one-year graft survival were analyzed by LT donor type using a multivariate Cox regression model. Results: In DDLT, one-year graft survival in patients with MELD scores >30 was significantly worse than in any other group with MELD scores <29. In LDLT, because there were only 24 patients with MELD >30, we could not analyze its prognostic effect. There was no significant difference between the five score groups with MELD scores <29 in either DDLT or LDLT. Multivariate analysis revealed that recipient BMI <18.5, black race, moderate/severe encephalopathy, and donor age >50 years were associated with poor graft survival in both DDLT and LDLT. Moderate/severe ascites was an independent risk factor in LDLT, whereas recipient age >50, recipient BMI >40, poor functional status, recipient diabetes, dialysis requirement, cold ischemia time, and donation after cardiac death donor were independent risk factors in DDLT. Conclusions: While increasing the number of living donors may allow expansion of the donor pool, careful consideration should be given to the indication for LDLT in patients with high MELD scores, given the poor post-transplant outcomes in DDLT patients with MELD scores >30. The negative impact of moderate/severe ascites was significant in LDLT but not in DDLT; this can potentially be attributed to smaller graft size in the presence of severe portal hypertension.
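    The MELD stratification in Methods amounts to a simple lookup. The sketch below is illustrative only; note the abstract writes the top stratum as ">40", which leaves a score of exactly 40 unplaced, and since laboratory MELD is capped at 40 we assume it belongs to the top group.

```python
# Closed MELD-score intervals from the abstract's Methods section
MELD_BINS = [(6, 11), (12, 14), (15, 19), (20, 24),
             (25, 29), (30, 34), (35, 39)]

def meld_group(score):
    """Map a MELD score at transplant to the study's strata; scores
    of 40 or more fall into the top group (assumed from '>40')."""
    for lo, hi in MELD_BINS:
        if lo <= score <= hi:
            return "{}-{}".format(lo, hi)
    return ">=40"
```

Each recipient's stratum label can then serve as the grouping variable when comparing one-year graft survival across MELD ranges.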

    Post-reperfusion portal flow rate impacts liver allograft and patient survival in patients with pre-transplant portal vein thrombosis

    Background: Pre-transplant (pretx) portal vein thrombosis (PVT) not only makes the technical aspects of liver transplantation more challenging, it is also known to affect outcomes. Few studies describe the impact of post-reperfusion portal flow on liver transplant outcomes in the setting of pretx PVT. Methodology: Case records of all liver transplant recipients with pretx PVT from Jan 2010 to May 2017 (n=95) were reviewed. They were divided based on the median portal flow after reperfusion into a high-flow group (>1300 ml/min, n=47) and a low-flow group (≤1300 ml/min, n=48). Demographics and intraoperative characteristics were analyzed against postoperative outcomes. Results: Demographic characteristics were similar in both groups. Intraoperative factors such as cardiac output, central venous pressure, pulmonary artery pressure, and transfusion requirements were also similar. Postoperatively, higher cumulative rates of biliary strictures at 6 months, 1 year, and 2 years were observed in the low-flow group compared with the high-flow group (29.6% vs 10.8%, 37.8% vs 10.8%, and 40.7% vs 13.5%, respectively; p=0.008). The low-flow group also had a higher rate of graft loss at any time after transplant (HR 3.3, CI 1.07-10.31, p=0.038) and a higher 3-year mortality risk (HR 9.3, CI 1.18-73.49, p=0.034). The incidence of postoperative PVT was similar in both groups (12.8% vs 12.5%, p=1.0). Other postoperative outcomes, such as hepatic artery thrombosis, early allograft dysfunction, bile leak rates, early and late rejection rates, and hospital stay, were also similar. On multivariate analysis, low portal flow (HR 4.21, CI 1.26-14.02, p=0.019) and bile leak (HR 5.54, CI 1.95-15.78, p=0.001) were the only factors associated with worse graft survival. Pretx anticoagulation appeared to have a protective effect, though this was not statistically significant (HR 0.23, CI 0.03-1.76, p=0.16). Conclusions: Low portal flow after reperfusion in the setting of pretx PVT may be an independent negative predictor of allograft and patient survival. Post-transplant bile leak may be a negative predictor of allograft survival. Pretx anticoagulation in the setting of PVT appears to have some protective effect on graft survival.
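    Graft survival comparisons of this kind are typically built on the Kaplan-Meier estimator. The following is a self-contained sketch under assumed event codes (1=graft loss, 0=censored); it is illustrative only, not the study's analysis code.

```python
def kaplan_meier(times, events, horizon):
    """Kaplan-Meier event-free survival probability at `horizon`
    (events: 1=graft loss, 0=censored)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    i = 0
    while i < len(data) and data[i][0] <= horizon:
        t = data[i][0]
        d = removed = 0
        while i < len(data) and data[i][0] == t:  # handle tied times
            removed += 1
            d += data[i][1]
            i += 1
        surv *= 1.0 - d / n_at_risk  # survival drops at each event time
        n_at_risk -= removed
    return surv
```

Computing this separately for the high- and low-flow groups would give the curves underlying the reported hazard ratios; the hazard ratios themselves come from a (Cox-type) regression, which this sketch does not implement.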

    Incidence and outcomes of immediate post-operative dialysis in liver transplantation.

    Aim: Although kidney dysfunction secondary to hepatorenal syndrome is expected to recover with liver transplant alone (LTA), patients with pre-transplant marginal kidney function may be more susceptible to intraoperative stresses that cause further kidney injury. The aim of this study was to evaluate the incidence and outcomes of immediate post-transplant dialysis in liver transplantation. Methods: We retrospectively reviewed records of 44 simultaneous liver-kidney transplant (SLK) patients and 204 LTA patients with pre-transplant marginal kidney function (GFR <60 mL/min) from 2009 to 2015. First, we identified the incidence of immediate post-transplant dialysis in all patients and assessed early liver allograft dysfunction (Olthoff criteria) and liver graft survival. Second, risk factors for post-transplant dialysis were analyzed in LTA patients. Results: Of the 44 SLK patients, 12 (27%) needed post-transplant dialysis (median 10.5 days, interquartile range [IQR] 6-28 days). Of the 204 LTA patients, 42 were on dialysis pre-transplant, of whom 38 (90%) had a persistent dialysis requirement post-transplant (median 9.5 days, IQR 4-31 days). Of the 162 who were not on dialysis pre-transplant, 22 (14%) required dialysis post-transplant (median 4 days, IQR 2-22 days). Patients who required post-transplant dialysis showed significantly worse graft survival than those without post-transplant dialysis (P=0.003). The post-transplant dialysis requirement was significantly associated with early liver allograft dysfunction (P=0.03), and pre-transplant dialysis was significantly associated with the need for post-transplant dialysis (P<0.001). In LTA patients without pre-transplant dialysis, the following risk factors for post-transplant dialysis were identified as significant: cold ischemia time >350 min (P=0.03), warm ischemia time >40 min (P=0.02), red blood cell transfusion >10 units (P=0.004), and pre-op GFR <30 mL/min (P=0.003). A large volume of red blood cell transfusion and pre-op GFR <30 mL/min remained independent risk factors on multivariate analysis. Conclusion: The adverse impact of immediate post-transplant dialysis on early liver allograft function as well as graft survival should be recognized. While post-transplant dialysis is frequently unavoidable in patients requiring pre-transplant dialysis, efforts to reduce intraoperative transfusions and shorten ischemia times may decrease the risk of post-transplant dialysis in LTA patients with marginal kidney function.
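    The univariate thresholds identified above can be expressed as a simple screening rule. This is an illustrative sketch only: the field names are our own, the cutoffs are those reported in the abstract, and the study's actual risk model was multivariate rather than a checklist.

```python
# Univariate risk thresholds for post-transplant dialysis, as
# reported in the abstract (field names are hypothetical)
RISK_CHECKS = {
    "cold_ischemia_min": lambda v: v > 350,
    "warm_ischemia_min": lambda v: v > 40,
    "rbc_transfusion_units": lambda v: v > 10,
    "preop_gfr_ml_min": lambda v: v < 30,
}

def dialysis_risk_flags(patient):
    """Return the names of the risk thresholds this patient exceeds;
    unknown fields are simply skipped."""
    return [name for name, check in RISK_CHECKS.items()
            if name in patient and check(patient[name])]
```

For example, a recipient with a 400-minute cold ischemia time and a pre-op GFR of 25 mL/min would be flagged on two of the four factors.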

    The grade of pre-transplant portal vein thrombosis in liver transplant recipients impacts graft and patient survival

    Background: Portal vein thrombosis (PVT) in the pre-transplant (pretx) setting makes liver transplantation technically challenging and is also known to affect post-transplant outcomes. Few studies describe the impact of the grade of PVT on morbidity and post-transplant survival. Methodology: Case records of all liver transplant recipients with known pretx PVT from Jan 2010 to May 2017 (n=95) were reviewed. All recipients had PVT of grades 1 to 3 (Yerdel classification) and were divided into two groups: Grade 1 (n=56) and Grade 2 or 3 (n=39). Demographic, graft, operative, and intraoperative characteristics were analyzed against postoperative outcomes. Results: Demographic characteristics were similar in both groups. Intraoperative factors such as cardiac output, central venous pressure, pulmonary artery pressure, and transfusion requirements were also similar. Postoperative PVT incidence rates were higher in the Grade 2/3 group, but this was not statistically significant (17.9% vs 8.9%, p=0.42). Higher cumulative rates of biliary strictures at 6 months, 1 year, and 2 years were observed in the Grade 2/3 group compared with the Grade 1 group (26.2% vs 16.3%, 33.1% vs 18.4%, and 40.5% vs 18.4%, respectively; p=0.037). The Grade 2/3 group also had a higher instantaneous 5-year graft loss risk (HR 3.0, CI 1.01-9.06, p=0.048) and a higher overall instantaneous mortality risk after transplant (HR 2.97, CI 1.03-8.53, p=0.043) compared with Grade 1. Other postoperative complications, such as hepatic artery thrombosis, early allograft dysfunction, bile leak rates, early and late rejection rates, and hospital stay, were similar. On multivariate analysis, Grade 2/3 PVT (HR 3.6, CI 1.18-11.03, p=0.025) and bile leak (HR 4.47, CI 1.5-13.3, p=0.007) were the only factors associated with worse 5-year graft survival. Pre-transplant (HR 0.17, CI 0.02-1.4, p=0.1) and post-transplant anticoagulation (HR 0.53, CI 0.16-1.7, p=0.3) appeared to improve graft survival, but these effects were not statistically significant. Conclusions: Higher PVT grades may be an independent negative predictor of liver allograft and patient survival. Post-transplant bile leak may also be a negative predictor of allograft survival. Pre- and post-transplant anticoagulation appear to have protective effects on graft survival.

    Comparison of outcome in liver transplant patients with renal insufficiency and intraoperative CVVH

    Background: Intraoperative continuous veno-venous hemofiltration (CVVH) is an important tool for managing liver transplant (LT) patients with renal insufficiency. Methods: All LT patients with renal insufficiency between January 2005 and May 2017 (n=142) were assigned to three groups: elective intraoperative CVVH (dialysis prior to transplant necessitating intraoperative CVVH; LTE, n=70), unplanned intraoperative CVVH (patients who did not require dialysis prior to transplant but were found to have borderline renal insufficiency at the time of LT; LTU, n=15), and undialyzed renal insufficiency (no intraoperative CVVH or dialysis prior to transplant, but GFR <30 ml/min; LTD, n=57). Postoperative complications, graft/patient survival, and long-term renal function were investigated. Results: MELD at transplant was higher in the LTE group (37.5±7.1) than in the LTU (30.6±10.7) and LTD groups (31.7±8.3, P<0.001). The postoperative complication rate (Clavien 3b and above) was similar in all groups (LTE 45.7%, LTU 46.6%, and LTD 26.3%; P=0.06), but LTU patients experienced higher rates of early allograft dysfunction (EAD) (66.6%) than the other groups (LTE 30.3% and LTD 25%, P=0.01). The postoperative dialysis requirement was higher in LTE (86.4%) than in the other groups (66.6% and 10.5%, P<0.001). Duration of dialysis was not significantly different (8, 6, and 29 days, P=0.43). Long-term renal function at 3, 6, and 12 months was similar (P=0.50, P=0.77, and P=0.52, respectively), as were patient and graft survival (P=0.51 and P=0.24, respectively). Hospital stay was longest in the LTU group (21 days) compared with LTE (16 days) and LTD (13 days, P=0.046). Conclusion: While postoperative complications and graft/patient survival were similar in all three groups, unplanned CVVH may be associated with EAD and longer hospital stay due to acute renal dysfunction at the time of LT. Further investigation is warranted.