
    Short recipient warm ischemia time improves outcomes in deceased donor liver transplantation

    While the adverse effects of prolonged recipient warm ischemia time (rWIT) in liver transplantation (LT) have been well investigated, few studies have focused on the possible positive prognostic effects of short rWIT. We aimed to investigate whether shortening rWIT can further improve outcomes in donation after brain death liver transplantation (DBD-LT). Primary DBD-LTs between 2000 and 2019 were retrospectively reviewed. Patients were divided according to rWIT (≀30, 31-40, 41-50, and >50 min). Intraoperative transfusion requirements, early allograft dysfunction (EAD), and graft survival were compared between the rWIT groups. A total of 1,256 DBD-LT patients were eligible: rWIT was ≀30 min in 203 patients (15.7%), 31-40 min in 465 (37.3%), 41-50 min in 353 (28.1%), and >50 min in 240 (19.1%). Transfusion requirements (P < 0.001), estimated blood loss (EBL, P < 0.001), and lactate levels (P < 0.001) increased significantly with prolongation of rWIT. Multivariable logistic regression demonstrated the lowest risk of EAD in the rWIT ≀30 min group. After risk adjustment, patients with rWIT ≀30 min showed a significantly lower risk of graft loss at 1 and 5 years compared to the other groups. The positive prognostic impact of rWIT ≀30 min was more prominent when cold ischemia time exceeded 6 h. In conclusion, shorter rWIT in DBD-LT provided significantly better post-transplant outcomes.

    Persistence of SARS-CoV-2 Virus in a Kidney Transplant Recipient

    Background: We are now discovering the sequelae of the novel coronavirus disease (COVID-19) as they relate to the transplant population. Questions regarding duration of viral shedding, infectivity, and reinfection remain. We present the case of a kidney transplant recipient who had COVID-19 prior to transplantation, had presumably cleared the infection, but subsequently tested positive after transplantation. Case: A 30-year-old female with chronic kidney disease secondary to IgA nephropathy was found to be suitable for a living related kidney transplant. Shortly after her evaluation, before her surgery, she developed symptoms of SARS-CoV-2 infection and tested positive via PCR. She presumably cleared the infection, as evidenced by negative PCR testing two weeks after cessation of symptoms. She underwent robotic-assisted living related kidney transplantation with basiliximab induction. She developed a postoperative hematoma that required operative evacuation; she was therefore tested for the virus prior to reoperation and found to be positive, 83 days following symptom onset. She remained asymptomatic at this point. She also tested positive for SARS-CoV-2 IgG antibodies. Conclusion: This case illustrates the persistence of SARS-CoV-2 and highlights the potential for viral replication after initiation of immunosuppression. It also highlights the possibility of prolonged viral shedding beyond the maximum reported timeframe. Depletion of T cells by immunosuppression may explain the persistent viral replication and shedding. The clinical significance of prolonged viral shedding in transplant patients remains undefined. The timing of clearance for transplantation, the role of retesting after transplantation, and the management of immunosuppression are questions that need to be investigated.

    Development and validation of a liver transplantation donation after cardiac death risk index using the UNOS database

    Introduction: Donation after cardiac death (DCD) liver transplantation is an increasing form of organ donation. Schlegel et al. identified seven factors predicting 1-year DCD graft survival based on the UK transplant population. This project aims to validate the existing predictive model and to develop a novel DCD graft failure prediction model based on the UNOS database. Methods: We examined all adult DCD liver transplants performed between January 1, 2014 and March 31, 2020 in the UNOS registry. The population was divided into training (66%) and validation (34%) subsets. Variables of interest were selected from the training subset by backwards stepwise selection, with entry criterion P = 0.05 and exit criterion P = 0.06. Logistic regression models were fitted on the selected variables to predict 1-year graft failure. Model performance was assessed in the validation population by computing the area under the receiver operating characteristic curve (AUROC) after 10-fold stratified cross-validation, and was compared to the UK DCD prediction model. Results: 2,738 DCD transplants were included, with 1,835 in the training and 903 in the validation subsets. The model identified 12 factors predictive of 1-year graft failure among DCD recipients and achieved an AUROC of 0.741 (95% CI: 0.686-0.796). When validated in the UNOS database, the UK DCD model achieved an AUROC of 0.628 (0.564-0.691). Conclusions: This model, built on 12 factors predictive of 1-year graft failure among DCD recipients in the UNOS database, outperformed the existing model.
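    The model comparison above rests on the AUROC, which can be read as the probability that a randomly chosen failed graft receives a higher predicted risk than a randomly chosen surviving graft. A minimal sketch of that rank-based computation, on hypothetical predicted risks rather than UNOS records:

```python
def auroc(labels, scores):
    """Area under the ROC curve via pairwise rank comparison:
    the fraction of (event, non-event) pairs in which the event
    case received the higher predicted risk (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]  # graft failures
    neg = [s for y, s in zip(labels, scores) if y == 0]  # surviving grafts
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical 1-year failure risks from a fitted model, not study data.
labels = [0, 0, 1, 1]
risks = [0.10, 0.40, 0.35, 0.80]
print(auroc(labels, risks))  # 0.75
```

    A value of 0.5 corresponds to chance-level discrimination, which is why the abstract's 0.741 versus 0.628 comparison favors the UNOS-derived model.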

    Re-transplantation outcomes for hepatitis C in the United States before and after DAA introduction

    The success of direct-acting antiviral (DAA) therapy has led to near-universal cure for patients chronically infected with hepatitis C virus (HCV) and improved post-liver transplant (LT) outcomes. We investigated the trends and outcomes of re-transplantation in HCV and non-HCV patients before and after the introduction of DAAs. Adult patients who underwent re-LT were identified in the OPTN/UNOS database. Multi-organ transplants and patients with more than two total LTs were excluded. Two eras were defined: pre-DAA (2009-2012) and post-DAA (2014-2017). A total of 2,112 re-LT patients were eligible (HCV: n=499 pre-DAA and n=322 post-DAA; non-HCV: n=547 pre-DAA and n=744 post-DAA). HCV patients had improved graft and patient survival after re-LT in the post-DAA era: one-year graft survival was 69.8% pre-DAA and 83.8% post-DAA (p<0.001), and one-year patient survival was 73.1% pre-DAA and 86.2% post-DAA (p<0.001). Graft and patient survival was similar between eras for non-HCV patients. After adjustment, the post-DAA era was an independent positive predictive factor for graft and patient survival (HR 0.67, p=0.005 and HR 0.65, p=0.004) only in HCV patients. This positive era effect was observed only in HCV patients whose first graft was lost to disease recurrence (HR 0.31, p=0.002 and HR 0.32, p=0.003, respectively). Among HCV patients, receiving a re-LT in the post-DAA era was associated with improved patient and graft survival.

    Pre-Transplant Prognostic Nutritional Index Predicts Short-Term Outcomes after Liver Transplantation

    Introduction: The prognostic nutritional index (PNI) is a serum marker of nutrition and inflammation. PNI previously predicted outcomes in liver transplant (LT) patients with recurrence of hepatocellular carcinoma; however, its efficacy in predicting post-LT outcomes is unknown. We hypothesized that pre-transplant PNI would predict short-term post-LT outcomes in deceased donor liver transplant (DDLT) patients. Methods: 451 patients underwent primary DDLT between 2013 and 2018 at Henry Ford Hospital. Re-transplants, multi-organ transplants, and living donor liver transplants were excluded. Pre-transplant PNI = 10 × [albumin (g/dL)] + 0.005 × [total lymphocyte count (/ÎŒL)]. PNI was analyzed as both a continuous and a categorical variable. ROC curves yielded an optimal PNI cutoff of 35, used to compare short-term outcomes between the PNI≄35 and PNI<35 groups. Results: Multivariable analysis associated PNI as a continuous variable with 1-year survival (HR=0.94, 95% CI=0.90-0.98; p=0.007). Of 451 patients, 215 (47.7%) had PNI<35 (Figure A). After risk adjustment, PNI<35 was associated with death (HR=2.44; p=0.047) and 1-year death (HR=2.47; p=0.018). Multivariable analysis revealed PNI<35 (HR=2.37; p=0.023) was an independent risk factor for patient death within 1 year (Figure B). Conclusion: Lower pre-transplant PNI portended worse short-term survival in DDLT patients. PNI may be useful in evaluating pre-transplant nutritional status and optimizing LT outcomes.
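    The PNI definition above is simple arithmetic on two labs, so it is easy to sketch; the function names and the sample patient values below are illustrative, not taken from the study:

```python
def prognostic_nutritional_index(albumin_g_dl, lymphocytes_per_ul):
    """Pre-transplant PNI = 10 * albumin (g/dL)
    + 0.005 * total lymphocyte count (/uL)."""
    return 10 * albumin_g_dl + 0.005 * lymphocytes_per_ul

def pni_group(pni, cutoff=35):
    """Dichotomize at the abstract's ROC-derived cutoff of 35."""
    return "PNI>=35" if pni >= cutoff else "PNI<35"

# Hypothetical patient: albumin 2.8 g/dL, lymphocytes 900 /uL.
pni = prognostic_nutritional_index(2.8, 900)  # 28.0 + 4.5 = 32.5
print(pni, pni_group(pni))  # 32.5 PNI<35
```

    Note the 0.005 weight means the lymphocyte term contributes only a few points at typical counts, so the albumin term dominates the index.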

    Thromboelastography and Liver Transplantation: A Target Group

    Liver dysfunction results in derangement of hemostasis and thrombosis. Thromboelastography (TEG) has emerged as a tool to guide resuscitative efforts. We aimed to identify a target population and to analyze the effects of TEG on blood product use and blood loss in LT. Adult patients (age >18 years) who received LT between 2014 and 2020 were retrospectively reviewed. Patients who underwent living donor, simultaneous or multi-organ transplants, or re-transplants were excluded. A subgroup analysis was performed based on INR at transplant: the median, 75th, and 90th percentiles of INR at transplant were used as cutoff values to classify patients into four categories: no, mild, moderate, and severe coagulopathy. Four hundred fifty-one patients met criteria and were separated into TEG (n=144) and non-TEG (n=307) groups. Median blood product use and blood loss were similar between the TEG and non-TEG groups (Table 1). In the subgroup analysis, there was a significant decrease in product use in the TEG group with moderate coagulopathy, and tranexamic acid (TXA) use was significantly higher in that group (Table 2). In the no, mild, and severe coagulopathy groups, there was no difference in product/TXA use or blood loss between the two groups. TEG-guided hemostasis and resuscitation in LT resulted in decreased product usage and greater utilization of TXA, likely through recognition of hyperfibrinolysis, in patients with moderate coagulopathy (INR between 2.2 and 2.8).
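    The subgroup scheme above (three cohort-derived INR cutoffs defining four coagulopathy groups) can be sketched as follows; the nearest-rank percentile rule and the sample INR values are assumptions for illustration, not the study's data:

```python
import math

def nearest_rank_percentile(values, p):
    """Nearest-rank percentile of a list of values (p in 0..100)."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

def coagulopathy_group(inr, p50, p75, p90):
    """Four categories from three INR cutoffs, as in the subgroup analysis."""
    if inr <= p50:
        return "no coagulopathy"
    if inr <= p75:
        return "mild"
    if inr <= p90:
        return "moderate"
    return "severe"

# Hypothetical cohort INRs at transplant.
inrs = [1.0, 1.2, 1.4, 1.6, 1.8, 2.0, 2.2, 2.4, 2.8, 3.5]
p50, p75, p90 = (nearest_rank_percentile(inrs, p) for p in (50, 75, 90))
print(p50, p75, p90)                           # 1.8 2.4 2.8
print(coagulopathy_group(2.5, p50, p75, p90))  # moderate
```

    Deriving the cutoffs from the cohort itself, rather than fixing them a priori, keeps the four groups at roughly 50/25/15/10 percent of patients.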

    Variations in transplant rates and post-transplant outcomes in liver transplantation based on season and climate regions in the United States

    Introduction: Cold climate is known to affect the frequency and attributable mortality of various illnesses. Whether similar trends exist among LT donors and recipients remains unknown. This study aims to evaluate the effect of season and region on the rates and outcomes of liver transplantation (LT). Methods: We analyzed data from the United Network for Organ Sharing (UNOS) registry for 50,668 adult patients (≄18 years) who underwent single-organ LT between 2010 and 2019. Patients were categorized by season: Summer (n=15,614), Winter (n=18,252), and Spring/Fall (n=16,802). Secondary analysis was performed after stratifying states by mean winter temperature (cold states, 0°-30°F; intermediate states, 30°-40°F; warm states, 40°-70°F; Figure 1). Post-LT outcomes were compared by season and state group using Cox proportional hazards models. Results: Deceased donors during winter were more likely to be older (>50 years, p<0.001), to have a higher BMI (>25, p<0.001), and to have died from cerebrovascular disease (p<0.001). Daily LT rates were significantly lower during winter (13.43 transplants/day, p<0.001), a difference even more pronounced in colder states (Figure 2). The adjusted risks of post-transplant graft loss (HR 1.116, p=0.001) and mortality (HR 1.172, p<0.001) at 1 year were higher in colder states than in warmer states. Worse post-transplant outcomes in colder states were observed regardless of season (winter HR 1.151, p=0.025; spring/fall HR 1.160, p=0.023; summer HR 1.217, p=0.004). Conclusion: Our study showed significantly lower liver transplant activity during winter. Colder states had worse post-LT outcomes regardless of season.

    Variation of liver and kidney transplant practice and outcomes during public holidays in the United States

    Background: Possible effects of holidays on organ transplant practice and outcomes have not been fully investigated. This study aims to compare the rates of liver and kidney transplantation (LT and KT) during different types of holidays and to explore whether post-transplant outcomes differ. Methods: We assessed rates of single-organ LT and KT from 2010 to 2018 for recipients aged ≄18 years using the UNOS database. Holidays included Easter/Spring break, Memorial Day, July 4th, Labor Day, Thanksgiving, and Christmas/New Year's (winter holidays). Patients were stratified by transplant timing into holiday ±3 days (LT: n=6,701; KT: n=15,718) and non-holiday periods (LT: n=43,967; KT: n=102,359). Risks of graft loss and mortality were analyzed using multivariable Cox regression models. Results: KT deceased donor and recipient characteristics were similar between holidays and non-holidays. LT deceased donors during holidays had shorter cold ischemia times (≀8 hr, p<0.001) and were less often marginal (DCD or >70 years, p=0.001). Compared to non-holidays, daily transplant numbers were lower during all holidays for both LT and KT (LT 15.5 vs. 15.2 transplants/day, p=0.18, Figure 1A; KT 32.4 vs. 32.1/day, p<0.001, Figure 1B). After risk adjustment, LT during holidays showed a lower risk of overall mortality (HR 0.921, p=0.013; Figure 2), while KT during holidays showed a slightly higher risk of overall graft failure (HR 1.038, p=0.052). Conclusion: Despite favorable donor characteristics, LT and KT activity was slower during holidays. Transplants during holidays were associated with significantly better LT outcomes but might lead to worse KT outcomes.

    Combined liver and lung transplantation with extended normothermic liver preservation using the Transmedics Organ Care System (OCS)ℱ Liver: A single center experience

    Combined liver-lung transplantation (CLLT) is indicated in patients who cannot survive single-organ transplantation alone. Ex-situ normothermic machine perfusion (NMP) has been used to increase the pool of suboptimal donors and has previously been used for extended normothermic lung preservation in CLLT. We describe our single-center experience using the Transmedics Organ Care System (OCS)ℱ Liver for extended normothermic liver preservation in CLLT. Results [values shown as mean (standard deviation)]: Four CLLTs were performed from 2015 to 2020, in 3 male and 1 female recipients aged 50 (±13.7) years (Table 1). Indications for lung transplantation were cystic fibrosis (CF) (n=1), severe bronchiectasis (n=1), and interstitial pulmonary fibrosis (n=2). Indications for liver transplantation were biliary cirrhosis secondary to CF (n=1), autoimmune hepatitis (n=1), alcoholic cirrhosis (n=1), and cryptogenic cirrhosis (n=1). The lung was transplanted first in all patients. Recipient characteristics at transplant: mean forced expiratory volume in 1 second (FEV1) was 51% (±22), and Model for End-Stage Liver Disease score was 12 (±3.7). The livers were donated after brain death, with donor age of 34 (±9.4) years and cold ischemia time of 566 (±38) minutes. Ex-vivo pump time for the livers was 411 (±38) minutes (Table 2). Mean hospital stay was 34 (±18) days. Over a median follow-up of 201 days, all patients were alive and doing well, while 50% had biopsy-proven acute cellular rejection of the liver. Conclusion: Normothermic extended liver preservation is a safe method to prolong perfusion time and preserve the liver during combined organ transplantation.