
    Optimal Timing of Administration of Direct-Acting Antivirals for Patients with Hepatitis C-Associated Hepatocellular Carcinoma Undergoing Liver Transplantation

    Objective: To investigate the optimal timing of direct-acting antiviral (DAA) administration in patients with hepatitis C-associated hepatocellular carcinoma (HCC) undergoing liver transplantation (LT). Summary of Background Data: In patients with hepatitis C virus (HCV)-associated HCC undergoing LT, the optimal timing of DAA administration to achieve sustained virologic response (SVR) and improved oncologic outcomes remains debated. Methods: The United States HCC LT Consortium (2015–2019) was reviewed for patients with primary HCV-associated HCC who underwent LT and received DAA therapy at 20 institutions. Primary outcomes were SVR and HCC recurrence-free survival (RFS). Results: Of 857 patients, 725 were within Milan criteria. SVR was associated with improved 5-year RFS (92% vs 77%, P < 0.01). Patients who received DAAs pre-LT, 0–3 months post-LT, and ≥3 months post-LT had SVR rates of 91%, 92%, and 82%, and 5-year RFS of 93%, 94%, and 87%, respectively. Among 427 HCV treatment-naïve patients (no previous interferon therapy), those who achieved SVR with DAAs had improved 5-year RFS (93% vs 76%, P < 0.01). Patients who received DAAs pre-LT, 0–3 months post-LT, and ≥3 months post-LT had SVR rates of 91%, 93%, and 78% (P < 0.01) and 5-year RFS of 93%, 100%, and 83% (P = 0.01). Conclusions: The optimal timing of DAA therapy appears to be 0 to 3 months after LT for HCV-associated HCC, given increased rates of SVR and improved RFS. Delayed administration after transplant should be avoided. A prospective randomized controlled trial is warranted to validate these results.

    Nutrition Support in Liver Transplantation and Postoperative Recovery: The Effects of Vitamin D Level and Vitamin D Supplementation in Liver Transplantation

    Vitamin D plays an important role in liver transplantation. In addition to its significant effects on skeletal health, it exerts clinically relevant immunomodulatory properties. Vitamin D deficiency is a common nutritional issue in the perioperative period of liver transplantation (LT). Although vitamin D deficiency is known to contribute to higher incidences of acute cellular rejection (ACR) and graft failure in other solid organ transplants, such as kidney and lung, its role in LT is not well understood. The aim of this study was to investigate the clinical implications of vitamin D deficiency in LT. LT outcomes were reviewed in a retrospective cohort of 528 recipients during 2014–2019. In the pre-transplant period, 55% of patients were vitamin-D-deficient. The serum vitamin D level was correlated with the model for end-stage liver disease (MELD-Na) score. Vitamin D deficiency in the post-transplant period was associated with lower survival after LT, and post-transplant supplementation of vitamin D was associated with a lower risk of ACR. Optimizing vitamin D status through supplementation in the post-transplant period may prolong survival and reduce the incidence of ACR.

    Prognostic factors differ according to KRAS mutational status: A classification and regression tree model to define prognostic groups after hepatectomy for colorectal liver metastasis

    Background: Although KRAS mutation status is known to affect the prognosis of patients with colorectal liver metastasis, the hierarchical association between other prognostic factors and KRAS status is not fully understood. Methods: Patients who underwent a hepatectomy for colorectal liver metastasis were identified in a multi-institutional international database. A classification and regression tree (CART) model was constructed to investigate the hierarchical association between prognostic factors and overall survival relative to KRAS status. Results: Among 1,123 patients, 29.9% (n = 336) had a KRAS mutation. Among patients with wild-type KRAS (wtKRAS), the CART model identified the presence of metastatic lymph nodes as the most important prognostic factor, whereas among patients with mutant KRAS (mtKRAS), carcinoembryonic antigen (CEA) level was identified as the most important prognostic factor. Among patients with wtKRAS, the highest 5-year overall survival (68.5%) was noted among patients with node-negative primary colorectal cancer and a solitary colorectal liver metastasis <4.3 cm in size. In contrast, among patients with mtKRAS colorectal liver metastases, the highest 5-year overall survival (57.5%) was observed among patients with CEA <6 ng/mL. The CART model had higher prognostic accuracy than the Fong score (Akaike's Information Criterion [AIC]: wtKRAS, CART model 3334 vs Fong score 3341; mtKRAS, CART model 1356 vs Fong score 1396). Conclusion: Machine learning methodology outperformed the traditional Fong clinical risk score and identified different factors, based on KRAS mutational status, as predictors of long-term prognosis.
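    The model comparison in this abstract rests on Akaike's Information Criterion (AIC), where a lower value indicates a better trade-off between goodness of fit and model complexity: AIC = 2k − 2 ln L̂, with k the number of fitted parameters and L̂ the maximized likelihood. A minimal sketch of the calculation; the log-likelihoods and parameter counts below are hypothetical, chosen only to reproduce the reported wtKRAS values, as the abstract does not state them:

    ```python
    def aic(log_likelihood: float, k: int) -> float:
        """Akaike's Information Criterion: AIC = 2k - 2*ln(L-hat).
        Lower values indicate a better fit/complexity trade-off."""
        return 2 * k - 2 * log_likelihood

    # Hypothetical inputs for illustration; the abstract reports only the
    # final AIC values (e.g. CART 3334 vs Fong 3341 for wtKRAS patients).
    cart_aic = aic(log_likelihood=-1660.0, k=7)
    fong_aic = aic(log_likelihood=-1665.5, k=5)
    print(cart_aic, fong_aic)      # 3334.0 3341.0
    print(cart_aic < fong_aic)     # True: the lower-AIC model is preferred
    ```

    Note that AIC comparisons are only meaningful between models fitted to the same data, which is why the abstract reports wtKRAS and mtKRAS comparisons separately.
    
    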

    Predicting Lymph Node Metastasis in Intrahepatic Cholangiocarcinoma

    Background: The objective of the current study was to develop a model to predict the likelihood of occult lymph node metastasis (LNM) prior to resection of intrahepatic cholangiocarcinoma (ICC). Methods: Patients who underwent hepatectomy for ICC between 2000 and 2017 were identified using a multi-institutional database. A novel model incorporating clinical and preoperative imaging data was developed to predict LNM. Results: Among 980 patients who underwent resection of ICC, 190 (19.4%) individuals had at least one LNM identified on final pathology. An enhanced imaging model incorporating clinical and imaging data was developed to predict LNM. The performance of the enhanced imaging model was very good in the training data set (c-index 0.702) and in the validation data set with bootstrapping resamples (c-index 0.701), and it outperformed preoperative imaging alone (c-index 0.660). The novel model predicted both 5-year overall survival (OS) (low risk 48.4% vs. high risk 18.4%) and 5-year disease-specific survival (DSS) (low risk 51.9% vs. high risk 25.2%; both p < 0.001). When applied among Nx patients, the 5-year OS and DSS of low-risk Nx patients were comparable with those of N0 patients, while high-risk Nx patients had outcomes similar to N1 patients (p > 0.05). Conclusion: This tool may represent an opportunity to stratify the prognosis of Nx patients and can help inform clinical decision-making prior to resection of ICC.
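    Discrimination in this abstract is reported as the c-index (concordance index). For a binary endpoint such as occult LNM, it is the probability that a patient with the event received a higher predicted risk than a patient without it (ties counted as half). A stdlib-only sketch on toy data; all risk scores and outcomes below are invented for illustration:

    ```python
    from itertools import combinations

    def concordance_index(risk_scores, events):
        """C-index for a binary outcome: among all pairs with different
        outcomes, the fraction in which the event case received the higher
        predicted risk (ties count as 0.5)."""
        concordant, ties, usable = 0, 0, 0
        for i, j in combinations(range(len(events)), 2):
            if events[i] == events[j]:
                continue  # same-outcome pairs are not informative
            usable += 1
            # identify which pair member had the event (e.g. occult LNM)
            hi = i if events[i] else j
            lo = j if events[i] else i
            if risk_scores[hi] > risk_scores[lo]:
                concordant += 1
            elif risk_scores[hi] == risk_scores[lo]:
                ties += 1
        return (concordant + 0.5 * ties) / usable

    # Toy data: predicted risks and observed nodal status (1 = LNM found)
    scores = [0.9, 0.8, 0.35, 0.3, 0.2]
    events = [1, 0, 1, 0, 0]
    print(round(concordance_index(scores, events), 3))  # 0.833
    ```

    A c-index of 0.5 corresponds to random discrimination and 1.0 to perfect discrimination, which puts the reported values around 0.70 in context as moderate discrimination.
    
    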

    The Impact of Preoperative CA19-9 and CEA on Outcomes of Patients with Intrahepatic Cholangiocarcinoma

    Background: The objective of the current study was to assess the impact of serum CA19-9 and CEA, and their combination, on survival among patients undergoing surgery for intrahepatic cholangiocarcinoma (ICC). Methods: Patients who underwent curative-intent resection of ICC between 1990 and 2016 were identified using a multi-institutional database. Patients were categorized into four groups based on combinations of serum CA19-9 and CEA (low vs. high). Factors associated with 1-year mortality after hepatectomy were examined. Results: Among 588 patients, 5-year OS was considerably better among patients with low CA19-9/low CEA (54.5%) compared with low CA19-9/high CEA (14.6%), high CA19-9/low CEA (10.0%), or high CA19-9/high CEA (0%) (P < 0.001). No difference in 1-year OS existed between patients who had either high CA19-9 (high CA19-9/low CEA: 70.4%) or high CEA levels (low CA19-9/high CEA: 72.5%) (P = 0.92). Although patients with the most favorable tumor marker profile (low CA19-9/low CEA) had the best 1-year survival (87.9%), 15.1% (n = 39) still died within a year of surgery. Among patients with low CA19-9/low CEA, a high neutrophil-to-lymphocyte ratio (NLR) (odds ratio 1.09; 95% confidence interval 1.03–1.64) and large tumor size (odds ratio 3.34; 95% confidence interval 1.40–8.10) were associated with 1-year mortality (P < 0.05). Conclusions: Patients with a high CA19-9 and/or high CEA had poor 1-year survival. High NLR and large tumor size were associated with a greater risk of 1-year mortality among patients with a favorable tumor marker profile.
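    The four-group stratification in this study combines dichotomized CA19-9 and CEA. A minimal sketch of the grouping logic; the cutoffs below are commonly used laboratory upper reference limits and are assumptions for illustration only, as the abstract does not state the study's actual thresholds:

    ```python
    # Hypothetical cutoffs (common reference limits), NOT the study's values.
    CA19_9_CUTOFF = 37.0  # U/mL
    CEA_CUTOFF = 5.0      # ng/mL

    def marker_group(ca19_9: float, cea: float) -> str:
        """Assign one of the four CA19-9/CEA combination groups."""
        ca = "high" if ca19_9 > CA19_9_CUTOFF else "low"
        ce = "high" if cea > CEA_CUTOFF else "low"
        return f"{ca} CA19-9/{ce} CEA"

    print(marker_group(10.0, 2.1))   # low CA19-9/low CEA
    print(marker_group(120.0, 8.4))  # high CA19-9/high CEA
    ```
    
    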

    A Machine-Based Approach to Preoperatively Identify Patients with the Most and Least Benefit Associated with Resection for Intrahepatic Cholangiocarcinoma: An International Multi-institutional Analysis of 1146 Patients

    Accurate risk stratification and patient selection are necessary to identify patients who will benefit the most from surgery or be better treated with other, non-surgical treatment strategies. We sought to identify which patients in the preoperative setting would likely derive the most or least benefit from resection of intrahepatic cholangiocarcinoma (ICC).