
    Optimal Timing of Administration of Direct-Acting Antivirals for Patients with Hepatitis C-Associated Hepatocellular Carcinoma Undergoing Liver Transplantation

    Objective: To investigate the optimal timing of direct-acting antiviral (DAA) administration in patients with hepatitis C-associated hepatocellular carcinoma (HCC) undergoing liver transplantation (LT).

    Summary of Background Data: In patients with hepatitis C virus (HCV)-associated HCC undergoing LT, the optimal timing of DAA administration to achieve sustained virologic response (SVR) and improved oncologic outcomes remains a topic of much debate.

    Methods: The United States HCC LT Consortium (2015–2019) was reviewed for patients with primary HCV-associated HCC who underwent LT and received DAA therapy at 20 institutions. Primary outcomes were SVR and HCC recurrence-free survival (RFS).

    Results: Of 857 patients, 725 were within Milan criteria. SVR was associated with improved 5-year RFS (92% vs 77%, P < 0.01). Patients who received DAAs pre-LT, 0–3 months post-LT, and ≥3 months post-LT had SVR rates of 91%, 92%, and 82%, and 5-year RFS of 93%, 94%, and 87%, respectively. Among 427 HCV treatment-naïve patients (no previous interferon therapy), those who achieved SVR with DAAs had improved 5-year RFS (93% vs 76%, P < 0.01). Patients who received DAAs pre-LT, 0–3 months post-LT, and ≥3 months post-LT had SVR rates of 91%, 93%, and 78% (P < 0.01) and 5-year RFS of 93%, 100%, and 83% (P = 0.01), respectively.

    Conclusions: The optimal timing of DAA therapy appears to be 0 to 3 months after LT for HCV-associated HCC, given increased rates of SVR and improved RFS. Delayed administration after transplant should be avoided. A prospective randomized controlled trial is warranted to validate these results.

    Induction of transplantation tolerance in non-human primate preclinical models

    Short-term outcomes following organ transplantation have improved considerably since the availability of cyclosporine ushered in the modern era of immunosuppression. Despite this, many of the current limitations to progress in the field are directly related to the existing practice of relatively non-specific immunosuppression. These include increased risks of opportunistic infection and cancer, and toxicity associated with long-term immunosuppressive drug exposure. In addition, long-term graft loss continues to result in part from a failure to adequately control the anti-donor immune response. The development of a safe and reliable means of inducing tolerance would ameliorate these issues and improve the lives of transplant recipients. Yet given the improving clinical standard of care, the translation of new therapies has become appropriately more cautious and dependent on increasingly predictive preclinical models. While convenient and easy to use, rodent tolerance models have not to date reliably predicted a therapy's potential efficacy in humans. Non-human primates possess an immune system that more closely approximates that of humans and have served as a more rigorous preclinical testing ground for novel therapies. Therefore, prior to clinical application, tolerance regimens should be vetted in non-human primates to ensure that there is sufficient potential for efficacy to justify the risk of their use.