19 research outputs found

    Dietary Patterns and Hepatocellular Carcinoma Risk among US Adults

    The objective of this study was to assess the association between dietary patterns and risk of hepatocellular carcinoma (HCC) among US adults in a hospital-based case-control study. We analyzed data from 641 cases and 1002 controls recruited at The University of Texas MD Anderson Cancer Center during 2001-2018. Cases were patients with a pathologically or radiologically confirmed new diagnosis of HCC; controls were cancer-free spouses of patients with cancers other than gastrointestinal, lung, liver, or head and neck cancer. Cases and controls were frequency-matched by age and sex. Dietary patterns were identified by principal component analysis. Odds ratios (ORs) and corresponding 95% confidence intervals (CIs) were computed using unconditional logistic regression with adjustment for major HCC risk factors, including hepatitis B virus and hepatitis C virus infection. A vegetable-based dietary pattern was inversely associated with HCC risk (highest compared with lowest tertile: OR 0.66, 95% CI 0.46-0.94). A Western dietary pattern was directly associated with HCC risk (highest compared with lowest tertile: OR 1.79, 95% CI 1.19-2.69). These findings emphasize the potential role of dietary intake in HCC prevention and clinical management.
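
    The analysis described above (principal component analysis to derive dietary patterns, then unconditional logistic regression for adjusted ORs and 95% CIs) can be illustrated with a short, hypothetical sketch. The file name, food-group columns, and covariates below are assumptions for illustration only, not the study's actual data or code.

    # A minimal sketch, not the authors' code. Assumes a case-control table with
    # hypothetical numeric food-group columns (food_*), a binary case indicator,
    # and numerically coded covariates.
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    import statsmodels.api as sm

    df = pd.read_csv("ffq_case_control.csv")          # hypothetical food-frequency data
    food_cols = [c for c in df.columns if c.startswith("food_")]

    # Derive dietary-pattern scores from standardized food-group intakes.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(df[food_cols]))
    df["veg_pattern"] = pd.qcut(scores[:, 0], 3, labels=False)      # tertiles of pattern score
    df["western_pattern"] = pd.qcut(scores[:, 1], 3, labels=False)

    # Unconditional logistic regression adjusted for major HCC risk factors (hypothetical covariates).
    covars = ["age", "sex", "hbv", "hcv"]
    X = sm.add_constant(pd.get_dummies(df[["veg_pattern"] + covars],
                                       columns=["veg_pattern"], drop_first=True).astype(float))
    fit = sm.Logit(df["case"], X).fit(disp=0)
    print(np.exp(fit.params))      # adjusted ORs (tertile 2 and 3 vs. lowest tertile)
    print(np.exp(fit.conf_int()))  # corresponding 95% CIs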

    Regional differences in clinical presentation and prognosis of patients with post-sustained virologic response (SVR) hepatocellular carcinoma.

    Background & Aims: Widespread use of direct-acting antivirals (DAAs) for hepatitis C virus (HCV) infection has resulted in increased numbers of patients with hepatocellular carcinoma (HCC) after achieving sustained virologic response ('post-SVR HCC') worldwide. Few data compare regional differences in the presentation and prognosis of patients with post-SVR HCC. Methods: We identified patients with advanced fibrosis (F3/F4) who developed incident post-SVR HCC between March 2015 and October 2021 from 30 sites in Europe, North America, South America, the Middle East, South Asia, East Asia, and Southeast Asia. We compared patient demographics, liver dysfunction, and tumor burden by region. We compared overall survival by region using Kaplan-Meier analysis and identified factors associated with survival using multivariable Cox regression analysis. Results: Among 8,796 patients with advanced fibrosis or cirrhosis who achieved SVR, 583 (6.6%) developed incident HCC. There was marked regional variation in the proportion of cases detected by surveillance (range: 59.5-100%), median maximum tumor diameter (range: 1.8-5.0 cm), and the proportion with multinodular HCC (range: 15.4-60.8%). Prognosis varied widely by region (HR range: 1.82-9.92), with the highest survival in East Asia, North America, and South America, and the lowest in the Middle East and South Asia. After adjusting for geographic region, HCC surveillance was associated with early-stage detection (BCLC stage 0/A: 71.0% vs. 21.3%, p ...). Conclusions: Clinical characteristics, including early-stage detection, and prognosis of post-SVR HCC differed significantly across geographic regions. Surveillance utilization appears to be a high-yield intervention target to improve prognosis among patients with post-SVR HCC globally.
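
    The survival analyses named in the Methods (Kaplan-Meier curves by region and a multivariable Cox model) can be sketched as follows. The dataset, column names, and covariates are hypothetical placeholders, not the study's actual variables.

    # A minimal sketch, assuming a patient-level table with hypothetical columns:
    # months (follow-up), death (event indicator), region, age, bclc_early, surveillance.
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    df = pd.read_csv("post_svr_hcc.csv")   # hypothetical dataset

    # Kaplan-Meier overall survival by geographic region.
    kmf = KaplanMeierFitter()
    for region, grp in df.groupby("region"):
        kmf.fit(grp["months"], event_observed=grp["death"], label=region)
        print(region, kmf.median_survival_time_)

    # Multivariable Cox model for factors associated with survival.
    cph = CoxPHFitter()
    cph.fit(df[["months", "death", "age", "bclc_early", "surveillance"]],
            duration_col="months", event_col="death")
    cph.print_summary()   # hazard ratios with 95% CIs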

    Strategies to Improve Immune Suppression Post-Liver Transplantation: A Review

    No full text
    Since the first liver transplantation (LT), performed in 1967 by Thomas Starzl, efforts to increase survival and prevent rejection have been ongoing. The development of calcineurin inhibitors (CNIs) in the 1980s led to a marked improvement in post-transplantation survival, and since then, strategies to prevent graft loss and preserve long-term graft function have been prioritized. Allograft rejection is mediated by the host immune response to donor antigens. Prevention of rejection can be achieved through either immunosuppression or induction of tolerance. This presents a clinical dilemma: choosing an immunosuppressive agent is not an easy task, as these agents carry considerable patient- and graft-related morbidity. On the other hand, the induction of graft tolerance remains a challenge. Although the liver exhibits less rejection than any other transplanted organ, spontaneous graft tolerance is rare. Most immunosuppressive medications have been implicated in renal, cardiovascular, and neurological complications, relapse of viral hepatitis, and recurrence of HCC and other cancers. Efforts to minimize immunosuppression are directed toward decreasing medication side effects and economic burden and increasing cost-effectiveness, without increasing the risk of rejection. In this article, we discuss recent advances in strategies for improving immunosuppression following liver transplantation.

    Dawn-to-dusk dry fasting decreases circulating inflammatory cytokines in subjects with increased body mass index

    No full text
    Background: The circadian rhythm governs numerous metabolic processes, including sleep/wake cycles, body temperature regulation, hormone secretion, hepatic function, cellular plasticity, and cytokine release (inflammation), which appear to interact dynamically with one another. Studies have linked various cytokines to the chronic state of low-grade inflammation and oxidative stress in obesity. Dawn-to-dusk dry fasting (DDDF) could alleviate the adverse effects of obesity by decreasing inflammation. This study examined the effects of DDDF on circulating inflammatory cytokines in subjects with increased body mass index (BMI). Methods: This observational prospective study included adult subjects with a BMI equal to or greater than 25 kg/m2 who practiced the annual religious 30-day DDDF. Individuals with significant underlying medical conditions were excluded to limit confounding factors. All subjects were evaluated within two weeks before 30-day DDDF, within the fourth week of 30-day DDDF, and within two weeks after 30-day DDDF. Multiple cytokines and clinical health indicators were measured at each evaluation. Results: Thirteen subjects (10 men and 3 women) with a mean age of 32.9 years (SD = 9.7 years) and a mean BMI of 32 kg/m2 (SD = 4.6 kg/m2) were included. An overall decrease in the levels of multiple cytokines was observed with DDDF. A significant decrease in the mean interleukin 1 beta level was observed within the fourth week of 30-day DDDF (P = 0.045), which persisted even after the fasting period (P = 0.024). There was also a significant decrease in the mean levels of interleukin 15 (IL-15) (P = 0.014), interleukin 1 receptor antagonist (P = 0.041), macrophage-derived chemokine (MDC) (P = 0.013), and monokine induced by interferon gamma/chemokine (C-X-C motif) ligand 9 (P = 0.027) within the fourth week of 30-day DDDF, and in the mean levels of fibroblast growth factor 2 (P = 0.010), interleukin 12 p40 subunit (P = 0.038), interleukin 22 (P = 0.025), and tumor necrosis factor alpha (P = 0.046) within two weeks after 30-day DDDF. In terms of anthropometric parameters, there was a decrease in mean body weight (P = 0.032), BMI (P = 0.028), and hip circumference (P = 0.007) within the fourth week of 30-day DDDF, and a decrease in mean weight (P = 0.026), BMI (P = 0.033), and hip circumference (P = 0.016) within two weeks after 30-day DDDF, compared with the levels measured within two weeks before 30-day DDDF. Although there was no significant correlation between changes in weight and changes in circulating inflammatory cytokines, there was a significant positive correlation between changes in waist circumference and changes in specific inflammatory cytokines (e.g., IL-15, MDC, platelet-derived growth factor, soluble CD40L, vascular endothelial growth factor A) within the fourth week of 30-day DDDF and/or two weeks after 30-day DDDF. A significant decrease in mean resting heart rate within the fourth week of 30-day DDDF was also observed (P = 0.023), and changes in resting heart rate were positively correlated with changes in interleukin 8 levels within the fourth week of 30-day DDDF compared with baseline (r = 0.57, P = 0.042). Conclusion: DDDF appears to be a unique and potent intervention to reduce the low-grade chronic inflammation caused by obesity and visceral adiposity. Further studies with more extended follow-up periods are warranted to investigate the long-term anti-inflammatory benefits of DDDF in individuals with increased BMI.
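
    The within-subject comparisons and correlations reported above can be illustrated with a small sketch. The values below are invented placeholders (13 subjects, matching the sample size), not the study's data, and the paired t-test stands in for whichever paired test the authors used.

    # A minimal sketch with hypothetical data: paired comparison of a cytokine
    # before vs. during the fast, and correlation of anthropometric change with
    # cytokine change.
    import numpy as np
    from scipy import stats

    # Hypothetical IL-1 beta levels (pg/mL), one value per subject.
    il1b_before = np.array([2.1, 1.8, 2.6, 2.0, 2.4, 1.9, 2.2, 2.8, 2.5, 2.3, 1.7, 2.0, 2.6])
    il1b_week4  = np.array([1.6, 1.5, 2.1, 1.8, 2.0, 1.7, 1.9, 2.3, 2.2, 1.9, 1.5, 1.8, 2.1])

    # Paired test of the change from baseline to the fourth week of fasting.
    t, p = stats.ttest_rel(il1b_before, il1b_week4)
    print(f"paired t = {t:.2f}, P = {p:.3f}")

    # Correlation between change in waist circumference (cm) and change in a cytokine.
    delta_waist    = np.array([-2, -1, -4, 0, -3, -2, -1, -5, -3, -2, 0, -1, -4], dtype=float)
    delta_cytokine = il1b_week4 - il1b_before      # placeholder for the cytokine change of interest
    r, p = stats.pearsonr(delta_waist, delta_cytokine)
    print(f"Pearson r = {r:.2f}, P = {p:.3f}")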

    Posttransplant Outcome of Lean Compared With Obese Nonalcoholic Steatohepatitis in the United States: The Obesity Paradox

    No full text
    Morbid obesity is considered a relative contraindication for liver transplantation (LT). We investigated whether body mass index (BMI; lean versus obese) is a risk factor for post-LT graft and overall survival in nonalcoholic steatohepatitis (NASH) and non-NASH patients. Using the United Network for Organ Sharing (UNOS) database, LT recipients from January 2002 to June 2013 (age ≥18 years) with follow-up until 2017 were included. The association of BMI categories calculated at LT with graft and overall survival after LT was examined. After adjusting for confounders, all obesity cohorts (overweight and class 1, class 2, and class 3 obesity) among LT recipients for NASH had a significantly reduced risk of graft and patient loss at 10 years of follow-up compared with the lean BMI cohort. In contrast, the non-NASH group of LT recipients had no increased risk of graft and patient loss in the overweight, class 1, and class 2 obesity groups but had a significantly increased risk of graft (P < 0.001) and patient loss (P = 0.005) in the class 3 obesity group. In this retrospective analysis of the UNOS database of adult recipients selected for first LT, NASH patients with the lowest BMI had the worst long-term graft and patient survival, as opposed to non-NASH patients, in whom survival was worse with higher BMI.
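
    The approach described above (BMI categories formed at transplant and related to graft survival after adjustment for confounders) can be sketched as follows. The file, column names, cut points, and covariates are hypothetical; this is not the actual UNOS analysis.

    # A minimal sketch, assuming a registry-style table with hypothetical columns:
    # bmi, nash (0/1), graft_years (follow-up), graft_loss (event), age, meld.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("unos_lt_recipients.csv")   # hypothetical dataset

    # WHO-style BMI categories calculated at liver transplantation (assumed cut points).
    bins = [0, 18.5, 25, 30, 35, 40, 100]
    labels = ["lean", "normal", "overweight", "class1", "class2", "class3"]
    df["bmi_cat"] = pd.cut(df["bmi"], bins=bins, labels=labels, right=False)

    # Cox model for graft loss within the NASH subgroup, adjusted for hypothetical confounders;
    # drop_first makes the lean category the reference group.
    nash = df[df["nash"] == 1]
    X = pd.get_dummies(nash[["graft_years", "graft_loss", "age", "meld", "bmi_cat"]],
                       columns=["bmi_cat"], drop_first=True).astype(float)
    cph = CoxPHFitter()
    cph.fit(X, duration_col="graft_years", event_col="graft_loss")
    cph.print_summary()   # hazard ratios for each BMI category vs. the reference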