A genomic signature for accurate classification and prediction of clinical outcomes in cancer patients treated with immune checkpoint blockade immunotherapy
Tumor mutational burden (TMB) is associated with clinical response to immunotherapy, but its application has been limited to a subset of cancer patients. We hypothesized that machine learning with appropriate modeling could identify the mutations that best classify patients most likely to derive clinical benefit. Training data: two sets of public whole-exome sequencing (WES) data for metastatic melanoma. Validation data: one set of public non-small cell lung cancer (NSCLC) data. Least Absolute Shrinkage and Selection Operator (LASSO) regression was used to identify a set of mutations (the biomarker) with maximum predictive accuracy, measured by the area under the receiver operating characteristic curve (AUROC). Kaplan-Meier and log-rank methods were used to test prediction of overall survival. The initial model considered 2139 mutations; after pruning, 161 mutations (11%) were retained. An optimal threshold of 0.41 divided patients into high-weight (HW) or low-weight (LW) TMB groups. Classification accuracy for HW-TMB was 100% (AUROC = 1.0) on the melanoma learning/testing data, and HW-TMB was a prognostic marker for longer overall survival. In the validation data, HW-TMB was associated with survival (p = 0.0057) and predicted 6-month clinical benefit (AUROC = 0.83) in NSCLC. In conclusion, we developed and validated a 161-mutation genomic signature that classified melanoma patients by likelihood of response to immunotherapy with 100% accuracy. This biomarker can be adapted for clinical practice to improve cancer treatment and care.
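As a rough illustration of the modeling pattern this abstract describes, the sketch below fits an L1-penalized (LASSO) logistic regression to a binary patient-by-mutation matrix, counts the mutations that survive pruning, scores AUROC, and applies a probability cut-off to split patients into HW/LW groups. The data, penalty strength, and variable names are invented for illustration; only the 2139-feature count and the 0.41 threshold echo the abstract, and this is not the authors' actual pipeline.

```python
# Hypothetical sketch of LASSO-based mutation-signature selection (synthetic data,
# not the published model).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 2139))  # patient x mutation matrix (0/1), synthetic
y = rng.integers(0, 2, size=200)          # 1 = clinical benefit, 0 = none, synthetic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The L1 penalty drives most mutation coefficients to exactly zero ("pruning").
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso.fit(X_train, y_train)

retained = np.flatnonzero(lasso.coef_[0])       # surviving signature mutations
prob = lasso.predict_proba(X_test)[:, 1]
print(f"{retained.size} mutations retained; AUROC = {roc_auc_score(y_test, prob):.2f}")

# A probability cut-off (0.41 in the abstract) splits patients into
# high-weight (HW) vs low-weight (LW) TMB groups for survival analysis.
hw_tmb = prob >= 0.41
```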
Fetal and early postnatal lead exposure measured in teeth associates with infant gut microbiota
BACKGROUND: Lead (Pb) is an environmentally ubiquitous heavy metal associated with a wide range of adverse health effects in children. Both lead exposure and the early life gut microbiome, which plays a critical role in human development, have been linked to similar health outcomes, but it is unclear whether the adverse effects of lead are partially driven by early life gut microbiota dysbiosis. The objective of this study was to examine the association between in utero and postnatal lead levels (measured in deciduous teeth) and the early life bacterial and fungal gut microbiota in the first year of life.
METHODS: Data from the Wayne County Health, Environment, Allergy and Asthma Longitudinal Study (WHEALS) birth cohort were analyzed. Tooth lead levels during the 2nd and 3rd trimesters and postnatally were quantified using high-resolution microspatial mapping of dentin growth rings. Early life microbiota were measured in stool samples collected at approximately 1 and 6 months of age, using both 16S rRNA (bacterial) and ITS2 (fungal) sequencing. Of the 1,258 maternal-child pairs in WHEALS, 146 had data on both tooth metals and the early life microbiome.
RESULTS: In utero tooth lead levels were significantly associated with gut fungal community composition at 1 month of age: higher 2nd-trimester tooth lead levels were associated with lower abundances of Candida and Aspergillus and higher abundances of Malassezia and Saccharomyces; 3rd-trimester lead was also associated with lower abundances of Candida. Although lead did not significantly associate with the overall structure of the infant gut bacterial community, it was associated with the abundance of specific bacterial taxa, including increased abundances of Collinsella and Bilophila and decreased abundances of Bacteroides taxa.
CONCLUSIONS: The observed associations between lead exposure and infant gut microbiota could play a role in the impact of lead on childhood development. Given the paucity of research examining these associations in humans, particularly for fungal microbiota, further investigation is needed.
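Taxon-level exposure associations of the kind reported above are typically estimated by regressing transformed abundances on the exposure, one taxon at a time. The sketch below shows that generic pattern: a centered log-ratio (CLR) transform of counts followed by per-taxon linear regression against tooth lead level. It is an assumed illustration with synthetic data and invented names, not the WHEALS analysis pipeline.

```python
# Hypothetical per-taxon association test (illustrative only, not the WHEALS pipeline).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
counts = rng.poisson(20, size=(146, 5)).astype(float) + 1  # 146 infants x 5 taxa, +1 pseudocount
lead = rng.lognormal(mean=0.0, sigma=1.0, size=146)        # 2nd-trimester tooth lead, synthetic

# Centered log-ratio transform to handle the compositionality of sequencing counts.
log_counts = np.log(counts)
clr = log_counts - log_counts.mean(axis=1, keepdims=True)

taxa = ["Candida", "Aspergillus", "Malassezia", "Saccharomyces", "Bacteroides"]
for name, abundance in zip(taxa, clr.T):
    slope, intercept, r, p, se = stats.linregress(np.log(lead), abundance)
    print(f"{name}: slope = {slope:+.3f}, p = {p:.3f}")
```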
Administration of Downstream ApoE Attenuates the Adverse Effect of Brain ABCA1 Deficiency on Stroke
The ATP-binding cassette transporter member A1 (ABCA1) and apolipoprotein E (ApoE) are major cholesterol transporters that play important roles in cholesterol homeostasis in the brain. Previous research demonstrated that specific deletion of brain ABCA1 (ABCA1−B/−B) reduced grey matter (GM) and white matter (WM) density in the ischemic brain and worsened functional outcomes after stroke. However, the downstream molecular mechanism underlying brain ABCA1-deficiency-induced deficits after stroke is not fully understood. Adult male ABCA1−B/−B and ABCA1-floxed control mice were subjected to distal middle cerebral artery occlusion and intraventricularly infused with either artificial mouse cerebrospinal fluid (vehicle control) or recombinant human ApoE2 into the ischemic brain, starting 24 h after stroke and continuing for 14 days. ApoE/apolipoprotein E receptor 2 (ApoER2)/high-density lipoprotein (HDL) levels, GM/WM remodeling, and functional outcome were measured. Although ApoE2 increased brain ApoE/HDL levels and GM/WM density, negligible functional improvement was observed in ABCA1-floxed stroke mice. ApoE2-administered ABCA1−B/−B stroke mice exhibited elevated levels of brain ApoE/ApoER2/HDL, increased GM/WM density, and neurogenesis in both the ischemic ipsilateral and contralateral brain, as well as improved neurological function, compared with vehicle-control ABCA1−B/−B stroke mice 14 days after stroke. Ischemic lesion volume did not differ significantly between the two groups. In vitro, supplementation of ApoE2 in primary cortical neurons and primary oligodendrocyte progenitor cells (OPCs) significantly increased ApoER2 expression and enhanced cholesterol uptake. ApoE2 promoted neurite outgrowth after oxygen-glucose deprivation, promoted axonal outgrowth of neurons, and increased proliferation/survival of OPCs derived from ABCA1−B/−B mice. Our data indicate that administration of ApoE2 minimizes the adverse effects of ABCA1 deficiency after stroke, at least partially by promoting cholesterol trafficking/redistribution and GM/WM remodeling via upregulation of the ApoE/HDL/ApoER2 signaling pathway.
The role of GDF15 (growth/differentiation factor 15) during prostate carcinogenesis
GDF15 (growth/differentiation factor 15), also known as MIC-1 (macrophage inhibitory cytokine 1), is a divergent member of the TGFβ superfamily of cytokines and is highly expressed in prostate tumors, but its role in prostate carcinogenesis and its utility as a prognostic biomarker are unclear. We studied 91 prostate cancer cases that underwent surgery as their primary treatment and were subsequently followed for biochemical recurrence (BCR). These cases also had a benign prostate biopsy at least one year before their prostate cancer diagnosis. In both the benign biopsy and tumor specimens, we quantified the intensity of GDF15 expression and characterized the presence of tumor-associated macrophages by measuring the density of CD68-positive cells and the M2 macrophage marker CD204 by immunohistochemical analysis. Marker expression was measured in a) benign biopsy, b) tumor-adjacent benign, and c) tumor tissue using an automated multi-image processing macro developed in ImageJ. Expression measurements were log2 transformed, and high/low cut-off points were selected to optimize the association of biomarker expression with BCR-free survival. A Cox proportional hazards model was used to test the association of time to BCR with low vs. high biomarker expression. During follow-up, 23 cases (25.2%) experienced BCR (96% of men without BCR had at least one year of follow-up). An increased hazard ratio (HR) for BCR was found in men with a higher ratio of GDF15 expression in tumor vs. tumor-adjacent benign tissue (HR = 3.74; 95% confidence interval (CI) = 1.27-10.99). Adjusting for tumor grade, pathological tumor stage, and PSA at diagnosis did not alter risk estimates significantly. In these same prostate tumor specimens, increased hazard ratios for BCR were found among men who had elevated CD204 expression in tumor (HR = 5.24; 95% CI = 2.02-13.62) and in tumor-adjacent benign tissue (HR = 3.29; 95% CI = 1.32-8.23). We found no association of BCR-free survival with either GDF15 or CD204 expression in pre-diagnostic benign biopsies. Our results suggest that men who have a larger difference in GDF15 expression between prostate tumor and tumor-adjacent benign tissue, and increased levels of M2 macrophages in both tumor and tumor-adjacent benign tissue, are at greater risk of disease recurrence. Further evaluation of differences in the prostate immune cellular profile between the pre-malignant and malignant states may offer additional insight into inflammation-mediated prostate carcinogenesis.
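A Cox proportional hazards analysis of time to BCR with a dichotomized expression marker, as described above, can be sketched as follows. The lifelines library, the synthetic data, and the cut-off shown are illustrative assumptions; only the cohort size of 91 and the log2 transform echo the abstract, and this is not the study's actual analysis.

```python
# Hypothetical Cox model of time to biochemical recurrence (synthetic, illustrative).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 91  # cohort size from the abstract
df = pd.DataFrame({
    "time_to_bcr": rng.exponential(60, n),    # months of follow-up, synthetic
    "bcr": rng.integers(0, 2, n),             # 1 = recurrence observed
    "gdf15_ratio": rng.lognormal(0, 0.5, n),  # tumor / tumor-adjacent GDF15 expression
})
# Dichotomize at an assumed cut-off (the study optimized this on log2 values).
df["gdf15_high"] = (np.log2(df["gdf15_ratio"]) > 0.5).astype(int)

cph = CoxPHFitter()
cph.fit(df[["time_to_bcr", "bcr", "gdf15_high"]],
        duration_col="time_to_bcr", event_col="bcr")
cph.print_summary()  # hazard ratio for high vs. low expression
```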
Hepatitis C patients with HIV co-infection demonstrate unique liver-related complications and health behaviors compared to HCV mono-infected patients
Background: Most studies of hepatitis C (HCV) and HIV co-infection focus on HIV cohorts and may not collect data regarding liver-related outcomes. We used data from the Chronic Hepatitis Cohort Study to investigate the impact of HIV on clinical characteristics and mental health in HCV patients. Methods: Patient demographics and clinical status were collected from the electronic health record from the date of diagnosis (HCV or HIV/HCV co-infection) onward. A subgroup provided survey data regarding health-related behaviors and mental health. Chi-square tests and two-sample t-tests were used to compare categorical and continuous variables, respectively, between mono- and co-infected patients. Results: Among 14,545 patients, 584 (4.0%) were HIV co-infected. Compared to mono-infected patients, co-infected patients were significantly younger and more likely to be male, African American, low-income, and publicly insured; less likely to see a liver specialist or receive HCV treatment; and demonstrated increased comorbidities, fibrosis/cirrhosis, hepatocellular carcinoma, and mortality. Among 5,008 survey respondents (4,885 HCV; 123 HIV/HCV), co-infected patients were significantly more likely to report drug/alcohol use and sex with multiple partners. Conclusion: HIV co-infection impacts a demographically distinct subset of HCV patients. Despite high rates of HIV treatment, co-infected patients were less likely to see a liver specialist or receive HCV-specific treatment than HCV mono-infected patients. Co-infected patients also demonstrated increased rates of liver complications and mortality, as well as high-risk behaviors.
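The mono- vs. co-infected comparisons in this and the following abstract rely on standard two-group tests: a two-sample t-test for continuous variables and a chi-square test on a contingency table for categorical ones. A minimal sketch with scipy is shown below; the data and counts are synthetic and the variable names are hypothetical.

```python
# Hypothetical two-group comparison of mono- vs. co-infected patients (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
age_mono = rng.normal(57, 10, 500)  # ages of mono-infected patients, synthetic
age_co = rng.normal(52, 10, 60)     # ages of co-infected patients, synthetic

# Continuous variable: two-sample t-test.
t, p_t = stats.ttest_ind(age_mono, age_co)

# Categorical variable: chi-square test on a 2x2 contingency table
# (rows: mono/co-infected; columns: e.g., received treatment yes/no).
table = np.array([[320, 180],
                  [25, 35]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
print(f"t-test p = {p_t:.3f}; chi-square p = {p_chi:.3f}")
```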
HIV co-infection is associated with increased liver complications and reduced mental health among patients with chronic hepatitis B
Background and Aims: Most studies of hepatitis B (HBV) and HIV co-infection focus on HIV cohorts and thus may not collect data regarding HBV-related treatment or outcomes. We used data from the Chronic Hepatitis Cohort Study (CHeCS), a racially and geographically diverse sample from four large US health systems, to investigate the impact of HIV on the clinical characteristics and mental health of patients with chronic hepatitis B. Method: Patient demographics and clinical status were collected from the electronic health record from the date of diagnosis (HBV or HIV/HBV co-infection) onward. A subgroup provided survey data regarding health-related behaviors and mental health. Chi-square tests and two-sample t-tests were used to compare categorical and continuous variables, respectively, between HBV mono-infected and HIV/HBV co-infected patients. Results: Among a sample of 4,640 HBV patients, 300 (6.5%) were HIV co-infected. HIV/HBV co-infected patients were significantly more likely to be male, African American or white, low-income, and publicly insured than HBV mono-infected patients. Despite high rates of antiviral treatment, co-infected patients also demonstrated increased comorbidities, fibrosis/cirrhosis, and mortality. Among a subgroup of 980 survey respondents (913 HBV, 67 HIV/HBV), co-infected patients were significantly more likely to be depressed, to have lower mental health scores, and to report being “highly stressed”. Conclusion: HIV co-infection impacts a demographically distinct subset of HBV patients. Despite high rates of antiviral treatment, co-infected patients demonstrate increased rates of liver-related complications and report poorer mental health outcomes.
Comparative analysis of direct-acting antiviral regimens across Hepatitis C genotypes and clinical characteristics among patients under routine clinical care in the US
Background: We used data from the Chronic Hepatitis Cohort Study (CHeCS) to compare the efficacy of direct-acting antiviral (DAA) regimens among “real world” patients whose clinical status, hepatitis C virus (HCV) genotype (GT), and treatment history may vary from those of previous reports. Methods: Multivariate analysis of achieving sustained virologic response (SVR) among 3612 HCV patients who received at least 12 weeks of treatment with one of eight DAA regimens through 2017: Generation 1, sofosbuvir (SOF) with ribavirin (RBV); Generation 2, with and without RBV: daclatasvir + SOF, grazoprevir + elbasvir, paritaprevir + ritonavir + ombitasvir (with and without dasabuvir), simeprevir + SOF, and SOF + ledipasvir; and Generation 3, with and without RBV: SOF + velpatasvir. Propensity scores were used to adjust for selection bias in adjuvant RBV treatment. Results: The overall observed SVR rate for 12-week regimens was 95%. However, the odds of achieving SVR varied significantly by sex (adjusted odds ratio [aOR] for male vs female patients = 0.70, 95% CI 0.56–0.87); use of proton pump inhibitors (aOR = 0.72, 95% CI 0.56–0.93); diabetes (aOR = 0.42, 95% CI 0.34–0.52); and cirrhosis status (aOR = 0.35, 95% CI 0.25–0.49 and 0.73, 95% CI 0.59–0.91 for decompensated and compensated cirrhosis, respectively, compared to no cirrhosis). Patients with GT3 had significantly lower odds of SVR than those with GT1 (aOR = 0.27, 95% CI 0.15–0.52) and GT2 (aOR = 0.09, 95% CI 0.05–0.19). Previous DAA treatment failure reduced the odds of SVR (aOR = 0.13, 95% CI 0.08–0.20). Each one-month increment of DAA treatment nearly doubled the odds of SVR (aOR = 1.94, 95% CI 1.73–2.16). With adjuvant RBV, each subsequent generation demonstrated better efficacy (Gen 1 vs 2: aOR = 0.32, 95% CI 0.21–0.47; Gen 1 vs 3: aOR = 0.03, 95% CI 0.01–0.11; Gen 2 vs 3: aOR = 0.10, 95% CI 0.03–0.32), but there was no difference between Gen 2 and Gen 3 in the absence of RBV (aOR = 1.11, 95% CI 0.67–1.83). Use of RBV doubled the odds of SVR among patients with decompensated cirrhosis (aOR = 2.08, 95% CI 1.13–3.85). Conclusion: Despite high rates of successful treatment response, patients who failed prior DAA treatment, and those with compensated or decompensated cirrhosis, remain at increased risk of treatment failure. Increased treatment duration (among all patients) and the use of RBV (among patients with decompensated cirrhosis) independently improved the odds of SVR. The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.
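The propensity-score adjustment for RBV selection bias described above generally follows a two-stage pattern: model the treatment assignment, then weight (or adjust) the outcome model. The sketch below uses inverse-probability-of-treatment weighting with statsmodels on synthetic data; the covariates, effect sizes, and weighting scheme are assumptions for illustration, not the CHeCS analysis.

```python
# Hypothetical propensity-score-weighted outcome model (synthetic, illustrative).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 3612  # treated cohort size from the abstract
cirrhosis = rng.integers(0, 2, n)
diabetes = rng.integers(0, 2, n)
rbv = rng.binomial(1, 0.3 + 0.3 * cirrhosis)       # RBV use depends on cirrhosis (assumed)
svr = rng.binomial(1, 0.95 - 0.1 * cirrhosis)      # sustained virologic response (synthetic)

# Stage 1: propensity of receiving RBV given observed covariates.
X_ps = sm.add_constant(np.column_stack([cirrhosis, diabetes]))
ps = sm.Logit(rbv, X_ps).fit(disp=0).predict(X_ps)

# Stage 2: inverse-probability weights in the SVR outcome model.
weights = np.where(rbv == 1, 1 / ps, 1 / (1 - ps))
X_out = sm.add_constant(np.column_stack([rbv, cirrhosis, diabetes]))
model = sm.GLM(svr, X_out, family=sm.families.Binomial(), freq_weights=weights).fit()
print(np.exp(model.params))  # adjusted odds ratios
```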
Inadequate response to UDCA among PBC patients under routine care in the US: Rising serum bilirubin even in the normal range is a risk factor and subsequent clinical follow-up differs based on treatment response
Background and aims: Ursodeoxycholic acid (UDCA) is a first-line treatment for patients with primary biliary cholangitis (PBC) that is often followed by second-line therapy if there is inadequate response (IR). Previous analyses by the Fibrotic Liver Disease Consortium showed that pre-treatment total bilirubin, even within the normal range, is associated with increased risk of mortality. In the present study, we analyzed the associations of pre-treatment bilirubin and other covariates with the risk of IR, and compared follow-up care between patients with and without IR. Method: Baseline data were collected for PBC patients at the time of UDCA initiation between 2006 and 2015. Total bilirubin was categorized as > 2, 2 to > 1.5, 1.5 to > 1.0, 1.0 to > 0.7, 0.7 to > 0.4, and ≤ 0.4 mg/dL. IR was defined using Paris II criteria 12 months after UDCA initiation. Logistic regression was used to estimate the adjusted risk of IR; model accuracy was assessed using the area under the receiver operating characteristic curve (AUROC). A Z-statistic was used to compare rates of follow-up care and treatment modification per person-year (PPY) between IR and non-IR patients. Results: Among 1578 UDCA-treated patients (13% men; 8% African American, 9% Asian American/American Indian/Pacific Islander (ASINPI); 25% Hispanic), 706 (45%) had IR to UDCA at 12 months post-baseline. The multivariate model (AUROC = 0.79) showed that younger age, increasing alkaline phosphatase (ALP), low albumin, and an aspartate-to-alanine aminotransferase ratio (AST/ALT) > 1.1 were independently associated with an increased risk of IR. Bilirubin, even in the high-normal (1.0 to > 0.7 mg/dL) and mid-normal (0.7 to > 0.4 mg/dL) ranges, was also significantly associated with increased risk of IR compared to low-normal levels (≤ 0.4 mg/dL; Figure). A sensitivity analysis that defined IR as ALP > 1.67×ULN yielded similar results. Compared to responders, patients with IR were more likely to discontinue UDCA (0.08 vs 0.04 PPY; p < 0.01); add obeticholic acid (0.023 vs 0.004 PPY; p < 0.01); see a specialist (5.12 vs 3.16 visits PPY; p < 0.01); undergo liver imaging (1.23 vs 0.56 tests PPY; p < 0.01); have liver-related laboratory testing (18.4 vs 10.2 tests PPY; p < 0.01); be hospitalized (0.11 vs 0.07 PPY; p < 0.01); and seek emergency care (0.13 vs 0.08 PPY; p < 0.01). Conclusion: Almost half (45%) of PBC patients in a routine clinical care cohort showed IR to UDCA. Baseline bilirubin > 0.4 mg/dL was associated with increasing risk of IR. Patients with IR had higher rates of specialist follow-up and health care utilization.
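Paris II response, the IR definition used above, is commonly operationalized as ALP and AST both ≤ 1.5× the upper limit of normal (ULN) plus normal total bilirubin at 12 months. A small helper illustrating that logic follows; the field names and ULN values are assumptions, and lab-specific ULNs would apply in practice.

```python
# Hypothetical Paris II inadequate-response check (field names and ULNs are assumed).
ALP_ULN = 120.0   # U/L, lab-specific upper limit of normal (assumed)
AST_ULN = 40.0    # U/L (assumed)
BILI_ULN = 1.2    # mg/dL (assumed)

def inadequate_response_paris2(alp: float, ast: float, bilirubin: float) -> bool:
    """Return True if the patient fails Paris II response criteria at 12 months:
    response requires ALP <= 1.5x ULN, AST <= 1.5x ULN, and normal bilirubin."""
    responded = (alp <= 1.5 * ALP_ULN and
                 ast <= 1.5 * AST_ULN and
                 bilirubin <= BILI_ULN)
    return not responded

# Example: elevated ALP at 12 months -> inadequate response.
print(inadequate_response_paris2(alp=250.0, ast=35.0, bilirubin=0.8))  # True
```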
Dynamic Risk Prediction of Response to Ursodeoxycholic Acid Among Patients with Primary Biliary Cholangitis in the USA
BACKGROUND: Ursodeoxycholic acid (UDCA) remains the first-line therapy for primary biliary cholangitis (PBC); however, inadequate treatment response (ITR) is common. The UK-PBC Consortium developed the modified UDCA Response Score (m-URS) to predict ITR (defined as alkaline phosphatase [ALP] > 1.67 times the upper limit of normal [ULN] at 12 months post-UDCA initiation). Using data from the US-based Fibrotic Liver Disease Consortium, we assessed the m-URS in our multi-racial cohort. We then used a dynamic modeling approach to improve prediction accuracy.
METHODS: Using data collected at the time of UDCA initiation, we assessed the m-URS using the original formula; then, by calibrating coefficients to our data, we also assessed whether it remained accurate when using Paris II criteria for ITR. Next, we developed and validated a dynamic risk prediction model that included post-UDCA initiation laboratory data.
RESULTS: Among 1578 patients (13% men; 8% African American, 9% Asian American/American Indian/Pacific Islander; 25% Hispanic), the rate of ITR was 27% using ALP > 1.67×ULN and 45% using Paris II criteria. m-URS accuracy was very good (AUROC = 0.87, sensitivity = 0.62, specificity = 0.82) for ALP > 1.67×ULN and moderate (AUROC = 0.74, sensitivity = 0.57, specificity = 0.70) for Paris II. Our dynamic model significantly improved accuracy for both definitions of ITR (ALP > 1.67×ULN: AUROC = 0.91; Paris II: AUROC = 0.81); specificity approached 100%. Roughly 9% of patients in our cohort were at the highest risk of ITR.
CONCLUSIONS: Early identification of patients who will not respond to UDCA treatment, using a dynamic prediction model based on longitudinal, repeated risk factor measurements, may facilitate earlier introduction of adjuvant treatment.
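The dynamic approach described in this abstract updates risk as post-UDCA laboratory values accrue. One common way to realize that is a landmark-style model that simply adds early follow-up measurements as predictors alongside baseline values. The sketch below is a hypothetical minimal version on synthetic data, not the Fibrotic Liver Disease Consortium model; only the cohort size of 1578 echoes the abstract.

```python
# Hypothetical landmark-style dynamic prediction of inadequate response (synthetic).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 1578  # cohort size from the abstract
alp_base = rng.lognormal(0.5, 0.4, n)             # ALP / ULN at UDCA initiation
alp_3mo = alp_base * rng.lognormal(-0.1, 0.2, n)  # ALP / ULN at 3 months
bili_base = rng.lognormal(-0.5, 0.3, n)           # bilirubin at initiation
itr = rng.binomial(1, 1 / (1 + np.exp(-(alp_3mo - 1.2) * 3)))  # synthetic outcome

# Baseline-only model vs. dynamic model that adds a post-initiation measurement.
X_base = np.column_stack([alp_base, bili_base])
X_dyn = np.column_stack([alp_base, bili_base, alp_3mo])
for name, X in [("baseline", X_base), ("dynamic", X_dyn)]:
    auc = roc_auc_score(itr, LogisticRegression().fit(X, itr).predict_proba(X)[:, 1])
    print(f"{name} AUROC = {auc:.2f}")
```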