
    Elevated serum liver-type fatty acid binding protein levels in non-acetaminophen acute liver failure patients with organ dysfunction.

    Liver-type fatty acid binding protein (FABP1) has previously been demonstrated to improve prognostic discrimination in acetaminophen (APAP)-induced acute liver failure (ALF) but has not been investigated in other etiologies of ALF. AIM: To determine whether FABP1 levels (early: admission; late: days 3-5) are associated with 21-day transplant-free survival in non-APAP ALF. METHODS: FABP1 was measured by solid-phase enzyme-linked immunosorbent assay in serum samples from 384 ALF patients (n = 88 transplant-free survivors (TFS); n = 296 who died or underwent liver transplantation (LT), i.e. non-transplant-free survivors (NTFS)) and analyzed with US ALFSG registry data. RESULTS: Of the 384 ALF patients (autoimmune hepatitis n = 125, drug-induced liver injury n = 141, hepatitis B n = 118), 177 (46%) received LT. Early FABP1 levels were significantly higher in patients requiring vasopressor support (203.4 vs. 76.3 ng/mL) and renal replacement therapy (203.4 vs. 78.8 ng/mL; p < 0.001 for both). Late FABP1 levels were significantly higher in patients requiring mechanical ventilation (77.5 vs. 53.3 ng/mL) or vasopressor support (116.4 vs. 53.3 ng/mL), and in patients with grade 3/4 hepatic encephalopathy (71.4 vs. 51.4 ng/mL; p = 0.03 for all). Late FABP1 levels were significantly lower in TFS patients (TFS 54 vs. NTFS 66 ng/mL; p = 0.049), but admission levels were not (TFS 96 vs. NTFS 87 ng/mL; p = 0.67). After adjusting for significant covariates, serum FABP1 did not discriminate significantly between TFS patients and those who died or received LT by day 21 at either the admission (p = 0.29) or late (days 3-5; p = 0.087) time point. CONCLUSION: In this first report of FABP1 in non-APAP ALF, FABP1 levels at late time points (days 3-5) were significantly lower in patients who were alive without transplant at day 21, although not after adjusting for covariates reflecting severity of illness. Higher FABP1 levels were associated with increased organ dysfunction.

    Accelerated and interpretable oblique random survival forests

    The oblique random survival forest (RSF) is an ensemble supervised learning method for right-censored outcomes. Trees in the oblique RSF are grown using linear combinations of predictors to create branches, whereas in the standard RSF, a single predictor is used. Oblique RSF ensembles often have higher prediction accuracy than standard RSF ensembles. However, assessing all possible linear combinations of predictors induces significant computational overhead that limits applications to large-scale data sets. In addition, few methods have been developed for interpretation of oblique RSF ensembles, and they remain more difficult to interpret compared to their axis-based counterparts. We introduce a method to increase computational efficiency of the oblique RSF and a method to estimate importance of individual predictor variables with the oblique RSF. Our strategy to reduce computational overhead makes use of Newton-Raphson scoring, a classical optimization technique that we apply to the Cox partial likelihood function within each non-leaf node of decision trees. We estimate the importance of individual predictors for the oblique RSF by negating each coefficient used for the given predictor in linear combinations, and then computing the reduction in out-of-bag accuracy. In general benchmarking experiments, we find that our implementation of the oblique RSF is approximately 450 times faster with equivalent discrimination and superior Brier score compared to existing software for oblique RSFs. We find in simulation studies that 'negation importance' discriminates between relevant and irrelevant predictors more reliably than permutation importance, Shapley additive explanations, and a previously introduced technique to measure variable importance with oblique RSFs based on analysis of variance. Methods introduced in the current study are available in the aorsf R package. Comment: 40 pages, 6 figures
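    The negation-importance idea above can be sketched on a toy, single-node example. This is an illustrative assumption-laden sketch in Python, not the authors' aorsf R implementation: the data are synthetic, a least-squares fit stands in for Newton-Raphson scoring of the Cox partial likelihood, and plain classification accuracy stands in for out-of-bag survival accuracy.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: the outcome depends on x0 only; x1 is pure noise.
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] > 0).astype(int)

    # One oblique "node": a linear combination of predictors. Least squares
    # is a crude stand-in for the Newton-Raphson scoring used in aorsf.
    w, *_ = np.linalg.lstsq(X, 2 * y - 1, rcond=None)

    def accuracy(weights):
        return np.mean((X @ weights > 0).astype(int) == y)

    base = accuracy(w)
    importances = []
    for j in range(X.shape[1]):
        w_neg = w.copy()
        w_neg[j] = -w_neg[j]          # negate this predictor's coefficient
        imp = base - accuracy(w_neg)  # drop in accuracy = negation importance
        importances.append(imp)
        print(f"x{j}: negation importance = {imp:.3f}")
    ```

    The relevant predictor (x0) should show a large accuracy drop when its coefficient is flipped, while the noise predictor (x1) should show almost none; in the full method this is averaged over all non-leaf nodes and trees using out-of-bag data.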

    Infection of CD8+CD45RO+ Memory T-Cells by HIV-1 and Their Proliferative Response

    CD8+ T-cells are involved in controlling HIV-1 infection by eliminating infected cells and secreting soluble factors that inhibit viral replication. To investigate the mechanism and significance of infection of CD8+ T-cells by HIV-1 in vitro, we examined the susceptibility of these cells and their subsets to infection. CD8+ T-cells supported greater levels of replication with T-cell tropic strains of HIV-1, though viral production was lower than that observed in CD4+ T-cells. CD8+ T-cell infection was found to be productive through ELISA, RT-PCR and flow cytometric analyses. In addition, the CD8+CD45RO+ memory T-cell population supported higher levels of HIV-1 replication than CD8+CD45RA+ naïve T-cells. However, infection of CD8+CD45RO+ T-cells did not affect their proliferative response to the majority of mitogens tested. We conclude, based on numerous lines of evidence detecting and measuring infection of CD8+ T-cells and their subsets, that this cellular target and potential reservoir may be central to HIV-1 pathogenesis.

    Predicting in-hospital mortality in pneumonia-associated septic shock patients using a classification and regression tree: a nested cohort study

    Background: Pneumonia complicated by septic shock is associated with significant morbidity and mortality. Classification and regression tree methodology is an intuitive method for predicting clinical outcomes using binary splits. We aimed to improve the prediction of in-hospital mortality in patients with pneumonia and septic shock using decision tree analysis. Methods: Classification and regression tree models were applied to all patients with pneumonia-associated septic shock in the international, multicenter Cooperative Antimicrobial Therapy of Septic Shock database between 1996 and 2015. The association between clinical factors (time to appropriate antimicrobial therapy, severity of illness) and in-hospital mortality was evaluated. Accuracy in predicting clinical outcomes, sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) of the final model were evaluated in training (n = 2111) and testing (n = 2111) datasets. Results: The study cohort contained 4222 patients, and in-hospital mortality was 51%. The mean time from onset of shock to administration of appropriate antimicrobials was significantly higher for patients who died (17.2 h) than for those who survived (5.0 h). In the training dataset (n = 2111), a tree model using Acute Physiology and Chronic Health Evaluation II score, lactate, age, and time to appropriate antimicrobial therapy yielded an accuracy of 73% and an AUC of 0.75. In the testing dataset (n = 2111), accuracy was 69% and the AUC was 0.72. Conclusions: Overall mortality (51%) in patients with pneumonia complicated by septic shock is high. Increased time to administration of antimicrobial therapy, Acute Physiology and Chronic Health Evaluation II score, serum lactate, and age were associated with increased in-hospital mortality. Classification and regression tree methodology offers a simple prognostic model with good performance in predicting in-hospital mortality.
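    The modelling approach described above can be illustrated with a minimal sketch, using scikit-learn's CART implementation on simulated data. Everything below is an assumption for illustration (variable distributions, effect sizes, and the tree depth), not the CATSS cohort or the published model:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)
    n = 4000  # mirrors the cohort size; data themselves are simulated

    # The four predictors retained in the final tree (distributions assumed).
    apache2 = rng.normal(25, 8, n)         # severity-of-illness score
    lactate = rng.gamma(2.0, 2.0, n)       # serum lactate, mmol/L
    age = rng.normal(62, 15, n)            # years
    time_to_abx = rng.exponential(8.0, n)  # hours to appropriate therapy

    # Mortality risk rises with all four predictors (illustrative weights).
    logit = -6 + 0.10 * apache2 + 0.15 * lactate + 0.03 * age + 0.05 * time_to_abx
    died = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([apache2, lactate, age, time_to_abx])
    X_tr, X_te, y_tr, y_te = train_test_split(X, died, test_size=0.5, random_state=0)

    # A shallow tree keeps the binary splits interpretable at the bedside.
    tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, tree.predict_proba(X_te)[:, 1])
    print(f"test AUC = {auc:.2f}")
    ```

    The 50/50 train/test split mirrors the study design; constraining depth and leaf size is what makes the tree readable as a sequence of bedside cut-points rather than a black box.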

    A novel microRNA-based prognostic model outperforms standard prognostic models in patients with acetaminophen-induced acute liver failure.

    BACKGROUND & AIMS: Acetaminophen (APAP)-induced acute liver failure (ALF) remains the most common cause of ALF in the Western world. Conventional prognostic models, utilising markers of liver injury and organ failure, lack sensitivity for mortality prediction. We previously identified a microRNA signature that is associated with successful regeneration post-auxiliary liver transplant and with recovery from APAP-ALF. Herein, we aimed to use this microRNA signature to develop outcome prediction models for APAP-ALF. METHODS: We undertook a nested, case-control study using serum samples from 194 patients with APAP-ALF enrolled in the US ALF Study Group registry (1998-2014) at early (day 1-2) and late (day 3-5) time-points. A microRNA qPCR panel of 22 microRNAs was utilised to assess microRNA expression at both time-points. Multiple logistic regression was used to develop models which were compared to conventional prognostic models using the DeLong method. RESULTS: Individual microRNAs confer limited prognostic value when utilised in isolation. However, incorporating them within microRNA-based outcome prediction models increases their clinical utility. Our early time-point model (AUC = 0.78, 95% CI 0.71-0.84) contained a microRNA signature associated with liver regeneration and our late time-point model (AUC = 0.83, 95% CI 0.76-0.89) contained a microRNA signature associated with cell-death. Both models were enhanced when combined with model for end-stage liver disease (MELD) score and vasopressor use, and both outperformed the King's College criteria. The early time-point model combined with clinical parameters outperformed the ALF Study Group prognostic index and the MELD score. CONCLUSIONS: Our findings demonstrate that a regeneration-linked microRNA signature combined with readily available clinical parameters can outperform existing prognostic models for ALF in identifying patients with poor prognosis who may benefit from transplantation.
LAY SUMMARY: While acute liver failure can be reversible, some patients will die without a liver transplant. We show that blood test markers that measure the potential for liver recovery may help improve identification of patients unlikely to survive acute liver failure who may benefit from a liver transplant.
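    The core modelling strategy, fitting a multiple logistic regression on a biomarker panel plus clinical covariates and comparing discrimination against a clinical-only model, can be sketched as follows. This is a hedged illustration on simulated data: the panel size, coefficients, and event rate are assumptions, and discrimination is compared by held-out AUC rather than the DeLong test used in the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    n = 600

    # Stand-ins: a 5-marker microRNA panel (e.g. normalised qPCR values)
    # plus two clinical covariates from the abstract (MELD, vasopressor use).
    mirna = rng.normal(size=(n, 5))
    meld = rng.normal(30, 8, n)
    vasopressor = (rng.random(n) < 0.4).astype(float)

    # Simulated poor outcome (death / transplant); weights are illustrative.
    logit = -4 + mirna[:, 0] + 0.8 * mirna[:, 1] + 0.08 * meld + 1.2 * vasopressor
    poor_outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_clin = np.column_stack([meld, vasopressor])
    X_full = np.column_stack([mirna, meld, vasopressor])

    def holdout_auc(X, y):
        """Fit logistic regression on half the data, score AUC on the rest."""
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    auc_clin = holdout_auc(X_clin, poor_outcome)
    auc_full = holdout_auc(X_full, poor_outcome)
    print(f"clinical-only AUC:   {auc_clin:.2f}")
    print(f"miRNA + clinical AUC: {auc_full:.2f}")
    ```

    When the biomarker panel carries outcome signal beyond the clinical covariates, the combined model's AUC exceeds the clinical-only AUC, which is the pattern the study reports against MELD and the King's College criteria.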