42 research outputs found

    A PROTEOMIC STUDY OF OXIDATIVE STRESS IN ALCOHOLIC LIVER DISEASE

    Alcoholic steatosis (AS) is the initial pathology associated with early-stage alcoholic liver disease (ALD) and is characterized by the accumulation of fat in the liver. AS is considered clinically benign because it is reversible, in contrast to alcoholic steatohepatitis (ASH), the next stage of ALD, which is mostly irreversible. A proteomic approach was used to investigate the molecular basis of AS and to identify representative biomarkers. Liver tissue proteins at different stages of steatosis from a rodent model of AS were separated by two-dimensional electrophoresis (2DE), followed by MALDI mass spectrometry (MS) identification of differentially expressed proteins. Expression levels of several proteins related to alcohol-induced oxidative stress, such as peroxiredoxin 6 (PRDX6) and aldehyde dehydrogenase 2 (ALDH2), were reduced 2- to 3-fold in ethanol-fed rats, suggesting an increase in oxidative stress. Several proteins involved in fatty acid and amino acid metabolism were found at increased expression levels, suggesting a higher energy demand upon chronic exposure to ethanol. To delineate the effects of fat accumulation from those of oxidative stress, an in vitro hepatocyte cell culture model of steatosis was developed. HepG2 cells loaded with oleic acid surprisingly demonstrated lower cytotoxicity upon oxidative challenge (based on lactate dehydrogenase activity) and inflammation (based on TNF-α-induced activation of the pro-inflammatory transcription factor NF-κB). We also examined the effect of oleic acid loading in HepG2 cells on protein carbonylation, an important irreversible protein modification during oxidative stress that leads to protein dysfunction and disease. Fat-loaded hepatocytes exposed to oxidative stress with tert-butyl hydroperoxide (TBHP) contained 17% less carbonylated protein than non-fat-loaded controls.
Mass spectrometric analysis of carbonylated proteins indicated that known classical markers of protein carbonylation (e.g., cytoskeletal proteins, chaperones) are not carbonylated in oleic acid-loaded HepG2 cells, suggesting that the protective effect of fat loading operates through interference with protein carbonylation. While counterintuitive to the general concept that AS increases oxidative stress, our fat-loading results suggest that low levels of fat may activate antioxidant pathways and ameliorate the effects of subsequent oxidative or inflammatory challenge.

    Trends in Kaposi's sarcoma-associated Herpesvirus antibodies prior to the development of HIV-associated Kaposi's sarcoma: a nested case-control study

    HIV-associated Kaposi's sarcoma (KS) is a public health challenge in sub-Saharan Africa since both the causative agent, Kaposi's sarcoma-associated herpesvirus (KSHV), and the major risk factor, HIV, are prevalent. In a nested case-control study within a long-standing clinical cohort in rural Uganda, we used stored sera to examine the evolution of antibody titres against the KSHV antigens K8.1 and latency-associated nuclear antigen (LANA) among 30 HIV-infected subjects who subsequently developed HIV-related KS (cases) and among 108 matched HIV/KSHV-coinfected controls who did not develop KS. Throughout the 6 years prior to diagnosis, antibody titres to K8.1 and LANA were significantly higher among cases than controls (p < 0.0001), and titres increased prior to diagnosis in the cases. K8.1 titres differed more between KS cases and controls than did LANA titres. These differences in titre between cases and controls suggest a role for lytic viral replication in the pathogenesis of HIV-related KS in this setting.

    CRL4^(AMBRA1) targets Elongin C for ubiquitination and degradation to modulate CRL5 signaling

    Multi‐subunit cullin‐RING ligases (CRLs) are the largest family of ubiquitin E3 ligases in humans. CRL activity is tightly regulated to prevent unintended substrate degradation or autocatalytic degradation of CRL subunits. Using a proteomics strategy, we discovered that CRL4^(AMBRA1) (CRL substrate receptor denoted in superscript) targets Elongin C (ELOC), the essential adapter protein of CRL5 complexes, for polyubiquitination and degradation. We showed that the ubiquitin ligase function of CRL4^(AMBRA1) is required to disrupt the assembly and attenuate the ligase activity of human CRL5^(SOCS3) and HIV‐1 CRL5^(VIF) complexes, as AMBRA1 depletion leads to hyperactivation of both CRL5 complexes. Moreover, CRL4^(AMBRA1) modulates interleukin‐6/STAT3 signaling and HIV‐1 infectivity, which are regulated by CRL5^(SOCS3) and CRL5^(VIF), respectively. Thus, by discovering a substrate of CRL4^(AMBRA1), ELOC, the shared adapter of CRL5 ubiquitin ligases, we uncovered a novel CRL cross‐regulation pathway.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) than in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
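The headline odds ratios above are risk-adjusted, but the raw counts reported in the abstract let a reader sanity-check them. The sketch below (our own illustration, not the study's analysis code) computes the crude, unadjusted OR of checklist use in middle- versus high-HDI countries from those counts:

```python
# Crude (unadjusted) odds ratio of checklist use before emergency
# laparotomy, middle- vs high-HDI countries, from the counts reported
# in the abstract. The published OR of 0.17 is adjusted for patient
# and disease factors, so this crude figure only approximates it.

def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds of the event in group A relative to group B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Middle-HDI: 753 of 1242; high-HDI: 2455 of 2741
crude = odds_ratio(753, 1242, 2455, 2741)
print(round(crude, 2))  # ~0.18, close to the adjusted OR of 0.17
```

That the crude and adjusted estimates nearly coincide suggests the HDI gradient in checklist use is not driven mainly by case mix.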

    Haematological consequences of acute uncomplicated falciparum malaria: a WorldWide Antimalarial Resistance Network pooled analysis of individual patient data

    Background: Plasmodium falciparum malaria is associated with anaemia-related morbidity, attributable to host, parasite and drug factors. We quantified the haematological response following treatment of uncomplicated P. falciparum malaria to identify the factors associated with malarial anaemia. Methods: Individual patient data from eligible antimalarial efficacy studies of uncomplicated P. falciparum malaria, available through the WorldWide Antimalarial Resistance Network data repository prior to August 2015, were pooled using standardised methodology. The haematological response over time was quantified using a multivariable linear mixed effects model with nonlinear terms for time, and the model was then used to estimate the mean haemoglobin at day of nadir and day 7. Multivariable logistic regression quantified risk factors for moderately severe anaemia (haemoglobin < 7 g/dL) at day 0, day 3 and day 7, as well as a fractional fall ≥ 25% at day 3 and day 7. Results: A total of 70,226 patients, recruited into 200 studies between 1991 and 2013, were included in the analysis: 50,859 (72.4%) enrolled in Africa, 18,451 (26.3%) in Asia and 916 (1.3%) in South America. The median haemoglobin concentration at presentation was 9.9 g/dL (range 5.0–19.7 g/dL) in Africa, 11.6 g/dL (range 5.0–20.0 g/dL) in Asia and 12.3 g/dL (range 6.9–17.9 g/dL) in South America. Moderately severe anaemia (Hb < 7 g/dL) was present in 8.4% (4284/50,859) of patients from Africa, 3.3% (606/18,451) from Asia and 0.1% (1/916) from South America. The nadir haemoglobin occurred on day 2 post-treatment, with a mean fall from baseline of 0.57 g/dL in Africa and 1.13 g/dL in Asia.
Independent risk factors for moderately severe anaemia on day 7, in both Africa and Asia, included moderately severe anaemia at baseline (adjusted odds ratio (AOR) = 16.10 and AOR = 23.00, respectively), young age (age < 1 compared to ≥ 12 years, AOR = 12.81 and AOR = 6.79, respectively), high parasitaemia (AOR = 1.78 and AOR = 1.58, respectively) and delayed parasite clearance (AOR = 2.44 and AOR = 2.59, respectively). In Asia, patients treated with an artemisinin-based regimen were at significantly greater risk of moderately severe anaemia on day 7 compared to those treated with a non-artemisinin-based regimen (AOR = 2.06 [95% CI 1.39–3.05], p < 0.001). Conclusions: In patients with uncomplicated P. falciparum malaria, the nadir haemoglobin occurs 2 days after starting treatment. Although artemisinin-based treatments increase the rate of parasite clearance, in Asia they are associated with a greater risk of anaemia during recovery.
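The two haematological endpoints used above are simple thresholds, which can be made concrete in a few lines. The function names below are ours, chosen for illustration; only the definitions (Hb < 7 g/dL, and a ≥ 25% fall from the day 0 baseline) come from the abstract:

```python
# Illustrative definitions of the two outcome measures described above:
# moderately severe anaemia (haemoglobin < 7 g/dL) and a fractional
# fall of >= 25% from baseline (day 0) haemoglobin.

MODERATELY_SEVERE_HB = 7.0  # g/dL threshold from the abstract

def moderately_severe_anaemia(hb_g_dl):
    """True if haemoglobin is below the moderately severe threshold."""
    return hb_g_dl < MODERATELY_SEVERE_HB

def fractional_fall(hb_day0, hb_day_t):
    """Fraction of baseline haemoglobin lost by day t."""
    return (hb_day0 - hb_day_t) / hb_day0

# The mean African fall of 0.57 g/dL from the median baseline of
# 9.9 g/dL is well below the 25% fractional-fall threshold:
print(fractional_fall(9.9, 9.9 - 0.57) >= 0.25)  # False
```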

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
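The abstract's adjusted OR of 3·20 for low- versus high-HDI settings can be contrasted with the crude figure implied by the reported percentages and denominators. The sketch below is our own back-of-envelope reconstruction, not the study's multilevel model; the counts are recovered by rounding percentage × denominator:

```python
# Recover approximate end-colostomy counts from the percentages and
# denominators reported above, then compute the crude low- vs high-HDI
# odds ratio. The published OR of 3.20 is risk-adjusted (malignancy,
# emergency surgery, delay, perforation), so the crude value differs.

def crude_or(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio of the event in group A vs group B."""
    return (events_a / (total_a - events_a)) / (events_b / (total_b - events_b))

low_colostomy = round(0.522 * 113)    # ~59 of 113 low-HDI patients
high_colostomy = round(0.189 * 1268)  # ~240 of 1268 high-HDI patients
print(round(crude_or(low_colostomy, 113, high_colostomy, 1268), 2))  # ~4.7
```

The crude estimate exceeds the adjusted one, consistent with the conclusion that part, but not all, of the colostomy excess in low-HDI settings is explained by case mix.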