
    Effects of Separate and Concomitant TLR-2 and TLR-4 Activation in Peripheral Blood Mononuclear Cells of Newborn and Adult Horses

    Deficient innate and adaptive immune responses cause newborn mammals to be more susceptible to bacterial infections than adult individuals. Toll-like receptors (TLRs) are known to play a pivotal role in bacterial recognition and subsequent immune responses. Several studies have indicated that activation of certain TLRs, in particular TLR-2, can result in suppression of inflammatory pathology. In this study, we isolated peripheral blood mononuclear cells (PBMCs) from adult and newborn horses to investigate the influence of TLR-2 activation on the inflammatory response mediated by TLR-4. Data were analysed in a Bayesian hierarchical linear regression model, accounting for variation between horses. In general, cytokine responses were lower in PBMCs derived from foals than in PBMCs from adult horses. Whereas in foal PBMCs the expression of TLR-2, TLR-4, and TLR-9 was not influenced by separate or concomitant TLR-2 and TLR-4 activation, in adult horse PBMCs both TLR ligands caused significant up-regulation of TLR-2 and down-regulation of TLR-9. Moreover, in adult horse PBMCs, interleukin-10 protein production and mRNA expression increased significantly following concomitant TLR-2 and TLR-4 activation (compared with TLR-4 activation alone). In foal PBMCs, this effect was not observed. In both adult and foal PBMCs, the lipopolysaccharide-induced pro-inflammatory response was not influenced by pre-incubation and co-stimulation with the specific TLR-2 ligand Pam3-Cys-Ser-Lys4. This indicates that published data from other species cannot be translated directly to the horse, and stresses the necessity of confirming results obtained in other species in the target animal. Future research should aim to identify other methods or substances that enhance TLR functionality and bacterial defence in foals, thereby lowering susceptibility to life-threatening infections during the first period of life.

    The association between Toll-like receptor 2 single-nucleotide polymorphisms and hepatocellular carcinoma susceptibility

    Background: Toll-like receptors (TLRs) are key innate immunity receptors participating in the immune response. Growing evidence suggests that variants of the TLR2 and TLR9 genes are associated with cancer progression. The present study aimed to investigate the association between single-nucleotide polymorphisms (SNPs) of TLR2/TLR9 and the risk of hepatocellular carcinoma (HCC). Methods: In this single-center case-control study, the SNaPshot method was used to genotype sequence variants of TLR2 and TLR9 in 211 patients with HCC and 232 control subjects. Results: Two synonymous SNPs in the exon of TLR2 were closely associated with the risk of HCC. Compared with carriers of the wild-type homozygous genotype (T/T), the risk of HCC decreased significantly in individuals carrying the heterozygous genotype (C/T) of rs3804099 (adjusted odds ratio (OR) 0.493, 95% CI 0.331-0.736, P < 0.01) and of rs3804100 (adjusted OR 0.509, 95% CI 0.342-0.759, P < 0.01). No significant association with HCC risk was found for the two TLR9 SNPs. The TLR2 haplotype TT was significantly associated with a decreased risk of HCC (OR 0.524, 95% CI 0.394-0.697, P < 0.001); conversely, the risk of HCC increased significantly in patients with the haplotype CC (OR 2.743, 95% CI 1.915-3.930, P < 0.001). Conclusions: These results suggest that the TLR2 rs3804099 C/T and rs3804100 C/T polymorphisms are closely associated with HCC, and that haplotypes composed of these two synonymous TLR2 SNPs have stronger effects on HCC susceptibility.
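    The adjusted odds ratios above come from a model fitted to the study's data, but the basic unadjusted calculation behind an OR and its 95% CI is a simple 2x2-table formula. A minimal sketch, using hypothetical genotype counts (not the study's data):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
             a = cases with the variant genotype, b = controls with it,
             c = cases without it,                d = controls without it."""
        or_ = (a * d) / (b * c)
        # standard error of log(OR) from the Woolf formula
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts for illustration only:
    or_, lo, hi = odds_ratio_ci(60, 110, 151, 122)
    ```

    An OR below 1 with an upper confidence limit also below 1, as reported for the C/T genotypes above, indicates a protective association.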

    Malnutrition in cirrhosis: More food for thought

    Malnutrition is highly prevalent in liver cirrhosis and its presence carries important prognostic implications. The clinical conditions and pathophysiological mechanisms that cause malnutrition in cirrhosis are multiple and interrelated. Anorexia and liver decompensation symptoms lead to poor dietary intake; metabolic changes characterised by elevated energy expenditure, reduced glycogen storage, an accelerated starvation response and protein catabolism result in muscle and fat wasting; and malabsorption renders the cirrhotic patient unable to fully absorb or utilise food that has been consumed. Malnutrition is therefore a considerable challenge to manage effectively, particularly as liver disease progresses. A high-energy, high-protein diet is recognised as the standard of care, yet patients struggle to follow this recommendation and there is limited evidence to guide malnutrition interventions in cirrhosis and liver transplantation. In this review, we seek to detail the factors which contribute to poor nutritional status in liver disease, and highlight complexities far greater than "poor appetite" or "reduced oral intake" leading to malnutrition. We also discuss management strategies to optimise nutritional status in this patient group, which target the interrelated mechanisms unique to advanced liver disease. Finally, future research requirements are suggested, to develop effective treatments for one of the most common and debilitating complications afflicting cirrhotic patients.

    Long-term safety and efficacy of tenofovir disoproxil fumarate substitution for hepatitis B immunoglobulin following liver transplantation

    Background and Aims: Limitations to the use of long-term hepatitis B immunoglobulin (HBIg) following liver transplantation for hepatitis B (HBV) have led to the substitution of HBIg with oral nucleos(t)ide analogue prophylaxis. We prospectively assessed the long-term safety and efficacy of switching from HBIg to tenofovir disoproxil fumarate (TDF). Methods: An open-label, multicenter switch study was conducted to evaluate the substitution of TDF for HBIg, whilst continuing lamivudine (LAM) therapy, in preventing the recurrence of HBV in patients who had been maintained as hepatitis B surface antigen (HBsAg)-negative post-transplantation for at least 12 months. Results: Eighteen patients were enrolled (median age 66 years, range 42-78 years); 84% were male, and 78% were on calcineurin inhibitors. Median time after transplantation was 14 years (range 5-19), and median duration of HBIg/LAM prior to the switch was 10 years (range 1-14). Median follow-up was 5 years (range 5-8). Of 18 patients, 16 (89%) remained HBsAg- and HBV DNA-negative at the end of follow-up. Two patients had re-emergence of HBsAg without detectable HBV DNA and no clinical sequelae. Creatinine clearance reduced significantly (median 59 mL/min to 51 mL/min, P = 0.03), necessitating dose reduction of TDF in six (33%) participants, with two eventually ceasing TDF. One patient switched back to HBIg by choice. All patients who changed therapy maintained an undetectable HBsAg. Conclusion: Substitution of HBIg with TDF in patients on LAM is well tolerated and effective for the long-term prevention of HBV recurrence post-transplantation. Renal dysfunction occurs frequently in the post-transplant setting and can require dose adjustment of TDF or a change of therapy.

    Epidemiology and outcomes of acute liver failure in Australia

    BACKGROUND: Acute liver failure (ALF) is a life-threatening syndrome with varying aetiologies requiring complex care and multidisciplinary management. Its changing incidence, aetiology and outcomes over the last 16 years in the Australian context remain uncertain. AIM: To describe the changing incidence, aetiology and outcomes of ALF in South-Eastern Australia. METHODS: The database of the Victorian Liver Transplant Unit was interrogated to identify all cases of ALF in adults (> 16 years) hospitalised between January 2002 and December 2017. Overall, 169 patients meeting criteria for ALF were identified. Demographics, aetiology of ALF, rates of transplantation and outcomes were collected for all patients. Transplant-free survival (TFS) and overall survival were assessed based on survival to discharge from hospital. Results were compared with data from a historical cohort from the same unit (1988-2001). RESULTS: Paracetamol was the most common aetiology of ALF, accounting for 50% of cases, with an increased incidence compared with the historical cohort (P = 0.046). Viral hepatitis and non-paracetamol drug- or toxin-induced liver injury accounted for 15% and 10% of cases respectively. TFS improved significantly compared with the historical cohort (52% vs 38%, P = 0.032). TFS was highest in paracetamol toxicity, with spontaneous recovery in 72% of cases compared with 31% of non-paracetamol ALF (P < 0.001). Fifty-nine patients were waitlisted for emergency liver transplantation; nine of these died while waiting for an organ to become available. Forty-two patients (25%) underwent emergency liver transplantation, with 1-, 3- and 5-year survival of 81%, 78% and 72% respectively. CONCLUSION: Paracetamol toxicity is the most common aetiology of ALF in South-Eastern Australia, with a rising incidence over 30 years. TFS has improved; however, it remains low in non-paracetamol ALF.
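    The survival comparison between cohorts (52% vs 38%, P = 0.032) is the kind of result a two-proportion test produces. A minimal sketch of a pooled two-proportion z-test, using hypothetical counts (the historical cohort size is not given in the abstract):

    ```python
    import math

    def two_proportion_z(x1, n1, x2, n2):
        """Two-sided pooled two-proportion z-test, as commonly used to
        compare event rates (e.g. transplant-free survival) between two cohorts."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        # two-sided p-value from the standard normal CDF via math.erf
        pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, pval

    # Hypothetical counts for illustration only (about 52% of 169 recent
    # patients vs about 38% of an assumed 118 historical patients):
    z, pval = two_proportion_z(88, 169, 45, 118)
    ```

    In practice a chi-squared or Fisher's exact test from a statistics library would typically be preferred; the z-test above is equivalent to the chi-squared test for a 2x2 table.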

    A Shift towards Pro-Inflammatory CD16+Monocyte Subsets with Preserved Cytokine Production Potential after Kidney Transplantation

    Background: The presence of monocyte-macrophage lineage cells in rejecting kidney transplants is associated with worse graft outcome. At present, it is still unclear how monocyte-macrophage related responses develop after transplantation. Here, we studied the dynamics and the phenotypic and functional characteristics of circulating monocytes during the first 6 months after transplantation, aiming to establish the differences between kidney transplant recipients and healthy individuals. Methods: The phenotype, activation status and cytokine production capacity of classical (CD14++CD16-), intermediate (CD14++CD16+) and non-classical (CD14+CD16++) monocytes were determined by flow cytometry, using a cross-sectional approach, in a cohort of 33 healthy individuals, 30 renal transplant recipients at transplantation, 19 recipients at 3 months and 16 recipients at 6 months after transplantation. Results: The percentage of both CD16+ monocyte subsets was significantly increased in transplant recipients compared with healthy individuals, indicative of triggered innate immunity. Conclusion: Our data demonstrate a skewed balance towards pro-inflammatory CD16+ monocytes that is present at the time of transplantation and retained for at least 6 months after transplantation. This shift could be one of the important drivers of early post-transplant cellular immunity.