
    Blocking Connexin-43 mediated hemichannel activity protects against early tubular injury in experimental chronic kidney disease

    Background: Tubulointerstitial fibrosis represents the key underlying pathology of Chronic Kidney Disease (CKD), yet treatment options remain limited. In this study, we investigated the role of connexin43 (Cx43) hemichannel-mediated adenosine triphosphate (ATP) release in purinergic-mediated disassembly of adherens and tight junction complexes in early tubular injury. Methods: Human primary proximal tubule epithelial cells (hPTECs) and clonal tubular epithelial cells (HK2) were treated with Transforming Growth Factor Beta1 (TGFβ1) ± apyrase, or ATPγS for 48 h. For inhibitor studies, cells were co-incubated with the Cx43 mimetic Peptide 5, or the purinergic receptor antagonists Suramin, A438079 or A804598. Immunoblotting, single-cell force spectroscopy and trans-epithelial electrical resistance assessed protein expression, cell-cell adhesion and paracellular permeability. Carboxyfluorescein uptake and biosensing measured hemichannel activity and real-time ATP release, whilst a heterozygous Cx43+/- mouse model with unilateral ureteral obstruction (UUO) assessed the role of Cx43 in vivo. Results: Immunohistochemistry of biopsy material from patients with diabetic nephropathy confirmed increased expression of the purinergic receptor P2X7. TGFβ1 increased Cx43-mediated hemichannel activity and ATP release in hPTECs and HK2 cells. The cytokine reduced maximum unbinding forces and cell-cell adhesion, which translated to increased paracellular permeability. These changes were reversed when cells were co-incubated with either Peptide 5 or P2-purinoceptor inhibitors. Cx43+/- mice did not exhibit the protein changes associated with early tubular injury in a UUO model of fibrosis. Conclusion: The data suggest that Cx43-mediated ATP release represents an initial trigger in early tubular injury via its actions on the adherens and tight junction complex. Since Cx43 is highly expressed in nephropathy, it represents a novel target for intervention in tubulointerstitial fibrosis in CKD.

    The six-minute walk test in community dwelling elderly: influence of health status.

    BACKGROUND: The six-minute walk test (6MWT) is a useful instrument for assessing the exercise capacity of elderly persons. The impact of health status on the 6MWT-distance in the elderly, however, remains unclear, reducing its value in clinical settings. The objective of this study was to investigate to what extent the 6MWT-distance in community-dwelling elderly is determined by health conditions. METHODS: One hundred and fifty-six community-dwelling elderly people (53 male, 103 female) were assessed for health status and performed the 6MWT. After clinical evaluation, electrocardiography and laboratory examination, participants were categorized into a stratified six-level classification system according to their health status, going from A (completely healthy) to D (signs of active disease at the moment of examination). RESULTS: The mean 6MWT-distance was 603 m (SD = 178). The 6MWT-distance decreased significantly with increasing age (ANOVA, p = 0.0001) and with worsening health status (ANCOVA, corrected for age, p < 0.001). A multiple linear regression model with health status, age and gender as independent variables explained 31% of the 6MWT-distance variability. Anthropometric measures (stature, weight and BMI) did not significantly improve the prediction model. A significant relationship between 6MWT-distance and stature was present only in category A (completely healthy). CONCLUSIONS: Significant differences in 6MWT-distance are observed according to health status in community-dwelling elderly persons. The proposed health categorization system for elderly people is able to distinguish persons with lower physical exercise capacity and can be useful when advising physical trainers for seniors.
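
The regression result above (health status, age and gender jointly explaining 31% of 6MWT-distance variability) can be illustrated with a minimal ordinary-least-squares sketch. All data below are synthetic and the coefficients are invented; only the model structure mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 156  # same sample size as the study; everything else is simulated

# Hypothetical predictors (values invented for illustration)
age = rng.uniform(65, 90, n)      # years
gender = rng.integers(0, 2, n)    # 0 = male, 1 = female
health = rng.integers(0, 6, n)    # 0 = category A (healthy) ... 5 = worst

# Synthetic 6MWT distance that declines with age and worsening health
distance = 900 - 4.0 * age - 40.0 * health - 15.0 * gender + rng.normal(0, 90, n)

# Ordinary least squares fit of distance ~ age + gender + health
X = np.column_stack([np.ones(n), age, gender, health])
beta, *_ = np.linalg.lstsq(X, distance, rcond=None)

# Proportion of variance explained: this R^2 plays the role of the
# study's reported 31% figure
pred = X @ beta
r2 = 1 - ((distance - pred) ** 2).sum() / ((distance - distance.mean()) ** 2).sum()
```

In a fit of this shape, the negative age and health coefficients correspond to the declines the abstract reports, and adding anthropometric columns to `X` would let one check whether they improve `r2`, as the authors did.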

    IL1RL1 Gene Variants and Nasopharyngeal IL1RL1-a Levels Are Associated with Severe RSV Bronchiolitis: A Multicenter Cohort Study

    Targets for intervention are required for respiratory syncytial virus (RSV) bronchiolitis, a common disease during infancy for which no effective treatment exists. Clinical and genetic studies indicate that IL1RL1 plays an important role in the development and exacerbations of asthma. Human IL1RL1 encodes three isoforms, including soluble IL1RL1-a, that can influence IL33 signalling by modifying inflammatory responses to epithelial damage. We hypothesized that IL1RL1 gene variants and soluble IL1RL1-a are associated with severe RSV bronchiolitis. We studied the association between RSV and three selected IL1RL1 single-nucleotide polymorphisms (rs1921622, rs11685480 and rs1420101) in 81 ventilated and 384 non-ventilated children under 1 year of age hospitalized with primary RSV bronchiolitis, in comparison to 930 healthy controls. Severe RSV infection was defined by the need for mechanical ventilation. Furthermore, we examined soluble IL1RL1-a concentration in nasopharyngeal aspirates from children hospitalized with primary RSV bronchiolitis. An association between SNP rs1921622 and disease severity was found at the allele and genotype level (p = 0.011 and p = 0.040, respectively). In hospitalized non-ventilated patients, RSV bronchiolitis was not associated with IL1RL1 genotypes. Median concentrations of soluble IL1RL1-a in nasopharyngeal aspirates were >20-fold higher in ventilated infants than in non-ventilated infants with RSV (median [quartiles] 9,357 [936-15,528] pg/ml vs. 405 [112-1,193] pg/ml, respectively; p < 0.001). We found a genetic link between the rs1921622 IL1RL1 polymorphism and disease severity in RSV bronchiolitis. The potential biological role of IL1RL1 in the pathogenesis of severe RSV bronchiolitis was further supported by high local concentrations of IL1RL1-a in children with the most severe disease. We speculate that IL1RL1-a modifies epithelial damage-mediated inflammatory responses during RSV bronchiolitis and thus may serve as a novel target for intervention to control disease severity.

    Neuropsychiatric Symptoms in Patients with Aortic Aneurysms

    BACKGROUND: Emerging evidence suggests that vascular disease confers vulnerability to late-onset depressive illness and to impairment of specific cognitive functions, most notably in the domains of memory storage and retrieval. Lower limb athero-thrombosis and abdominal aortic aneurysm (AAA) have both previously been associated with neuropsychiatric symptoms, possibly due to associated intracerebral vascular disease or systemic inflammation, suggesting that these illnesses may be regarded as models for investigating the vascular genesis of neuropsychiatric symptoms. The aim of this study was to compare neuropsychiatric symptoms such as depression, anxiety and a variety of cognitive domains in patients who had symptoms of peripheral athero-thrombosis (intermittent claudication) and those who had an asymptomatic AAA. METHODOLOGY/PRINCIPAL FINDINGS: In a cross-sectional study, 26 participants with either intermittent claudication or AAA were assessed using a detailed neuropsychiatric assessment battery covering various cognitive domains and depression and anxiety symptoms (Hamilton Depression and Anxiety Scales). Student's t test and linear regression analyses were applied to compare neuropsychiatric symptoms between patient groups. AAA participants showed greater levels of cognitive impairment in the domains of immediate and delayed memory compared to patients who had intermittent claudication. Cognitive dysfunction was best predicted by increasing aortic diameter. C-reactive protein (CRP) was positively related to AAA diameter, but not to cognitive function. AAA, and aortic diameter in particular, was associated with cognitive dysfunction in this study. CONCLUSIONS/SIGNIFICANCE: AAA patients are at higher risk of cognitive impairment than intermittent claudication patients. Validation of this finding is required in a larger study, but if confirmed it could suggest that systemic factors peculiar to AAA may impact on cognitive function.
    Bernhard T. Baune, Steven J. Unwin, Frances Quirk and Jonathan Golledg

    IL-10 Blocks the Development of Resistance to Re-Infection with Schistosoma mansoni

    Despite effective chemotherapy to treat schistosome infections, re-infection rates are extremely high. Resistance to reinfection can develop; however, it typically takes several years following numerous rounds of treatment and re-infection, and often develops in only a small cohort of individuals. Using a well-established and highly permissive mouse model, we investigated whether immunoregulatory mechanisms influence the development of resistance. Following praziquantel (PZQ) treatment of S. mansoni-infected mice, we observed a significant and mixed anti-worm response, characterized by Th1, Th2 and Th17 responses. Despite the elevated anti-worm response in peripheral blood mononuclear cells (PBMCs), liver, spleen and mesenteric lymph nodes, this did not confer any protection from a secondary challenge infection. Because a significant increase in IL-10-producing CD4+CD44+CD25+GITR+ lymphocytes was observed, we hypothesised that IL-10 was obstructing the development of resistance. Blockade of IL-10 combined with PZQ treatment afforded a greater than 50% reduction in parasite establishment during reinfection, compared to PZQ treatment alone, indicating that IL-10 obstructs the development of acquired resistance. The protection was characterized by markedly enhanced Th1, Th2 and Th17 responses, worm-specific IgG1, IgG2b and IgE, and circulating eosinophils. This study demonstrates that blocking IL-10 signalling during PZQ treatment can facilitate the development of protective immunity and provide a highly effective strategy to protect against reinfection with S. mansoni.

    Phylogenetic structure and host abundance drive disease pressure in communities

    Pathogens play an important part in shaping the structure and dynamics of natural communities, because species are not affected by them equally. A shared goal of ecology and epidemiology is to predict when a species is most vulnerable to disease. A leading hypothesis asserts that the impact of disease should increase with host abundance, producing a ‘rare-species advantage’. However, the impact of a pathogen may be decoupled from host abundance, because most pathogens infect more than one species, leading to pathogen spillover onto closely related species. Here we show that the phylogenetic and ecological structure of the surrounding community can be important predictors of disease pressure. We found that the amount of tissue lost to disease increased with the relative abundance of a species across a grassland plant community, and that this rare-species advantage had an additional phylogenetic component: disease pressure was stronger on species with many close relatives. We used a global model of pathogen sharing as a function of relatedness between hosts, which provided a robust predictor of relative disease pressure at the local scale. In our grassland, the total amount of disease was most accurately explained not by the abundance of the focal host alone, but by the abundance of all species in the community weighted by their phylogenetic distance to the host. Furthermore, the model strongly predicted observed disease pressure for 44 novel host species we introduced experimentally to our study site, providing evidence for a mechanism to explain why phylogenetically rare species are more likely to become invasive when introduced. Our results demonstrate how the phylogenetic and ecological structure of communities can have a key role in disease dynamics, with implications for the maintenance of biodiversity, biotic resistance against introduced weeds, and the success of managed plants in agriculture and forestry.
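
The key predictor described above, the abundance of all community members weighted by their phylogenetic distance to the focal host, can be sketched as follows. The community, the distance matrix and the exponential decay of pathogen sharing with distance are all invented for illustration and do not reproduce the paper's fitted model:

```python
import numpy as np

# Hypothetical community: relative abundances of four plant species and a
# matrix of pairwise phylogenetic distances (all values invented)
abundance = np.array([0.50, 0.25, 0.15, 0.10])
phylo_dist = np.array([
    [0.0, 1.0, 4.0, 6.0],
    [1.0, 0.0, 4.0, 6.0],
    [4.0, 4.0, 0.0, 6.0],
    [6.0, 6.0, 6.0, 0.0],
])

def weighted_abundance(focal, abundance, dist, decay=0.5):
    """Community abundance weighted by phylogenetic proximity to the focal
    host: abundant close relatives contribute most to predicted disease
    pressure, distant species contribute little."""
    w = np.exp(-decay * dist[focal])  # assumed shape of the sharing kernel
    return float(np.sum(w * abundance))

# Predicted relative disease pressure on each species in the community
pressure = [weighted_abundance(i, abundance, phylo_dist) for i in range(4)]
```

Under these invented numbers, species 0 (abundant, with an abundant close relative) receives the highest predicted pressure and species 3 (rare and phylogenetically isolated) the lowest, which is the qualitative pattern the abstract describes.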

    Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.

    BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This was a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 in high-, 1540 in middle-, and 507 in low-HDI countries). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) countries. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy remained associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
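
The propensity-score matched comparison mentioned in the results can be illustrated with a minimal nearest-neighbour matching sketch. The severity score, the selection mechanism and the matching rule below are all assumptions for illustration, not the study's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200

# Hypothetical case-mix severity score for each patient (higher = sicker)
severity = rng.normal(0.0, 1.0, n)

# Assumed selection mechanism: laparoscopy is more likely for less severe cases,
# which is exactly the confounding that matching is meant to remove
p_lap = 1.0 / (1.0 + np.exp(severity))
laparoscopy = rng.random(n) < p_lap

treated = np.where(laparoscopy)[0]
control = np.where(~laparoscopy)[0]

# Nearest-neighbour matching (with replacement) on the severity score
matches = np.array([
    control[np.argmin(np.abs(severity[control] - severity[t]))] for t in treated
])

# Covariate imbalance before vs. after matching: matching should shrink the gap,
# so that outcome comparisons within matched pairs are less confounded
gap_raw = abs(severity[treated].mean() - severity[control].mean())
gap_matched = abs(severity[treated].mean() - severity[matches].mean())
```

In a real analysis the score would be estimated from many case-mix variables (e.g. by logistic regression) rather than observed directly, and outcomes such as SSI would then be compared within the matched groups.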

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference -9.4 (95 per cent c.i. -11.9 to -6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
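
The pairing of an odds ratio with bootstrapped simulation can be illustrated with a crude, unadjusted sketch. The arm sizes and death counts below are invented (chosen so the crude OR lands near the reported 0.60), and the study's real estimate comes from a multivariable model, not this two-by-two comparison:

```python
import random

random.seed(1)

# Hypothetical arms (all counts invented for illustration)
deaths_cl, n_cl = 120, 3000   # checklist used: 4.0% mortality
deaths_no, n_no = 130, 2000   # no checklist: 6.5% mortality

def odds_ratio(d1, n1, d0, n0):
    """Odds of death with the checklist divided by odds without it."""
    return (d1 / (n1 - d1)) / (d0 / (n0 - d0))

def bootstrap_ci(reps=500):
    """Resample deaths in each arm at the observed rate and take the
    2.5th/97.5th percentiles of the recomputed odds ratios."""
    ors = []
    for _ in range(reps):
        d1 = sum(random.random() < deaths_cl / n_cl for _ in range(n_cl))
        d0 = sum(random.random() < deaths_no / n_no for _ in range(n_no))
        ors.append(odds_ratio(d1, n_cl, d0, n_no))
    ors.sort()
    return ors[int(0.025 * reps)], ors[int(0.975 * reps)]

point = odds_ratio(deaths_cl, n_cl, deaths_no, n_no)
low, high = bootstrap_ci()
```

An OR below 1 with a confidence interval that excludes 1 corresponds to the kind of protective association the abstract reports; the study's actual interval (0.50 to 0.73) additionally reflects adjustment for patient and disease factors.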