
    Pit latrines and their impacts on groundwater quality: a systematic review.

    Background: Pit latrines are one of the most common human excreta disposal systems in low-income countries, and their use is on the rise as countries aim to meet the sanitation-related target of the Millennium Development Goals. There is concern, however, that discharges of chemical and microbial contaminants from pit latrines to groundwater may negatively affect human health. Objectives: Our goals were to a) calculate global pit latrine coverage, b) systematically review empirical studies of the impacts of pit latrines on groundwater quality, c) evaluate latrine siting standards, and d) identify knowledge gaps regarding the potential for and consequences of groundwater contamination by latrines. Methods: We used existing survey and population data to calculate global pit latrine coverage. We reviewed the scientific literature on the occurrence of contaminants originating from pit latrines and considered the factors affecting transport of these contaminants. Data were extracted from peer-reviewed articles, books, and reports identified using Web of Science, PubMed, Google, and document reference lists. Discussion: We estimated that approximately 1.77 billion people use pit latrines as their primary means of sanitation. Studies of pit latrines and groundwater are limited and have generally focused on only a few indicator contaminants. Although groundwater contamination is frequently observed downstream of latrines, contaminant transport distances, recommendations based on empirical studies, and siting guidelines are variable and not well aligned with one another. Conclusions: In order to improve environmental and human health, future research should examine a larger set of contextual variables, improve measurement approaches, and develop better criteria for siting pit latrines.
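
    The coverage estimate described in the Methods amounts to combining per-country population totals with surveyed fractions of pit latrine use. A minimal sketch of that kind of calculation is below; the country names and figures are illustrative placeholders, not the survey data used in the review.

```python
# Illustrative sketch: estimate total pit latrine users by combining
# per-country population totals with surveyed usage fractions.
# All figures below are placeholders, not values from the review.
survey_data = {
    # country: (population, fraction of population using pit latrines)
    "Country A": (45_000_000, 0.62),
    "Country B": (160_000_000, 0.48),
    "Country C": (30_000_000, 0.35),
}

def pit_latrine_users(data):
    """Sum population x usage fraction over all countries."""
    return sum(pop * frac for pop, frac in data.values())

total = pit_latrine_users(survey_data)
print(f"Estimated pit latrine users: {total / 1e9:.2f} billion")
```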

    Incidence of hypertension in people with HIV who are treated with integrase inhibitors versus other antiretroviral regimens in the RESPOND cohort consortium

    OBJECTIVE: To compare the incidence of hypertension in people living with HIV receiving integrase strand transfer inhibitor (INSTI)-based antiretroviral therapy (ART) versus non-nucleoside reverse transcriptase inhibitors (NNRTIs) or boosted protease inhibitors (PIs) in the RESPOND consortium of HIV cohorts. METHODS: Eligible people with HIV were aged ≥18 years, had initiated a new three-drug ART regimen for the first time (baseline), did not have hypertension, and had at least two follow-up blood pressure (BP) measurements. Hypertension was defined as two consecutive systolic BP measurements ≥140 mmHg and/or diastolic BP ≥90 mmHg, or initiation of antihypertensives. Multivariable Poisson regression was used to determine adjusted incidence rate ratios (aIRRs) of hypertension, overall and in those who were ART naïve or experienced at baseline. RESULTS: Overall, 4606 people living with HIV were eligible (INSTIs 3164, NNRTIs 807, PIs 635). The median baseline systolic BP, diastolic BP, and age were 120 (interquartile range [IQR] 113-130) mmHg, 78 (70-82) mmHg, and 43 (34-50) years, respectively. Over 8380.4 person-years (median follow-up 1.5 [IQR 1.0-2.7] years), 1058 (23.0%) participants developed hypertension (incidence rate 126.2/1000 person-years, 95% confidence interval [CI] 118.9-134.1). Participants receiving INSTIs had a higher incidence of hypertension than those receiving NNRTIs (aIRR 1.76; 95% CI 1.47-2.11), whereas the incidence was similar to that in those receiving PIs (aIRR 1.07; 95% CI 0.89-1.29). The results were similar when the analysis was stratified by ART status at baseline. CONCLUSION: Although unmeasured confounding and channelling bias cannot be excluded, INSTIs were associated with a higher incidence of hypertension than were NNRTIs, but rates were similar to those with PIs, overall and in ART-naïve and ART-experienced participants within RESPOND.
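
    The adjusted incidence rate ratios reported above come from multivariable Poisson regression with follow-up time as the exposure. A hedged sketch of that kind of model is shown below using statsmodels; the simulated data, column names, and the small covariate set are assumptions for illustration, not the RESPOND analysis code.

```python
# Hedged sketch: Poisson regression for hypertension incidence with a
# log person-years offset, so exponentiated coefficients are incidence
# rate ratios (IRRs). Data and covariates are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "events": rng.poisson(0.2, n),                       # incident hypertension per person
    "person_years": rng.uniform(0.5, 3.0, n),            # follow-up time
    "regimen": rng.choice(["NNRTI", "INSTI", "PI"], n),  # ART class at baseline
    "age": rng.normal(43, 10, n),
    "baseline_sbp": rng.normal(120, 10, n),
})

# NNRTI as the reference category, mirroring the comparison described above.
model = smf.glm(
    "events ~ C(regimen, Treatment('NNRTI')) + age + baseline_sbp",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

print(np.exp(model.params))  # exponentiated coefficients = adjusted IRRs
```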

    miR-34a Promotes Vascular Smooth Muscle Cell Calcification by Downregulating SIRT1 (Sirtuin 1) and Axl (AXL Receptor Tyrosine Kinase).

    Objective: Vascular calcification (VC) is age dependent and a risk factor for cardiovascular and all-cause mortality. VC involves the senescence-induced transdifferentiation of vascular smooth muscle cells (SMCs) toward an osteochondrogenic lineage, resulting in arterial wall mineralization. miR-34a increases with age in aortas and induces vascular SMC senescence through the modulation of its target SIRT1 (sirtuin 1). In this study, we aimed to investigate whether miR-34a regulates VC. Approach and Results: We found that miR-34a and Runx2 (Runt-related transcription factor 2) expression correlates in young and old mice. Mir34a+/+ and Mir34a-/- mice were treated with vitamin D, and calcium quantification revealed that Mir34a deficiency reduces soft tissue and aorta medial calcification and the upregulation of the VC markers Sox9 (SRY [sex-determining region Y]-box 9) and Runx2 and the senescence markers p16 and p21. In this model, miR-34a upregulation was transient and preceded aorta mineralization. Mir34a-/- SMCs were less prone to undergo senescence and, under osteogenic conditions, deposited less calcium compared with Mir34a+/+ cells. Furthermore, unlike in Mir34a+/+ SMCs, the known VC inhibitors SIRT1 and Axl (AXL receptor tyrosine kinase) were only partially downregulated in calcifying Mir34a-/- SMCs. Strikingly, constitutive miR-34a overexpression to senescence-like levels in human aortic SMCs increased calcium deposition and enhanced the decrease of Axl and SIRT1 during calcification. Notably, we also showed that miR-34a directly decreased Axl expression in human aortic SMCs, and restoration of its levels partially rescued miR-34a-dependent growth arrest. Conclusions: miR-34a promotes VC via vascular SMC mineralization by inhibiting cell proliferation and inducing senescence through direct Axl and SIRT1 downregulation, respectively. This miRNA could be a good therapeutic target for the treatment of VC.

    Determinants of passive antibody efficacy in SARS-CoV-2 infection: a systematic review and meta-analysis

    Background: Randomised controlled trials of passive antibodies as treatment and prophylaxis for COVID-19 have reported variable efficacy. However, the determinants of efficacy have not been identified. We aimed to assess how the dose and timing of administration affect treatment outcome. Methods: In this systematic review and meta-analysis, we extracted data from published studies of passive antibody treatment from Jan 1, 2019, to Jan 31, 2023, that were identified by searching multiple databases, including MEDLINE, PubMed, and ClinicalTrials.gov. We included only randomised controlled trials of passive antibody administration for the prevention or treatment of COVID-19. To compare administered antibody dose between different treatments, we used data on in-vitro neutralisation titres to normalise dose by antibody potency. We used mixed-effects regression and model fitting to analyse the relationship between timing, dose, and efficacy. Findings: We found 58 randomised controlled trials that investigated passive antibody therapies for the treatment or prevention of COVID-19. Earlier clinical stage at treatment initiation was highly predictive of the efficacy of both monoclonal antibodies (p<0·0001) and convalescent plasma therapy (p=0·030) in preventing progression to subsequent stages, with either prophylaxis or treatment in outpatients showing the greatest effects. For the treatment of outpatients with COVID-19, we found a significant association between the dose administered and efficacy in preventing hospitalisation (relative risk 0·77; p<0·0001). Using this relationship, we predicted that no approved monoclonal antibody was expected to provide more than 30% efficacy against some omicron (B.1.1.529) subvariants, such as BQ.1.1. Interpretation: Early administration before hospitalisation and sufficient doses of passive antibody therapy are crucial to achieving high efficacy in preventing clinical progression. The relationship between dose and efficacy provides a framework for the rational assessment of future passive antibody prophylaxis and treatment strategies for COVID-19. Funding: The Australian Government Department of Health, Medical Research Future Fund, National Health and Medical Research Council, the University of New South Wales, Monash University, Haematology Society of Australia and New Zealand, Leukaemia Foundation, and the Victorian Government.
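
    The dose normalisation described in the Methods can be thought of as scaling each administered dose by the product's in-vitro neutralisation potency before relating dose to efficacy. The sketch below illustrates the idea with a simple linear fit; the trial values, the potency measure, and the ordinary linear model are illustrative assumptions, not the meta-analysis itself.

```python
# Hedged sketch: express each administered dose in potency-adjusted units
# (dose scaled by an in-vitro neutralisation measure), then relate log dose
# to observed efficacy. All numbers are placeholders.
import numpy as np
import statsmodels.api as sm

trials = [
    # administered dose (mg), relative neutralisation potency, efficacy vs hospitalisation
    {"dose_mg": 700,  "potency": 2.0, "efficacy": 0.70},
    {"dose_mg": 1200, "potency": 1.5, "efficacy": 0.80},
    {"dose_mg": 300,  "potency": 0.4, "efficacy": 0.45},
    {"dose_mg": 600,  "potency": 0.8, "efficacy": 0.55},
]

log_norm_dose = np.log10([t["dose_mg"] * t["potency"] for t in trials])
efficacy = np.array([t["efficacy"] for t in trials])

# Simple linear fit of efficacy against log potency-normalised dose
# (a stand-in for the mixed-effects meta-regression described above).
fit = sm.OLS(efficacy, sm.add_constant(log_norm_dose)).fit()
print(fit.params)  # intercept and slope of the dose-response relationship
```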

    Human immunotypes impose selection on viral genotypes through viral epitope specificity

    BACKGROUND: Understanding the genetic interplay between human hosts and infectious pathogens is crucial for how we interpret virulence factors. Here, we tested for associations between HIV and host genetics, and interactive genetic effects on viral load (VL), in HIV+ ART-naive clinical trial participants. METHODS: HIV genomes were sequenced, and the encoded amino acid (AA) variants were associated with VL, human single nucleotide polymorphisms (SNPs), and imputed HLA alleles using generalized linear models with Bonferroni correction. RESULTS: Human (388,501 SNPs) and HIV (3,010 variants) genetic data were available for 2,122 persons. Four HIV variants were associated with VL (p-values < 1.66×10⁻⁵). Twelve HIV variants were associated with a range of 1-512 human SNPs (p-values < 4.28×10⁻¹¹). We found 46 associations between HLA alleles and HIV variants (p-values < 1.29×10⁻⁷). HIV variants and immunotypes, when analyzed separately, were associated with lower VL, whereas the opposite was true when they were analyzed in concert. Epitope binding prediction showed HLA alleles to be weaker binders of associated HIV AA variants relative to alternative variants at the same position. CONCLUSIONS: Our results show the importance of immunotype specificity for viral antigenic determinants, and the identified genetic interplay emphasizes that viral and human genetics should be studied in the context of each other.
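
    The association testing described above boils down to regressing viral load on each viral amino-acid variant and applying a Bonferroni-corrected significance threshold across all tests. A hedged sketch follows; the simulated genotypes and the use of an ordinary least-squares fit as a stand-in for the study's generalized linear models are assumptions for illustration.

```python
# Hedged sketch: test each HIV amino-acid variant for association with
# (log) viral load and keep hits passing a Bonferroni-corrected threshold.
# Genotypes and phenotypes are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_people, n_variants = 500, 200
variants = rng.integers(0, 2, size=(n_people, n_variants))  # variant presence/absence
log_vl = rng.normal(4.5, 0.8, n_people)                      # log10 viral load

alpha = 0.05
threshold = alpha / n_variants  # Bonferroni; 0.05 / 3,010 variants gives ~1.66e-5 as above

hits = []
for j in range(n_variants):
    X = sm.add_constant(variants[:, j].astype(float))
    pval = sm.OLS(log_vl, X).fit().pvalues[1]  # p-value for the variant term
    if pval < threshold:
        hits.append(j)

print(f"Variants associated with viral load at p < {threshold:.2e}: {hits}")
```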

    Comparison of dissolved and particulate arsenic distributions in shallow aquifers of Chakdaha, India, and Araihazar, Bangladesh

    Background: The origin of the spatial variability of dissolved As concentrations in shallow aquifers of the Bengal Basin remains poorly understood. To address this, we compare here transects of simultaneously collected groundwater and aquifer solids perpendicular to the banks of the Hooghly River in Chakdaha, India, and the Old Brahmaputra River in Araihazar, Bangladesh. Results: Variations in surface geomorphology mapped by electromagnetic conductivity indicate that permeable sandy soils are associated with underlying aquifers that are moderately reducing to a depth of 10–30 m, as indicated by acid-leachable Fe(II)/Fe ratios 5 mg L-1. More reducing aquifers are typically capped with finer-grained soils. The patterns suggest that vertical recharge through permeable soils is associated with a flux of oxidants on the banks of the Hooghly River and, further inland, in both Chakdaha and Araihazar. Moderately reducing conditions maintained by local recharge are generally associated with low As concentrations in Araihazar, but not systematically so in Chakdaha. Unlike Araihazar, there is also little correspondence in Chakdaha between dissolved As concentrations in groundwater and the P-extractable As content of aquifer particles, averaging 191 ± 122 μg As/L and 1.1 ± 1.5 mg As kg-1 (n = 43), and 108 ± 31 μg As/L and 3.1 ± 6.5 mg As kg-1 (n = 60), respectively. We tentatively attribute these differences to a combination of younger floodplain sediments, and therefore possibly more than one mechanism of As release, as well as less reducing conditions in Chakdaha compared to Araihazar. Conclusion: Systematic dating of groundwater and sediment, combined with detailed mapping of the composition of aquifer solids and groundwater, will be needed to identify the various mechanisms underlying the complex distribution of As in aquifers of the Bengal Basin.

    Biomarker-indicated extent of oxidation of plant-derived organic carbon (OC) in relation to geomorphology in an arsenic contaminated Holocene aquifer, Cambodia

    The poisoning of rural populations in South and Southeast Asia due to high groundwater arsenic concentrations is one of the world’s largest ongoing natural disasters. It is important to consider environmental processes related to the release of geogenic arsenic, including geomorphological and organic geochemical processes. Arsenic is released from sediments when iron-oxide minerals, onto which arsenic is adsorbed or incorporated, react with organic carbon (OC) and the OC is oxidised. In this study we build a new geomorphological framework for Kandal Province, a highly studied arsenic-affected region of Cambodia, and tie this into wider regional environmental change throughout the Holocene. Analyses show that the concentration of OC in the sediments is strongly inversely correlated with grain size. Furthermore, the type of OC is also related to grain size, with clay containing mostly (immature) plant-derived OC and sand containing mostly thermally mature OC. Finally, analyses indicate that, within the plant-derived OC, relative oxidation is strongly grouped by stratigraphy, with older bound OC more oxidised than younger OC.