    Assessing the role of groundwater recharge from tanks in crystalline bedrock aquifers in Karnataka, India, using hydrochemical tracers

    The majority of India’s rural drinking water supply is sourced from groundwater, which also plays a critical role in irrigated agriculture, supporting the livelihoods of millions of users. However, recent high abstractions are threatening the sustainable use of groundwater, and action is needed to ensure continued supply. Increased managed aquifer recharge (MAR) using the > 200,000 existing tanks (artificially created surface water bodies) is one of the Indian government’s key initiatives to combat declining groundwater levels. However, few studies have directly examined the effectiveness of tank recharge, particularly in the complex fractured hydrogeology of Peninsular India. To address this gap, this study examined the impact of tanks in three crystalline bedrock catchments in Karnataka, southern India, by analysing the isotopic and hydrochemical composition of surface waters and groundwaters, combined with groundwater level observations. The results indicate that tanks have limited impact on regional groundwater recharge and quality in rural areas, where recharge from precipitation and groundwater recycling from irrigation dominate the recharge signal. In the urban setting (Bengaluru), impermeable surfaces increased the relative effect of recharge from point sources such as tanks and rivers, but where present, pipe leakage from the public water supply accounted for the majority of recharge. Shallow groundwater levels in the inner parts of the city may lead to groundwater discharge to tanks, particularly in the dry season. We conclude that the importance of aquifer recharge from tanks is limited compared to other recharge sources and highly dependent on the specific setting. Additional studies to quantify tank recharge and revisions to the current guidelines for national groundwater recharge estimations, using a less generalised approach, are recommended to avoid overestimating the role tanks play in groundwater recharge.
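
    The abstract does not state how the tracer data were used to apportion recharge sources, but a common approach in such studies is a two-end-member mass balance on a conservative tracer such as δ18O. The sketch below is purely illustrative; the end-member values and the assumption of only two recharge sources are hypothetical, not taken from the study.

```python
# Illustrative two-end-member mixing calculation (not the study's method):
# if groundwater's delta-18O lies between the signatures of evaporated tank
# water and rain-fed diffuse recharge, the tank-derived fraction follows
# from a simple mass balance: f_tank = (d_gw - d_rain) / (d_tank - d_rain).

def tank_recharge_fraction(d_gw: float, d_tank: float, d_rain: float) -> float:
    """Fraction of recharge attributable to tank seepage under a
    two-end-member mixing assumption (delta-18O in per mil VSMOW)."""
    if d_tank == d_rain:
        raise ValueError("end members must be isotopically distinct")
    f = (d_gw - d_rain) / (d_tank - d_rain)
    return min(max(f, 0.0), 1.0)  # clip to the physically meaningful range

# Hypothetical values: tank water enriched by evaporation, rainfall recharge depleted.
print(tank_recharge_fraction(d_gw=-3.8, d_tank=-1.0, d_rain=-4.5))  # ~0.2
```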

    The natural history of primary sclerosing cholangitis in 781 children: A multicenter, international collaboration

    There are limited data on the natural history of primary sclerosing cholangitis (PSC) in children. We aimed to describe the disease characteristics and long-term outcomes of pediatric PSC. We retrospectively collected all pediatric PSC cases from 36 participating institutions and conducted a survival analysis from the date of PSC diagnosis to dates of diagnosis of portal hypertensive or biliary complications, cholangiocarcinoma, liver transplantation, or death. We analyzed patients grouped by disease phenotype and laboratory studies at diagnosis to identify objective predictors of long-term outcome. We identified 781 patients, median age 12 years, with 4,277 person-years of follow-up; 33% with autoimmune hepatitis, 76% with inflammatory bowel disease, and 13% with small duct PSC. Portal hypertensive and biliary complications developed in 38% and 25%, respectively, after 10 years of disease. Once these complications developed, median survival with native liver was 2.8 and 3.5 years, respectively. Cholangiocarcinoma occurred in 1%. Overall event-free survival was 70% at 5 years and 53% at 10 years. Patient groups with the most elevated total bilirubin, gamma-glutamyltransferase, and aspartate aminotransferase-to-platelet ratio index at diagnosis had the worst outcomes. In multivariate analysis, PSC-inflammatory bowel disease and small duct phenotypes were associated with favorable prognosis (hazard ratios 0.6, 95% confidence interval 0.5-0.9, and 0.7, 95% confidence interval 0.5-0.96, respectively). Age, gender, and autoimmune hepatitis overlap did not impact long-term outcome. CONCLUSION: PSC has a chronic, progressive course in children, and nearly half of patients develop an adverse liver outcome after 10 years of disease; elevations in bilirubin, gamma-glutamyltransferase, and aspartate aminotransferase-to-platelet ratio index at diagnosis can identify patients at highest risk; small duct PSC and PSC-inflammatory bowel disease are more favorable disease phenotypes.
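
    The survival analysis described above (time from PSC diagnosis to the first adverse liver outcome, with phenotype as a covariate) can be sketched with the lifelines library. The DataFrame below is invented for illustration; the real analysis used 781 patients and additional covariates.

```python
# Minimal sketch of the kind of survival analysis described above, using
# lifelines. The toy DataFrame and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "years_from_diagnosis": [2.1, 10.0, 5.4, 8.2, 1.3, 7.7, 3.9, 6.5],
    "event":      [1, 0, 1, 0, 1, 1, 0, 1],  # complication, transplant, or death
    "small_duct": [0, 1, 1, 0, 0, 1, 1, 0],  # small duct PSC phenotype
    "ibd":        [1, 1, 0, 1, 0, 1, 0, 1],  # inflammatory bowel disease
})

# Event-free survival with native liver (the study reports 70% at 5 years, 53% at 10)
kmf = KaplanMeierFitter()
kmf.fit(df["years_from_diagnosis"], event_observed=df["event"])
print(kmf.survival_function_)

# Multivariable Cox model: hazard ratios below 1 mark a favourable phenotype
cph = CoxPHFitter()
cph.fit(df, duration_col="years_from_diagnosis", event_col="event")
print(cph.hazard_ratios_)
```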

    Ursodeoxycholic Acid Therapy in Pediatric Primary Sclerosing Cholangitis: Predictors of Gamma Glutamyltransferase Normalization and Favorable Clinical Course

    Objective To investigate patient factors predictive of gamma glutamyltransferase (GGT) normalization following ursodeoxycholic acid (UDCA) therapy in children with primary sclerosing cholangitis. Study design We retrospectively reviewed patient records at 46 centers. We included patients with a baseline serum GGT level >= 50 IU/L at diagnosis of primary sclerosing cholangitis who initiated UDCA therapy within 1 month and continued therapy for at least 1 year. We defined "normalization" as a GGT level below 50 IU/L. Results We identified 263 patients, median age 12.1 years at diagnosis, treated with UDCA at a median dose of 15 mg/kg/d. Normalization occurred in 46%. Patients with normalization had a lower prevalence of Crohn's disease, lower total bilirubin level, lower aspartate aminotransferase to platelet ratio index, greater platelet count, and greater serum albumin level at diagnosis. The 5-year survival with native liver was 99% in those patients who achieved normalization vs 77% in those who did not. Conclusions Less than one-half of the patients treated with UDCA have a complete GGT normalization in the first year after diagnosis, but this subset of patients has a favorable 5-year outcome. Normalization is less likely in patients with a Crohn's disease phenotype or a laboratory profile suggestive of more advanced hepatobiliary fibrosis. Patients who do not achieve normalization could reasonably stop UDCA, as they are likely not receiving clinical benefit. Alternative treatments with improved efficacy are needed, particularly for patients with already-advanced disease.
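
    Identifying baseline factors associated with GGT normalization, as above, is essentially a classification problem. A hedged sketch using scikit-learn is shown below; the data, column names, and the choice of logistic regression are illustrative assumptions, not the study's actual model.

```python
# Illustrative sketch (not the study's model): relating baseline labs and a
# Crohn's disease phenotype to 1-year GGT normalization with a logistic model.
# All values and column names below are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "total_bilirubin": [0.4, 2.1, 0.6, 3.0, 0.5, 1.8],   # mg/dL
    "platelets":       [320, 140, 290, 110, 350, 160],   # x10^9/L
    "albumin":         [4.2, 3.1, 4.0, 2.9, 4.4, 3.3],   # g/dL
    "crohns_disease":  [0, 1, 0, 1, 0, 1],
    "ggt_normalized":  [1, 0, 1, 0, 1, 0],               # GGT normal at 1 year
})

features = ["total_bilirubin", "platelets", "albumin", "crohns_disease"]
model = LogisticRegression(max_iter=1000).fit(df[features], df["ggt_normalized"])

# Positive coefficients favour normalization, negative ones argue against it
print(dict(zip(features, model.coef_[0].round(3))))
```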

    Assessing the Validity of Adult-derived Prognostic Models for Primary Sclerosing Cholangitis Outcomes in Children

    Background: Natural history models for primary sclerosing cholangitis (PSC) are derived from adult patient data, but have never been validated in children. It is unclear how accurate such models are for children with PSC. Methods: We utilized the pediatric PSC consortium database to assess the Revised Mayo Clinic, Amsterdam-Oxford, and Boberg models. We calculated the risk stratum and predicted survival for each patient within each model using patient data at PSC diagnosis, and compared it with observed survival. We evaluated model fit using the c-statistic. Results: Model fit was good at 1 year (c-statistics 0.93, 0.87, 0.82) and fair at 10 years (0.78, 0.75, 0.69) in the Mayo, Boberg, and Amsterdam-Oxford models, respectively. The Mayo model correctly classified most children as low risk, whereas the Amsterdam-Oxford model incorrectly classified most as high risk. All of the models underestimated survival of patients classified as high risk. Albumin, bilirubin, AST, and platelets were most associated with outcomes. Autoimmune hepatitis was more prevalent in higher risk groups, and over-weighting of AST in these patients accounted for the observed versus predicted survival discrepancy. Conclusions: All 3 models offered good short-term discrimination of outcomes but only fair long-term discrimination. None of the models account for the high prevalence of features of autoimmune hepatitis overlap in children and the associated elevated aminotransferases. A pediatric-specific model is needed. AST, bilirubin, albumin, and platelets will be important predictors, but must be weighted to account for the unique features of PSC in children.
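
    Model fit was assessed above with the c-statistic (concordance index). A minimal sketch of computing it is shown below using lifelines; the risk scores and outcomes are invented for illustration.

```python
# Minimal sketch of computing a c-statistic for a prognostic model, as in the
# validation described above. All numbers here are invented.
from lifelines.utils import concordance_index

observed_years  = [1.5, 9.0, 4.2, 10.0, 2.7]   # time to adverse liver outcome
event_occurred  = [1,   0,   1,   0,    1]     # 0 = censored
predicted_risk  = [2.4, 0.3, 1.8, 0.1,  2.9]   # model risk score (higher = worse)

# concordance_index expects scores where larger values mean longer survival,
# so pass the negated risk scores.
c = concordance_index(observed_years, [-r for r in predicted_risk], event_occurred)
print(f"c-statistic: {c:.2f}")  # 1.0 = perfect discrimination, 0.5 = chance
```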

    Global, regional, and national incidence and mortality for HIV, tuberculosis, and malaria during 1990–2013: a systematic analysis for the Global Burden of Disease Study 2013

    BACKGROUND: The Millennium Declaration in 2000 brought special global attention to HIV, tuberculosis, and malaria through the formulation of Millennium Development Goal (MDG) 6. The Global Burden of Disease 2013 study provides a consistent and comprehensive approach to disease estimation between 1990 and 2013, and an opportunity to assess whether accelerated progress has occurred since the Millennium Declaration. METHODS: To estimate incidence and mortality for HIV, we used the UNAIDS Spectrum model appropriately modified based on a systematic review of available studies of mortality with and without antiretroviral therapy (ART). For concentrated epidemics, we calibrated Spectrum models to fit vital registration data corrected for misclassification of HIV deaths. In generalised epidemics, we minimised a loss function to select epidemic curves most consistent with prevalence data and demographic data for all-cause mortality. We analysed counterfactual scenarios for HIV to assess years of life saved through prevention of mother-to-child transmission (PMTCT) and ART. For tuberculosis, we analysed vital registration and verbal autopsy data to estimate mortality using cause of death ensemble modelling. We analysed data for corrected case-notifications, expert opinions on the case-detection rate, prevalence surveys, and estimated cause-specific mortality using Bayesian meta-regression to generate consistent trends in all parameters. We analysed malaria mortality and incidence using an updated cause of death database, a systematic analysis of verbal autopsy validation studies for malaria, and recent studies (2010-13) of incidence, drug resistance, and coverage of insecticide-treated bednets. FINDINGS: Globally in 2013, there were 1·8 million new HIV infections (95% uncertainty interval 1·7 million to 2·1 million), 29·2 million prevalent HIV cases (28·1 to 31·7), and 1·3 million HIV deaths (1·3 to 1·5). At the peak of the epidemic in 2005, HIV caused 1·7 million deaths (1·6 million to 1·9 million). Concentrated epidemics in Latin America and eastern Europe are substantially smaller than previously estimated. Through interventions including PMTCT and ART, 19·1 million life-years (16·6 million to 21·5 million) have been saved, 70·3% (65·4 to 76·1) in developing countries. From 2000 to 2011, the ratio of development assistance for health for HIV to years of life saved through intervention was US$4498 in developing countries. Including HIV-positive individuals, all-form tuberculosis incidence was 7·5 million (7·4 million to 7·7 million), prevalence was 11·9 million (11·6 million to 12·2 million), and number of deaths was 1·4 million (1·3 million to 1·5 million) in 2013. In the same year, in HIV-negative individuals only, all-form tuberculosis incidence was 7·1 million (6·9 million to 7·3 million), prevalence was 11·2 million (10·8 million to 11·6 million), and number of deaths was 1·3 million (1·2 million to 1·4 million). Annualised rates of change (ARC) for incidence, prevalence, and death became negative after 2000. Tuberculosis in HIV-negative individuals disproportionately occurs in men and boys (versus women and girls), who account for 64·0% of cases (63·6 to 64·3) and 64·7% of deaths (60·8 to 70·3). Globally, malaria cases and deaths grew rapidly from 1990, reaching a peak of 232 million cases (143 million to 387 million) in 2003 and 1·2 million deaths (1·1 million to 1·4 million) in 2004. Since 2004, child deaths from malaria in sub-Saharan Africa have decreased by 31·5% (15·7 to 44·1).
Outside of Africa, malaria mortality has been steadily decreasing since 1990. INTERPRETATION: Our estimates of the number of people living with HIV are 18·7% smaller than UNAIDS's estimates in 2012. The number of people living with malaria is larger than estimated by WHO. The numbers of people living with HIV, tuberculosis, or malaria have all decreased since 2000. At the global level, upward trends for malaria and HIV deaths have been reversed and declines in tuberculosis deaths have accelerated. 101 countries (74 of which are developing) still have increasing HIV incidence. Substantial progress since the Millennium Declaration is an encouraging sign of the effect of global action. FUNDING: Bill & Melinda Gates Foundation.
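
    The annualised rate of change (ARC) reported above is commonly defined as the log-ratio of the end and start values divided by the elapsed years; the abstract does not spell out GBD's exact formulation, so the snippet below is an illustrative definition with made-up numbers.

```python
# Illustrative definition of an annualised rate of change (ARC); the exact
# GBD formulation is not given in the abstract above.
import math

def annualised_rate_of_change(value_start: float, value_end: float, years: float) -> float:
    return math.log(value_end / value_start) / years

# Hypothetical example: incidence falling from 8.0 million to 7.1 million over 13 years
arc = annualised_rate_of_change(8.0, 7.1, 13)
print(f"{arc:+.3%} per year")  # a negative ARC indicates a declining trend
```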

    Evaluating the Effects of SARS-CoV-2 Spike Mutation D614G on Transmissibility and Pathogenicity.

    Global dispersal and increasing frequency of the SARS-CoV-2 spike protein variant D614G are suggestive of a selective advantage but may also be due to a random founder effect. We investigate the hypothesis of positive selection of spike D614G in the United Kingdom using more than 25,000 whole genome SARS-CoV-2 sequences. Despite the availability of a large dataset, well represented by both spike 614 variants, not all approaches showed a conclusive signal of positive selection. Population genetic analysis indicates that 614G increases in frequency relative to 614D in a manner consistent with a selective advantage. We do not find any indication that patients infected with the spike 614G variant have higher COVID-19 mortality or clinical severity, but 614G is associated with higher viral load and younger age of patients. Significant differences in growth and size of 614G phylogenetic clusters indicate a need for continued study of this variant.
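
    The abstract does not specify the population-genetic model used, but one standard way to relate a variant's rise in frequency to a selective advantage is a logistic model in which the log-odds of 614G versus 614D grow linearly over time, the slope being the selection coefficient. The weekly frequencies below are invented for illustration.

```python
# Illustrative (not necessarily the paper's method): estimate a selection
# coefficient from a variant's frequency trajectory via the slope of its
# log-odds over time. The weekly frequencies below are invented.
import numpy as np

weeks  = np.arange(10)
freq_g = np.array([0.10, 0.14, 0.19, 0.25, 0.33, 0.41, 0.50, 0.59, 0.67, 0.74])

log_odds = np.log(freq_g / (1 - freq_g))
slope, intercept = np.polyfit(weeks, log_odds, 1)
print(f"estimated selection coefficient: {slope:.2f} per week")  # > 0 suggests an advantage
```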

    Hospital admission and emergency care attendance risk for SARS-CoV-2 delta (B.1.617.2) compared with alpha (B.1.1.7) variants of concern: a cohort study

    Background: The SARS-CoV-2 delta (B.1.617.2) variant was first detected in England in March, 2021. It has since rapidly become the predominant lineage, owing to high transmissibility. It is suspected that the delta variant is associated with more severe disease than the previously dominant alpha (B.1.1.7) variant. We aimed to characterise the severity of the delta variant compared with the alpha variant by determining the relative risk of hospital attendance outcomes. Methods: This cohort study was done among all patients with COVID-19 in England between March 29 and May 23, 2021, who were identified as being infected with either the alpha or delta SARS-CoV-2 variant through whole-genome sequencing. Individual-level data on these patients were linked to routine health-care datasets on vaccination, emergency care attendance, hospital admission, and mortality (data from Public Health England's Second Generation Surveillance System and COVID-19-associated deaths dataset; the National Immunisation Management System; and NHS Digital Secondary Uses Services and Emergency Care Data Set). The risks of hospital admission and emergency care attendance were compared between patients with sequencing-confirmed delta and alpha variants for the whole cohort and by vaccination status subgroups. Stratified Cox regression was used to adjust for age, sex, ethnicity, deprivation, recent international travel, area of residence, calendar week, and vaccination status. Findings: Individual-level data on 43 338 COVID-19-positive patients (8682 with the delta variant, 34 656 with the alpha variant; median age 31 years [IQR 17–43]) were included in our analysis. 196 (2·3%) patients with the delta variant versus 764 (2·2%) patients with the alpha variant were admitted to hospital within 14 days after the specimen was taken (adjusted hazard ratio [HR] 2·26 [95% CI 1·32–3·89]). 498 (5·7%) patients with the delta variant versus 1448 (4·2%) patients with the alpha variant were admitted to hospital or attended emergency care within 14 days (adjusted HR 1·45 [1·08–1·95]). Most patients were unvaccinated (32 078 [74·0%] across both groups). The HRs for vaccinated patients with the delta variant versus the alpha variant (adjusted HR for hospital admission 1·94 [95% CI 0·47–8·05] and for hospital admission or emergency care attendance 1·58 [0·69–3·61]) were similar to the HRs for unvaccinated patients (2·32 [1·29–4·16] and 1·43 [1·04–1·97]; p=0·82 for both) but the precision for the vaccinated subgroup was low. Interpretation: This large national study found a higher hospital admission or emergency care attendance risk for patients with COVID-19 infected with the delta variant compared with the alpha variant. Results suggest that outbreaks of the delta variant in unvaccinated populations might lead to a greater burden on health-care services than the alpha variant. Funding: Medical Research Council; UK Research and Innovation; Department of Health and Social Care; and National Institute for Health Research.
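
    A stratified Cox regression of the kind described above can be sketched with lifelines. The toy data and column names are hypothetical, and only one stratification variable and two covariates are shown; the real analysis stratified and adjusted for many more factors (age, sex, ethnicity, deprivation, travel, region, calendar week, vaccination status).

```python
# Sketch of a stratified Cox model comparing delta versus alpha hospital
# admission risk, as described above. Rows and column names are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_admission": [3, 14, 7, 14, 5, 14, 4, 14, 10, 14],
    "admitted":          [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],   # within 14 days of specimen
    "delta_variant":     [1, 1, 0, 0, 1, 1, 1, 0, 0, 0],   # 1 = delta, 0 = alpha
    "age":               [34, 52, 61, 23, 45, 29, 70, 41, 38, 55],
    "region":            ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_admission", event_col="admitted", strata=["region"])
print(cph.hazard_ratios_["delta_variant"])  # adjusted HR for delta vs alpha
```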

    Changes in symptomatology, reinfection, and transmissibility associated with the SARS-CoV-2 variant B.1.1.7: an ecological study

    Background The SARS-CoV-2 variant B.1.1.7 was first identified in December, 2020, in England. We aimed to investigate whether increases in the proportion of infections with this variant are associated with differences in symptoms or disease course, reinfection rates, or transmissibility. Methods We did an ecological study to examine the association between the regional proportion of infections with the SARS-CoV-2 B.1.1.7 variant and reported symptoms, disease course, rates of reinfection, and transmissibility. Data on types and duration of symptoms were obtained from longitudinal reports from users of the COVID Symptom Study app who reported a positive test for COVID-19 between Sept 28 and Dec 27, 2020 (during which the prevalence of B.1.1.7 increased most notably in parts of the UK). From this dataset, we also estimated the frequency of possible reinfection, defined as the presence of two reported positive tests separated by more than 90 days with a period of reporting no symptoms for more than 7 days before the second positive test. The proportion of SARS-CoV-2 infections with the B.1.1.7 variant across the UK was estimated with use of genomic data from the COVID-19 Genomics UK Consortium and data from Public Health England on spike-gene target failure (a non-specific indicator of the B.1.1.7 variant) in community cases in England. We used linear regression to examine the association between reported symptoms and proportion of B.1.1.7. We assessed the Spearman correlation between the proportion of B.1.1.7 cases and number of reinfections over time, and between the number of positive tests and reinfections. We estimated incidence for B.1.1.7 and previous variants, and compared the effective reproduction number, Rt, for the two incidence estimates. Findings From Sept 28 to Dec 27, 2020, positive COVID-19 tests were reported by 36 920 COVID Symptom Study app users whose region was known and who reported as healthy on app sign-up. We found no changes in reported symptoms or disease duration associated with B.1.1.7. For the same period, possible reinfections were identified in 249 (0·7% [95% CI 0·6–0·8]) of 36 509 app users who reported a positive swab test before Oct 1, 2020, but there was no evidence that the frequency of reinfections was higher for the B.1.1.7 variant than for pre-existing variants. Reinfection occurrences were more positively correlated with the overall regional rise in cases (Spearman correlation 0·56–0·69 for South East, London, and East of England) than with the regional increase in the proportion of infections with the B.1.1.7 variant (Spearman correlation 0·38–0·56 in the same regions), suggesting B.1.1.7 does not substantially alter the risk of reinfection. We found a multiplicative increase in the Rt of B.1.1.7 by a factor of 1·35 (95% CI 1·02–1·69) relative to pre-existing variants. However, Rt fell below 1 during regional and national lockdowns, even in regions with high proportions of infections with the B.1.1.7 variant. Interpretation The lack of change in symptoms identified in this study indicates that existing testing and surveillance infrastructure do not need to change specifically for the B.1.1.7 variant. In addition, given that there was no apparent increase in the reinfection rate, vaccines are likely to remain effective against the B.1.1.7 variant. 
Funding Zoe Global, Department of Health (UK), Wellcome Trust, Engineering and Physical Sciences Research Council (UK), National Institute for Health Research (UK), Medical Research Council (UK), Alzheimer's Society.
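
    The possible-reinfection definition used above (two reported positive tests more than 90 days apart, with more than 7 days of reporting no symptoms before the second) maps directly onto a simple rule. The sketch below is an illustrative implementation; the function signature and example data are invented.

```python
# Illustrative implementation of the possible-reinfection rule described above.
from datetime import date

def is_possible_reinfection(positive_test_dates, symptom_free_days_before_second):
    """positive_test_dates: chronologically sorted dates of positive tests.
    symptom_free_days_before_second: consecutive days with no symptoms
    reported immediately before the second positive test."""
    if len(positive_test_dates) < 2:
        return False
    gap_days = (positive_test_dates[1] - positive_test_dates[0]).days
    return gap_days > 90 and symptom_free_days_before_second > 7

print(is_possible_reinfection([date(2020, 9, 30), date(2021, 1, 15)], 21))  # True
```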

    Genomic assessment of quarantine measures to prevent SARS-CoV-2 importation and transmission

    Mitigation of SARS-CoV-2 transmission from international travel is a priority. We evaluated the effectiveness of travellers being required to quarantine for 14 days on return to England in Summer 2020. We identified 4,207 travel-related SARS-CoV-2 cases and their contacts, and identified 827 associated SARS-CoV-2 genomes. Overall, quarantine was associated with a lower rate of contacts, and the impact of quarantine was greatest in the 16–20 age group. 186 SARS-CoV-2 genomes were sufficiently unique to identify travel-related clusters. Fewer genomically linked cases were observed for index cases who returned from countries with a quarantine requirement compared to countries with no quarantine requirement. This difference was explained by fewer importation events per identified genome for these cases, as opposed to fewer onward contacts per case. Overall, our study demonstrates that a 14-day quarantine period reduces, but does not completely eliminate, the onward transmission of imported cases, mainly by dissuading travel to countries with a quarantine requirement.