
    An evaluation of the site specificity of soil elemental signatures for identifying and interpreting former functional areas

    Soil multi-element analysis is now a routine technique used to help answer questions about space use and function in and around archaeological sites. Numerous studies have shown that the pattern of enhancement of certain elements, including P, Pb, Ca, Zn, and Cu, correlates closely with the archaeological and historical record. Interpretation of these soil signatures, however, has generally been more problematic. One approach to the problem has been the use of ethnographic or “known” sites to guide interpretation, but how confidently can results from one site be extrapolated to another? This study of abandoned farms tests the site specificity of soil multi-element signatures of past space use through the use of discriminant models. Data analysis suggests that one-to-one comparisons of similar sites are much less accurate (38% accuracy) than comparisons based on a wider range of sites (59.3% accuracy), even when the latter have contrasting geology. The results highlight the importance of individual anthropogenic practices during occupation and abandonment in the development of diagnostic soil geochemical signatures.
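The classification step behind such discriminant models can be illustrated with a minimal nearest-centroid rule in element space. This is a simplification of formal discriminant analysis, and the functional-area names and enhancement values below are invented for illustration, not data from the study.

```python
# Sketch: assign a soil sample to a functional area by proximity to a
# trained centroid of element enhancements (a stand-in for the
# discriminant models described above). All numbers are hypothetical.
import math

# Hypothetical mean enhancement factors (P, Pb, Ca, Zn, Cu) relative to
# off-site background for two illustrative functional areas.
CENTROIDS = {
    "byre":   (3.0, 1.2, 2.5, 1.8, 1.1),
    "hearth": (1.5, 2.8, 1.2, 1.0, 2.2),
}

def classify(sample):
    """Return the functional area whose centroid is nearest the sample
    (Euclidean distance in five-element space)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda area: dist(sample, CENTROIDS[area]))

print(classify((2.9, 1.3, 2.4, 1.7, 1.0)))  # nearest the byre profile
```

Training centroids on one site and classifying samples from another is exactly the cross-site extrapolation whose accuracy the study quantifies.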

    Treatment-limiting renal tubulopathy in patients treated with tenofovir disoproxil fumarate.

    OBJECTIVES: Tenofovir disoproxil fumarate (TDF) is widely used in the treatment or prevention of HIV and hepatitis B infection. TDF may cause renal tubulopathy in a small proportion of recipients. We aimed to study the risk factors for developing severe renal tubulopathy. METHODS: We conducted an observational cohort study with retrospective identification of cases of treatment-limiting tubulopathy during TDF exposure. We used multivariate Poisson regression analysis to identify risk factors for tubulopathy, and mixed-effects models to analyse adjusted estimated glomerular filtration rate (eGFR) slopes. RESULTS: Between October 2002 and June 2013, 60 (0.4%) of 15,983 patients who had received TDF developed tubulopathy after a median exposure of 44.1 (IQR 20.4, 64.4) months. Tubulopathy cases were predominantly male (92%), of white ethnicity (93%), and exposed to antiretroviral regimens that contained boosted protease inhibitors (PI, 90%). In multivariate analysis, age, ethnicity, CD4 cell count and use of didanosine (ddI) or PI were significantly associated with tubulopathy. Tubulopathy cases experienced significantly greater eGFR decline while receiving TDF than the comparator group (-6.60 [-7.70, -5.50] vs. -0.34 [-0.43, -0.26] mL/min/1.73 m2/year, p < 0.0001). CONCLUSIONS: Older age, white ethnicity, immunodeficiency and co-administration of ddI and PI were risk factors for tubulopathy in patients who received TDF-containing antiretroviral therapy. Rapid eGFR decline identified TDF recipients at increased risk of tubulopathy.
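Two of the abstract's core quantities are easy to sketch: a crude incidence per 1,000 person-years, and a per-patient eGFR slope. The mixed-effects models in the study pool slopes across patients and adjust for covariates; the ordinary least-squares version below, with invented follow-up values, shows only the basic calculation.

```python
# Sketch of the two headline calculations, with invented example data.

def incidence_per_1000py(cases, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return 1000.0 * cases / person_years

def egfr_slope(years, egfr):
    """Least-squares slope of eGFR (mL/min/1.73 m2) against years."""
    n = len(years)
    mx = sum(years) / n
    my = sum(egfr) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, egfr))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# A hypothetical patient losing about 6.6 mL/min/1.73 m2 per year,
# similar in magnitude to the tubulopathy group's adjusted decline.
print(egfr_slope([0, 1, 2, 3], [95.0, 88.4, 81.8, 75.2]))
```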

    Prognostic importance of anaemia in HIV type-1-infected patients starting antiretroviral therapy: collaborative analysis of prospective cohort studies

    Background: In HIV type-1-infected patients starting highly active antiretroviral therapy (HAART), the prognostic value of haemoglobin at the start of HAART, and of subsequent changes in haemoglobin levels, is not well defined. Methods: We combined data from 10 prospective studies of 12,100 previously untreated individuals (25% women). A total of 4,222 patients (35%) were anaemic: 131 patients (1.1%) had severe (<8.0 g/dl), 1,120 (9%) had moderate (male 8.0-<11.0 g/dl and female 8.0-<10.0 g/dl) and 2,971 (25%) had mild (male 11.0-<13.0 g/dl and female 10.0-<12.0 g/dl) anaemia. We separately analysed progression to AIDS or death from baseline and from 6 months using Weibull models, adjusting for CD4+ T-cell count, age, sex and other variables. Results: During 48,420 person-years of follow-up, 1,448 patients developed at least one AIDS event and 857 patients died. Anaemia at baseline was independently associated with higher mortality: the adjusted hazard ratio (95% confidence interval) for mild anaemia was 1.42 (1.17-1.73), for moderate anaemia 2.56 (2.07-3.18) and for severe anaemia 5.26 (3.55-7.81). Corresponding figures for progression to AIDS were 1.60 (1.37-1.86), 2.00 (1.66-2.40) and 2.24 (1.46-3.42). At 6 months the prevalence of anaemia had declined to 26%. Baseline anaemia continued to predict mortality (and to a lesser extent progression to AIDS) in patients with normal haemoglobin or mild anaemia at 6 months. Conclusions: Anaemia at the start of HAART is an important factor for short- and long-term prognosis, including in patients whose haemoglobin levels improved or normalized during the first 6 months of HAART.
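The sex-specific haemoglobin cut-offs above can be expressed as a small grading function. The thresholds come straight from the abstract; the function itself is an illustrative sketch, not study code.

```python
# Grade anaemia from haemoglobin (g/dl) using the study's sex-specific
# cut-offs: severe <8.0; moderate 8.0-<11.0 (male) / 8.0-<10.0 (female);
# mild 11.0-<13.0 (male) / 10.0-<12.0 (female).

def anaemia_grade(hb_g_dl, sex):
    """Return 'severe', 'moderate', 'mild' or 'none' for a haemoglobin
    value, where sex is 'male' or 'female'."""
    if hb_g_dl < 8.0:
        return "severe"
    moderate_upper = 11.0 if sex == "male" else 10.0
    if hb_g_dl < moderate_upper:
        return "moderate"
    mild_upper = 13.0 if sex == "male" else 12.0
    if hb_g_dl < mild_upper:
        return "mild"
    return "none"

print(anaemia_grade(12.5, "male"))    # mild
print(anaemia_grade(12.5, "female"))  # none
```

The same value can fall in different grades by sex, which is why the study's categories are defined with separate male and female ranges.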

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: A systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. 
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding: Bill & Melinda Gates Foundation.
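In its simplest single-exposure form, the counterfactual logic described above reduces to the population attributable fraction (PAF): the share of burden that would disappear if exposure fell to the theoretical minimum. GBD uses far richer continuous exposure distributions; the sketch below is the textbook two-level version with invented example numbers.

```python
# Sketch: population attributable fraction and attributable DALYs for a
# dichotomous exposure. Prevalence and relative risk are illustrative.

def paf(prevalence, relative_risk):
    """PAF = p(RR - 1) / (p(RR - 1) + 1) for exposure prevalence p."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

def attributable_dalys(total_dalys, prevalence, relative_risk):
    """DALYs attributable to the exposure under the counterfactual of
    zero exposure (theoretical minimum risk level)."""
    return total_dalys * paf(prevalence, relative_risk)

# E.g. 30% exposure prevalence with RR = 2 attributes about 23% of the
# burden to the risk factor.
print(round(paf(0.3, 2.0), 3))  # 0.231
```

Summing such attributable fractions across mediated, correlated risks requires the joint evaluation the study describes, which is why the all-risk total (57.8% of deaths) is not a simple sum of individual PAFs.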

    Non-AIDS defining cancers in the D:A:D Study: time trends and predictors of survival: a cohort study

    BACKGROUND: Non-AIDS defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we described the incidence of NADC from 2004-2010, and subsequent mortality and its predictors. METHODS: Individuals were followed from the later of 1st January 2004 and enrolment in the study, until the earliest of a new NADC, 1st February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1st February 2010 or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. RESULTS: Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1st February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared with other cancer types), male gender, non-white ethnicity, and smoking status. Later year of diagnosis and higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004-2010 in this large observational cohort. CONCLUSIONS: The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
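The Kaplan-Meier method used above to summarise cumulative mortality after NADC diagnosis can be sketched in a few lines. Input is one follow-up time and one event flag (1 = died, 0 = censored) per patient; the toy data are invented and not related to the study's cohort.

```python
# Minimal Kaplan-Meier estimator: at each distinct event time, the
# survival curve is multiplied by (1 - deaths / number at risk);
# censored patients leave the risk set without an event.

def kaplan_meier(times, events):
    """Return [(t, S(t))] survival steps at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        leaving = 0
        # Group all patients sharing this follow-up time.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            steps.append((t, surv))
        n_at_risk -= leaving
    return steps

# Five hypothetical patients: deaths at times 1, 2 and 3, with
# censoring at times 2 and 4.
print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```

Cumulative mortality, as quoted in the abstract, is simply 1 - S(t) at the chosen time points.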

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    INTRODUCTION Investment in Africa over the past year in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences generated to track the pandemic on the continent, which to date exceeds 100,000. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. 
Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, such as the limitations imposed by low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION Sustained investment in diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
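Turnaround time from sampling to sequence submission is the surveillance metric the abstract emphasises. Computing a median turnaround from paired dates is straightforward; the dates below are invented examples, not data from the programme.

```python
# Sketch: median turnaround in days between sampling and sequence
# submission for a batch of samples. Dates are illustrative only.
from datetime import date
from statistics import median

def turnaround_days(pairs):
    """Median days between (sampled, submitted) date pairs."""
    return median((submitted - sampled).days for sampled, submitted in pairs)

print(turnaround_days([
    (date(2021, 7, 1), date(2021, 7, 22)),
    (date(2021, 7, 3), date(2021, 8, 20)),
    (date(2021, 7, 10), date(2021, 7, 31)),
]))  # 21
```

Tracking this median per country and per month is one simple way to quantify the shortening turnaround the study reports as local sequencing capacity grows.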