9 research outputs found

    The utility of length of mining service and latency in predicting silicosis among claimants to a compensation trust

    Get PDF
    In the wake of a large burden of silicosis and tuberculosis among ex-miners from the South African gold mining industry, several programmes have been engaged in examining and compensating those at risk of these diseases. Availability of a database from one such programme, the Q(h)ubeka Trust, provided an opportunity to examine the accuracy of length of service in predicting compensable silicosis, and the concordance between self-reported employment and that officially recorded. Compensable silicosis was determined by expert panels, with ILO profusion ≥1/0 as the threshold for compensability. Age, officially recorded and self-reported years of service, and years since first and last service of 3146 claimants for compensable silicosis were analysed. Self-reported and recorded service were moderately correlated (R = 0.66, 95% confidence interval 0.64–0.68), with a Bland–Altman plot showing no systematic bias. Agreement was reasonably high, with 75% of the differences being less than two years. Logistic regression and receiver operating characteristic curve analysis were used to test prediction of compensable silicosis. There was little predictive difference between length of service on its own and a model adjusting for length of service, age, and years since last exposure. Predictive accuracy was moderate, with significant potential misclassification. Twenty percent of claimants with compensable silicosis had a length of service <10 years; in almost all these claims, the interval between last exposure and the claim was 10 years or more. In conclusion, self-reported service length in the absence of an official service record could be accepted in claims with compatible clinical findings. Length of service offers, at best, moderate predictive capability for silicosis. Compensable silicosis after relatively short service, combined with at least 10 years since last exposure, was not uncommon. Supplementary material: Figure S1, receiver operating characteristic curve for length of service (years) as predictor against compensable silicosis as outcome (https://www.mdpi.com/journal/ijerph). School of Health Systems and Public Health (SHSPH).
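    As a minimal sketch of the kind of analysis described above, the snippet below fits a logistic regression of compensable silicosis on length of service alone and on an adjusted model, and compares them by ROC AUC. The data are synthetic and the variable names and coefficients are assumptions for illustration; this is not the trust's actual analysis.

    ```python
    # Illustrative sketch only: synthetic data, not the Q(h)ubeka Trust records.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 3146  # number of claimants analysed in the study

    # Hypothetical predictors: recorded years of service, age, years since last exposure
    service = rng.gamma(shape=3.0, scale=5.0, size=n)   # years of service
    age = rng.normal(60, 8, size=n)                     # years
    latency = rng.gamma(shape=4.0, scale=4.0, size=n)   # years since last exposure

    # Synthetic outcome: risk rises with service and latency (assumed relationship)
    logit = -3.0 + 0.08 * service + 0.05 * latency + 0.01 * (age - 60)
    silicosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # Model 1: length of service alone
    X1 = service.reshape(-1, 1)
    auc_service = roc_auc_score(
        silicosis, LogisticRegression().fit(X1, silicosis).predict_proba(X1)[:, 1]
    )

    # Model 2: service adjusted for age and years since last exposure
    X2 = np.column_stack([service, age, latency])
    auc_adjusted = roc_auc_score(
        silicosis, LogisticRegression().fit(X2, silicosis).predict_proba(X2)[:, 1]
    )

    print(f"AUC, service only: {auc_service:.2f}; AUC, adjusted model: {auc_adjusted:.2f}")
    ```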

    Examining evidence for a protective occupational exposure limit for inhalable manganese dust below which there are no adverse subclinical nervous system effects

    Get PDF
    The aim of this study was to investigate the exposure-response relationship between the concentration of inhalable manganese dust and subclinical nervous system outcomes amongst workers in a South African manganese smelter, with a view to identifying any exposure threshold below which adverse effects do not occur.
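    As a rough illustration of what "identifying an exposure threshold" can mean computationally, the sketch below fits a piecewise-linear ("hockey-stick") exposure-response model to synthetic data and scans candidate breakpoints. The model form, units, and numbers are assumptions, not the study's actual method or data.

    ```python
    # Illustrative only: synthetic exposure-response data, not the smelter study's data or method.
    import numpy as np

    rng = np.random.default_rng(1)
    exposure = rng.uniform(0.0, 2.0, size=200)   # hypothetical inhalable Mn dust, mg/m^3
    true_threshold = 0.5
    effect = 0.8 * np.clip(exposure - true_threshold, 0, None) + rng.normal(0, 0.15, size=200)

    def hockey_stick_sse(threshold):
        """Sum of squared errors for a flat-then-rising piecewise-linear fit at a given breakpoint."""
        x = np.clip(exposure - threshold, 0, None)
        X = np.column_stack([np.ones_like(x), x])
        beta, residuals, *_ = np.linalg.lstsq(X, effect, rcond=None)
        return np.sum((effect - X @ beta) ** 2)

    candidates = np.linspace(0.05, 1.5, 60)
    best = candidates[np.argmin([hockey_stick_sse(t) for t in candidates])]
    print(f"Estimated threshold below which no exposure-related increase is seen: {best:.2f} mg/m^3")
    ```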

    Exploration of the Burning Question: A Long History of Fire in Eastern Australia with and without People

    No full text
    Ethnographic observations suggest that Indigenous peoples employed a distinct regime of frequent, low-intensity fires in the Australian landscape in the past. However, the timing of this behaviour and its ecological impact remain uncertain. Here, we present a detailed analysis of charcoal, including a novel measure of fire severity using Fourier transform infrared (FTIR) spectroscopy, at a site in eastern Australia that spans the last two glacial/interglacial transitions between 135–104 ka and 18–0.5 ka BP (broadly equivalent to Marine Isotope Stage (MIS) 6-5 and 2-1, respectively). Charcoal accumulation and vegetation composition were similar across both periods and correlated closely with Antarctic ice core records, suggesting that climate is the main driver of fire regimes. Fire severity was lower over the past 18,000 years than during the penultimate glacial/interglacial period, suggesting increasing anthropogenic influence over the landscape during this time. Together with local archaeological records, our data therefore imply that Indigenous peoples have been undertaking cultural burning since the beginning of the Holocene, and potentially the end of the Last Glacial Maximum. We highlight the fact that this signal is not easily discernible in the other proxies examined, including widely used charcoal techniques, and propose that any anthropogenic signal will be subtle in the palaeo-environmental record. While early Indigenous people’s reasons for landscape burning were different from those today, our findings nonetheless suggest that current land management directions are based on a substantive history and could result in a reduction in extreme fire events.
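    For orientation only, the sketch below computes two quantities of the general kind mentioned above: a charcoal accumulation rate from concentration and sedimentation rate, and a hypothetical FTIR aromatic/aliphatic band ratio as a stand-in severity index. All numbers are invented, and the ratio is not the authors' actual FTIR severity measure.

    ```python
    # Illustrative only: invented numbers, not the study's data or its FTIR severity measure.
    import numpy as np

    # Charcoal accumulation rate (CHAR): concentration (particles/cm^3) x sedimentation rate (cm/yr)
    charcoal_concentration = np.array([120.0, 85.0, 40.0])   # particles per cm^3, hypothetical samples
    sedimentation_rate = np.array([0.05, 0.04, 0.06])        # cm per year, hypothetical
    char = charcoal_concentration * sedimentation_rate        # particles per cm^2 per year
    print("CHAR (particles cm^-2 yr^-1):", np.round(char, 2))

    # Hypothetical severity proxy: ratio of an aromatic C=C band to an aliphatic C-H band in an
    # FTIR spectrum; higher-temperature charring tends to increase the aromaticity of charcoal.
    aromatic_band_area = np.array([3.1, 2.2, 1.4])            # ~1600 cm^-1 region, invented areas
    aliphatic_band_area = np.array([1.0, 1.1, 1.3])           # ~2920 cm^-1 region, invented areas
    severity_index = aromatic_band_area / aliphatic_band_area
    print("Illustrative FTIR severity index:", np.round(severity_index, 2))
    ```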

    Last Glacial Maximum cooling induced positive moisture balance and maintained stable human populations in Australia

    No full text
    Long-standing interpretations of the Last Glacial Maximum (21,000 ± 2000 years ago) in Australia suggest that the period was extremely cold and arid, during which the Indo-Australian summer monsoon system collapsed, and human populations declined and retreated to ecological refuges to survive. Here, we use transient iTRACE simulations, combined with palaeoclimate proxy records and archaeological data, to re-interpret the late Last Glacial Maximum and terminal Pleistocene (21,000–11,000 years) in Australia. The model suggests climates during the peak Last Glacial Maximum were cooler than present (−4 to −11 °C), but there is no evidence of monsoon collapse or substantial decreases in moisture balance across Australia. Kernel Density Estimates of archaeological ages show relatively stable and persistent human activity across most regions throughout the late Last Glacial Maximum and terminal Pleistocene, consistent with genetic evidence. Spatial coverage of archaeological sites steadily increased across the terminal Pleistocene; however, substantial population change is not evident.
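    A minimal sketch of the kernel density estimation step mentioned above: summarising a set of archaeological ages as a smooth activity curve through time. The ages and bandwidth are placeholders, not the study's dataset or exact procedure.

    ```python
    # Illustrative only: invented archaeological ages, not the study's dataset.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(2)
    # Hypothetical calibrated ages (years BP) spanning the late LGM and terminal Pleistocene
    ages_bp = rng.uniform(11_000, 21_000, size=300)

    kde = gaussian_kde(ages_bp, bw_method=0.15)   # bandwidth choice is arbitrary here
    grid = np.linspace(11_000, 21_000, 201)
    density = kde(grid)

    # A relatively flat density curve would be read as stable, persistent activity through time
    peak_to_mean = density.max() / density.mean()
    print(f"Peak-to-mean density ratio: {peak_to_mean:.2f}")
    ```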

    Assessment, endoscopy, and treatment in patients with acute severe ulcerative colitis during the COVID-19 pandemic (PROTECT-ASUC): a multicentre, observational, case-control study

    No full text
    Background: There is a paucity of evidence to support safe and effective management of patients with acute severe ulcerative colitis during the COVID-19 pandemic. We sought to identify alterations to established conventional evidence-based management of acute severe ulcerative colitis during the early COVID-19 pandemic, the effect on outcomes, and any associations with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection and severe COVID-19 outcomes. Methods: The PROTECT-ASUC study was a multicentre, observational, case-control study in 60 acute secondary care hospitals throughout the UK. We included adults (≥18 years) with either ulcerative colitis or inflammatory bowel disease unclassified, who presented with acute severe ulcerative colitis and fulfilled the Truelove and Witts criteria. Cases and controls were identified as either admitted or managed in emergency ambulatory care settings between March 1, 2020, and June 30, 2020 (COVID-19 pandemic period cohort), or between Jan 1, 2019, and June 30, 2019 (historical control cohort), respectively. The primary outcome was the proportion of patients with acute severe ulcerative colitis receiving rescue therapy (including primary induction) or colectomy. The study is registered with ClinicalTrials.gov, NCT04411784. Findings: We included 782 patients (398 in the pandemic period cohort and 384 in the historical control cohort) who met the Truelove and Witts criteria for acute severe ulcerative colitis. The proportion of patients receiving rescue therapy (including primary induction) or surgery was higher during the pandemic period than in the historical period (217 [55%] of 393 patients vs 159 [42%] of 380 patients; p=0·00024) and the time to rescue therapy was shorter in the pandemic cohort than in the historical cohort (p=0·0026). This difference was driven by a greater use of rescue and primary induction therapies with biologicals, ciclosporin, or tofacitinib in the COVID-19 pandemic period cohort than in the historical control period cohort (177 [46%] of 387 patients in the COVID-19 cohort vs 134 [36%] of 373 patients in the historical cohort; p=0·0064). During the pandemic, more patients received ambulatory (outpatient) intravenous steroids (51 [13%] of 385 patients vs 19 [5%] of 360 patients; p=0·00023). Fewer patients received thiopurines (29 [7%] of 398 patients vs 46 [12%] of 384; p=0·029) and 5-aminosalicylic acids (67 [17%] of 398 patients vs 98 [26%] of 384; p=0·0037) during the pandemic than in the historical control period. Colectomy rates were similar between the pandemic and historical control groups (64 [16%] of 389 vs 50 [13%] of 375; p=0·26); however, laparoscopic surgery was less frequently performed during the pandemic period (34 [53%] of 64 vs 38 [76%] of 50; p=0·018). Five (2%) of 253 patients tested positive for SARS-CoV-2 during hospital treatment. Two (2%) of 103 patients re-tested for SARS-CoV-2 during the 3-month follow-up were positive 5 days and 12 days, respectively, after discharge from index admission. Both recovered without serious outcomes. Interpretation: The COVID-19 pandemic altered practice patterns of gastroenterologists and colorectal surgeons in the management of acute severe ulcerative colitis but was associated with similar outcomes to a historical cohort. Despite continued use of high-dose corticosteroids and biologicals, the incidence of COVID-19 within 3 months was low and not associated with adverse COVID-19 outcomes.
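    As a check on how the headline comparison above is typically computed, the sketch below re-runs the rescue-therapy-or-colectomy proportions reported in the abstract (217 of 393 vs 159 of 380) through a generic chi-square test of two proportions. It is a re-calculation from the published counts, not the study's own statistical code, and the original analysis may have differed in detail.

    ```python
    # Re-computing the reported rescue therapy/colectomy comparison with a generic chi-square test.
    # Counts are taken from the abstract; the study's exact analysis may have differed.
    from scipy.stats import chi2_contingency

    pandemic = (217, 393 - 217)     # received rescue therapy or colectomy vs did not (2020 cohort)
    historical = (159, 380 - 159)   # same outcome in the 2019 historical control cohort

    chi2, p_value, dof, expected = chi2_contingency([pandemic, historical])
    print(f"55% vs 42%: chi-square = {chi2:.1f}, p = {p_value:.5f}")
    ```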