
    Community Perspective on Consultation on Urban Stormwater Management: Lessons from Brownhill Creek, South Australia

    This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/). There are salutary lessons from the contrasting community consultation efforts of 2011 and 2015 to develop and gain support for an urban stormwater management plan for the Brownhill Creek catchment in Adelaide, South Australia. The 2011 process was a failure in the human dimension: it precipitated a loss of community confidence, unnecessarily consumed thousands of hours of residents' time (residents who went on to initiate a community action group for environmental conservation) and caused a three-year delay to decision making. By contrast, the 2015 process was vastly improved: it resulted in a landslide level of support for an obvious option not previously offered, achieved the required level of flood protection, saved Aus$5 million (14%) relative to the previously proposed option and protected a highly valued natural environment from an unnecessary dam. This paper presents a rarely heard perspective on these community consultation processes from a participating community environmental and heritage conservation action group (the Brownhill Creek Association) that was deeply engaged in reforming the Draft Brown Hill Keswick Creek Stormwater Management Plan. The experience reveals that the community needs to see that all options are considered and to have access to accurate information with which to assess them. It is also necessary that the proposed plan be consistent with existing agreed plans and policies developed through public consultation. Community concerns need to be heard, acknowledged and acted upon or responded to, and the consultation process needs to be transparently fair and democratic to win community support. A major contributor to success in the second consultation was that all community action groups were invited to meetings to discuss the purpose of the consultation and the methods to be used.
Feedback was subsequently provided, before the process commenced, to show what had changed and why any suggestions concerning the consultation process were not adopted. This openness helped to mend the distrust left by the first consultation process and is recommended as an essential early step in any public consultation process.

    Uptake of hepatitis C specialist services and treatment following diagnosis by dried blood spot in Scotland

    Background: Dried blood spot (DBS) testing for hepatitis C virus (HCV) was introduced in Scotland in 2009. This minimally invasive specimen provides an alternative to venipuncture and can overcome barriers to testing in people who inject drugs (PWID). Objectives: The objective of this study was to determine rates and predictors of exposure to HCV, attendance at specialist clinics and anti-viral treatment initiation among the DBS-tested population in Scotland. Study design: DBS testing records were deterministically linked to the Scottish HCV Clinical Database prior to logistic regression analysis. Results: In the first two years of use in Scotland, 1322 individuals were tested by DBS, of whom 476 were found to have an active HCV infection. Linkage analysis showed that 32% had attended a specialist clinic within 12 months of their specimen collection date and 18% had begun anti-viral therapy within 18 months of it. A significantly reduced likelihood of attendance at a specialist clinic was evident among younger individuals (<35 years), those of unknown ethnic origin and those not reporting injecting drug use as a risk factor. Conclusion: We conclude that DBS testing in non-clinical settings has the potential to increase diagnosis and, with sufficient support, treatment of HCV infection among PWID.

    Energy demand on dairy farms in Ireland

    Reducing electricity consumption in Irish milk production is a topical issue for 2 reasons. First, the introduction of a dynamic electricity pricing system, with peak and off-peak prices, will be a reality for 80% of electricity consumers by 2020. The proposed pricing schedule intends to discourage energy consumption during peak periods (i.e., when electricity demand on the national grid is high) and to incentivize energy consumption during off-peak periods. If farmers, for example, carry out their evening milking during the peak period, energy costs may increase, which would affect farm profitability. Second, electricity consumption is identified as contributing about 25% of energy use along the life cycle of pasture-based milk. The objectives of this study, therefore, were to document electricity use per kilogram of milk sold and to identify strategies that reduce its overall use while maximizing its use in off-peak periods (currently from 0000 to 0900 h). We assessed, therefore, average daily and seasonal trends in electricity consumption on 22 Irish dairy farms, through detailed auditing of electricity-consuming processes. To determine the potential of identified strategies to save energy, we also assessed the total energy use of Irish milk, which is the sum of direct (i.e., energy use on farm) and indirect energy use (i.e., energy needed to produce farm inputs). On average, a total of 31.73 MJ was required to produce 1 kg of milk solids, of which 20% was direct and 80% was indirect energy use. Electricity accounted for 60% of the direct energy use, and mainly resulted from milk cooling (31%), water heating (23%), and milking (20%). Analysis of trends in electricity consumption revealed that 62% of daily electricity was used at peak periods. Electricity use on Irish dairy farms, therefore, is substantial and centered around milk harvesting.
To improve the competitiveness of milk production in a dynamic electricity pricing environment, management changes and technologies are required that decouple energy use during milking processes from peak periods.
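The abstract's headline figures can be cross-checked with simple arithmetic. The sketch below derives the absolute energy contributions per kilogram of milk solids; every number is taken directly from the abstract, while the variable names themselves are our own.

```python
# Energy breakdown per kg of milk solids, reproduced from the
# totals and percentage shares quoted in the abstract.
TOTAL_MJ = 31.73                  # total energy per kg milk solids
DIRECT_SHARE = 0.20               # on-farm (direct) share of total energy
ELEC_SHARE_OF_DIRECT = 0.60       # electricity's share of direct energy

direct_mj = TOTAL_MJ * DIRECT_SHARE                 # on-farm energy, ~6.35 MJ
indirect_mj = TOTAL_MJ - direct_mj                  # embodied in farm inputs
electricity_mj = direct_mj * ELEC_SHARE_OF_DIRECT   # electricity, ~3.81 MJ

# Main electricity-consuming processes, as shares of electricity use
process_shares = {"milk cooling": 0.31, "water heating": 0.23, "milking": 0.20}
process_mj = {name: electricity_mj * s for name, s in process_shares.items()}
```

So, of the roughly 3.8 MJ of electricity per kilogram of milk solids, a little over 1 MJ goes to milk cooling alone, which is why the paper centres its load-shifting discussion on milk harvesting.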

    Year-round patterns in bacterial production and biomass in Lake Simcoe, Ontario, Canada: are heterotrophic bacteria a significant contributor to low hypolimnetic oxygen?

    Bacteria serve an important function in aquatic environments and are associated with nutrient regeneration, carbon (C) metabolism, and secondary production. This is the first study to measure heterotrophic bacterial production, abundance, biomass, and biovolume in Lake Simcoe, Ontario. Excessive phosphorus loading resulting in low hypolimnetic oxygen concentrations has impacted the cold-water fishery in Lake Simcoe. We tested the hypothesis that bacteria contributed to dissolved oxygen declines in Lake Simcoe and examined the environmental factors impacting bacterial activity. Spatial and temporal variations in the microbial community were measured from June 2010 to July 2011. A dual-isotope method (³H-TdR and ¹⁴C-leu) was used, resulting in mean (± standard deviation) annual epilimnetic bacterial production estimates of 0.130 ± 0.173 and 0.268 ± 0.304 µg C L⁻¹ h⁻¹, respectively. The mean annual bacterial abundance was 1.49 ± 1.53 × 10⁹ cells L⁻¹, with a mean biomass of 1.21 ± 1.34 µg C L⁻¹ and a mean biovolume of 0.0043 ± 0.0030 µm³ cell⁻¹. These estimates had distinct seasonal patterns, with consistently lower bacterial activity in the winter relative to the spring, summer, and fall. Differences between epilimnetic and hypolimnetic bacterial activity were inconsistent for the 4 bacterial parameters measured. Lake temperature, chlorophyll a, and dissolved organic C concentrations were the most significant factors influencing the annual epilimnetic patterns in the examined bacterial parameters. Annual bacterial production was low in Lake Simcoe and does not seem to be a major contributor to the low hypolimnetic oxygen concentrations in the lake.

    Evidence of continued injecting drug use after attaining sustained treatment-induced clearance of the hepatitis C virus: implications for reinfection

    Background: People who inject drugs (PWID) are at the greatest risk of hepatitis C virus (HCV) infection, yet are often denied immediate treatment due to fears of ongoing risk behaviour. Our principal objective was to examine evidence of continued injecting drug use among PWID following successful treatment for HCV and attainment of a sustained viral response (SVR). Methods: PWID who attained SVR between 1992 and June 2012 were selected from the National Scottish Hepatitis C Clinical Database. Hospitalisation and mortality records were sourced for these patients using record linkage techniques. Our primary outcome variable was any hospitalisation or death that was indicative of injecting drug use post-SVR. Results: The cohort comprised 1170 PWID (mean age at SVR 39.6 y; 76% male). The Kaplan–Meier estimate of incurring the primary outcome within three years of SVR was 10.59% (95% CI, 8.75–12.79). After adjusting for confounding, the risk of an injection-related hospital episode or death post-SVR was significantly increased with advancing year of SVR (AHR: 1.07 per year; 95% CI, 1.01–1.14), with a pre-SVR acute alcohol intoxication-related hospital episode (AHR: 1.83; 95% CI, 1.29–2.60), and with a pre-SVR opiate or injection-related hospital episode (AHR: 2.59; 95% CI, 1.84–3.64). Conclusion: Despite attaining the optimal treatment outcome, these data indicate that a significant and growing minority of PWID continue to inject post-SVR at an intensity that leads to hospitalisation or death, and to an increased risk of reinfection.
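The Kaplan–Meier figure cited above comes from the standard product-limit method. As a generic illustration only (made-up follow-up data, not this study's cohort), the estimator can be sketched as:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up time per subject; events: 1 = event, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    distinct_event_times = sorted({t for t, e in zip(times, events) if e == 1})
    survival = 1.0
    curve = []
    for t in distinct_event_times:
        n_at_risk = sum(1 for tt in times if tt >= t)   # still under follow-up
        n_events = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        survival *= 1.0 - n_events / n_at_risk
        curve.append((t, survival))
    return curve
```

At each distinct event time the running survival probability is multiplied by the fraction of at-risk subjects who did not have the event; censored subjects simply leave the risk set without contributing an event, which is what makes the method suitable for cohorts with incomplete follow-up like this one.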

    Prediction of liver disease in patients whose liver function tests have been checked in primary care: model development and validation using population-based observational cohorts

    This work was supported by the UK National Health Service Research & Development Programme Health Technology Assessment Programme (project number 03/38/02) and also by the Backett Weir Russell Career Development Fellowship, University of Aberdeen. OBJECTIVE: To derive and validate a clinical prediction model to estimate the risk of liver disease diagnosis following liver function tests (LFTs) and to convert the model to a simplified scoring tool for use in primary care. DESIGN: Population-based observational cohort study of patients in Tayside, Scotland, identified as having their LFTs performed in primary care and followed for 2 years. Biochemistry data were linked to secondary care, prescription and mortality data to ascertain baseline characteristics of the derivation cohort. A separate validation cohort was obtained from 19 general practices across the rest of Scotland to externally validate the final model. SETTING: Primary care, Tayside, Scotland. PARTICIPANTS: Derivation cohort: LFT results from 310 511 patients. After exclusions (including: patients under 16 years, patients having initial LFTs measured in secondary care, bilirubin >35 μmol/L, liver complications within 6 weeks and history of a liver condition), the derivation cohort contained 95 977 patients with no clinically apparent liver condition. Validation cohort: after exclusions, this cohort contained 11 653 patients. PRIMARY AND SECONDARY OUTCOME MEASURES: Diagnosis of a liver condition within 2 years. RESULTS: From the derivation cohort (n=95 977), 481 (0.5%) were diagnosed with a liver disease. The model showed good discrimination (C-statistic=0.78). Given the low prevalence of liver disease, the negative predictive values were high. Positive predictive values were low but rose to 20–30% for high-risk patients.
CONCLUSIONS: This study successfully developed and validated a clinical prediction model and a subsequent scoring tool, the Algorithm for Liver Function Investigations (ALFI), which can predict liver disease risk in patients with no clinically obvious liver disease who had their initial LFTs taken in primary care. ALFI can help general practitioners focus referral on a small subset of patients with higher predicted risk while continuing to address modifiable liver disease risk factors in those at lower risk.
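The reported C-statistic of 0.78 measures discrimination: the probability that a randomly chosen patient who developed liver disease received a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of that concordance calculation (illustrative scores only, not the ALFI model):

```python
def c_statistic(scores, labels):
    """Concordance probability: P(score of a case > score of a non-case),
    counting tied scores as 0.5. labels: 1 = event, 0 = no event."""
    cases = [s for s, y in zip(scores, labels) if y == 1]
    noncases = [s for s, y in zip(scores, labels) if y == 0]
    concordant = 0.0
    for c in cases:
        for n in noncases:
            if c > n:
                concordant += 1.0
            elif c == n:
                concordant += 0.5
    return concordant / (len(cases) * len(noncases))
```

A value of 0.5 is no better than chance and 1.0 is perfect separation, so 0.78 represents good (though not exceptional) discrimination; note that discrimination says nothing about the low positive predictive values, which are driven by the 0.5% disease prevalence.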

    A transient positive association between direct-acting antiviral therapy for hepatitis C infection and drug-related hospitalization among people who inject drugs: self-controlled case-series analysis of national data

    Background and Aims: Direct-acting antiviral (DAA) treatment has an established positive effect on liver outcomes in people with hepatitis C infection; however, there is insufficient evidence regarding its effects on the 'extra-hepatic' outcomes of drug-related hospitalization and mortality (DRM) among people who inject drugs (PWID). We investigated associations between these outcomes and DAA treatment by comparing post-treatment to baseline periods, using a within-subjects design to minimize the selection bias concerns of cohort or case-control designs. Design: This was a self-controlled case-series study. Setting: Scotland, 1 January 2015–30 November 2020. Participants: The study population of non-cirrhotic, DAA-treated PWID was identified using a data set linking Scotland's hepatitis C diagnosis and HCV clinical databases, national inpatient/day-case hospital records and the national deaths register. Three principal outcomes (drug overdose admission, non-viral injecting-related admission and drug-related mortality) were defined using ICD codes. Measurements: Self-controlled case-series methodology was used to estimate the relative incidence (RI) of each outcome associated with time on treatment and with up to six 90-day exposure risk periods thereafter. Findings: A total of 6050 PWID were treated with DAAs in the sampling time-frame. Compared with the baseline period, there was a significantly lowered risk of a drug overdose hospital admission in the second to fifth exposure risk periods only (RI = 0.86, 95% CI = 0.80–0.99; 0.89, 95% CI = 0.80–0.99; 0.86, 95% CI = 0.77–0.96; 0.88, 95% CI = 0.78–0.99, respectively). For non-viral injecting-related admission, there was a reduced risk in the first, third and fourth exposure risk periods (RI = 0.76, 95% CI = 0.64–0.90; 0.75, 95% CI = 0.62–0.90; 0.79, 95% CI = 0.66–0.96, respectively).
There was no evidence of reduced DRM risk in any period following treatment end. Conclusions: Among people who inject drugs in Scotland, direct-acting antiviral treatment appears to be associated with a small, non-durable reduction in the risk of drug-related hospital admission, but not drug-related mortality. Direct-acting antiviral therapy, despite its high effectiveness against liver disease, does not appear to offer a panacea for reducing other drug-related health harms.
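In a self-controlled case-series, each person serves as their own control, so the relative incidence compares the event rate during exposure risk windows with the rate during baseline time in the same individuals. A crude, unadjusted sketch of that rate ratio (hypothetical aggregate counts; the published analysis uses the full SCCS conditional-likelihood machinery, not this simplification):

```python
def relative_incidence(events_risk, persontime_risk, events_base, persontime_base):
    """Crude relative incidence: events per unit person-time in the
    exposure risk window divided by events per unit person-time in
    the baseline window."""
    rate_risk = events_risk / persontime_risk
    rate_base = events_base / persontime_base
    return rate_risk / rate_base

# Hypothetical aggregate: 18 admissions over 450 person-quarters of
# post-treatment risk time vs 60 admissions over 1200 person-quarters
# of baseline time would give RI = 0.04 / 0.05 = 0.8, i.e. a 20%
# lower admission rate in the risk window.
ri = relative_incidence(18, 450, 60, 1200)
```

Because the comparison is within-person, stable characteristics (sex, deprivation, injecting history) cannot confound the estimate, which is the design's main advantage over the cohort and case-control alternatives mentioned in the abstract.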