
    The environmental impact of climate change adaptation on land use and water quality

    Encouraging adaptation is an essential aspect of the policy response to climate change. Adaptation seeks to reduce the harmful consequences and harness any beneficial opportunities arising from the changing climate. However, given that human activities are the main cause of environmental transformations worldwide, it follows that adaptation itself also has the potential to generate further pressures, creating new threats for both local and global ecosystems. From this perspective, policies designed to encourage adaptation may conflict with regulation aimed at preserving or enhancing environmental quality. This aspect of adaptation has received relatively little consideration in either policy design or academic debate. To highlight this issue, we analyse the trade-offs between two fundamental ecosystem services that will be impacted by climate change: provisioning services derived from agriculture and regulating services in the form of freshwater quality. Results indicate that climate adaptation in the farming sector will generate fundamental changes in river water quality. In some areas, policies that encourage adaptation are expected to be in conflict with existing regulations aimed at improving freshwater ecosystems. These findings illustrate the importance of anticipating the wider impacts of human adaptation to climate change when designing environmental policies.

    Minimizing the source of nociception and its concurrent effect on sensory hypersensitivity: An exploratory study in chronic whiplash patients

    Abstract. Background. The cervical zygapophyseal joints may be a primary source of pain in up to 60% of individuals with chronic whiplash associated disorders (WAD) and may be a contributing factor for peripheral and centrally mediated pain (sensory hypersensitivity). Sensory hypersensitivity has been associated with a poor prognosis. The purpose of the study was to determine if there is a change in measures indicative of sensory hypersensitivity in patients with chronic WAD grade II following a medial branch block (MBB) procedure in the cervical spine. Methods. Measures of sensory hypersensitivity were taken via quantitative sensory testing (QST) consisting of pressure pain thresholds (PPTs) and cold pain thresholds (CPTs). In patients with chronic WAD (n = 18), the measures were taken at three sites bilaterally, pre- and post-MBB. Reduced pain thresholds at remote sites have been considered an indicator of central hypersensitivity. A healthy age- and gender-matched comparison group (n = 18) was measured at baseline. An independent t-test was applied to determine whether there were any significant differences between the WAD and comparison groups at baseline with respect to cold pain and pressure pain thresholds. A dependent t-test was used to determine whether there were any significant differences between the pre- and post-intervention cold pain and pressure pain thresholds in the patients with chronic WAD. Results. At baseline, PPTs were decreased at all three sites in the WAD group (p < 0.001). Cold pain thresholds were increased in the cervical spine in the WAD group (p < 0.001). Post-MBB, the WAD group showed significant increases in PPTs at all sites (p < 0.05) and significant decreases in CPTs at the cervical spine (p < 0.001). Conclusions. The patients with chronic WAD showed evidence of widespread sensory hypersensitivity to mechanical and thermal stimuli.
The WAD group showed decreased sensory hypersensitivity following a reduction in their primary source of pain stemming from the cervical zygapophyseal joints.
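
The baseline and pre/post comparisons above can be sketched in code. This is a minimal illustration of the two t-tests named in the abstract, using hypothetical threshold readings rather than study data:

```python
import math
from statistics import mean, stdev

def independent_t(a, b):
    """Pooled-variance two-sample t statistic, as used for the
    baseline WAD vs. healthy-comparison-group contrast."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

def paired_t(pre, post):
    """Paired (dependent) t statistic for pre- vs. post-MBB thresholds."""
    diffs = [q - p for p, q in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Illustrative (not study) pressure pain thresholds; higher = less sensitive.
wad_pre = [200, 220, 210, 230]
wad_post = [250, 260, 255, 270]
print(paired_t(wad_pre, wad_post))  # positive t => thresholds rose post-MBB
```

With SciPy available, `scipy.stats.ttest_ind` and `scipy.stats.ttest_rel` would also return p-values; the stdlib-only sketch above computes the t statistics only.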

    Thoracic dysfunction in whiplash associated disorders: A systematic review

    © 2018 Heneghan et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Background Research investigating Whiplash Associated Disorder (WAD) has largely focused on the cervical spine, yet symptoms can be widespread. Thoracic spine pain prevalence is reported at ~66%; perhaps unsurprising given the forceful stretch/eccentric loading of posterior structures of the spine, and the thoracic spine’s contribution to neck mobility/function. Approximately 50% of WAD patients develop chronic pain and disability, resulting in high levels of societal and healthcare costs. It is time to look beyond the cervical spine to fully understand anatomical dysfunction in WAD and provide new directions for clinical practice and research. Purpose To evaluate the scope and nature of dysfunction in the thoracic region in patients with WAD. Methods A systematic review and data synthesis was conducted according to a pre-defined, registered (PROSPERO, CRD42015026983) and published protocol. All forms of observational study were included. A sensitive topic-based search strategy was designed from inception to 1/06/16. Databases, grey literature and registers were searched using study population terms and key words derived from a scoping search. Two reviewers independently searched information sources, assessed studies for inclusion, extracted data and assessed risk of bias. A third reviewer checked for consistency and clarity. Extracted data included summary data: sample size and characteristics, outcomes, and timescales to reflect disorder state. Risk of bias was assessed using the Newcastle-Ottawa Scale. Data were tabulated to enable a semi-qualitative comparison and grouped by outcome across studies. Strength of the overall body of evidence was assessed using a modified GRADE.
Results Thirty-eight studies (n>50,000) which were conducted across a range of countries were included. Few authors responded to requests for further data (5 of 9 contacted). Results were reported in the context of overall quality for measures of pain or dysfunction and presented, where possible, according to WAD severity and time point post injury. Key findings include: 1) high prevalence of thoracic pain (>60%); higher for those with more severe presentations and in the acute stage, 2) low prevalence of chest pain.

    MassBuilt: effectiveness of an apprenticeship site-based smoking cessation intervention for unionized building trades workers

    Blue-collar workers are difficult to reach and less likely to successfully quit smoking. The objective of this study was to test a training site-based smoking cessation intervention. This study is a randomized-controlled trial of a smoking cessation intervention that integrated occupational health concerns and was delivered in collaboration with unions to apprentices at 10 sites (n = 1,213). We evaluated smoking cessation at 1 and 6 months post-intervention. The baseline prevalence of smoking was 41%. We observed significantly higher quit rates in the intervention versus control group (26% vs. 16.8%; p = 0.014) 1 month after the intervention. However, the effects diminished over time, so that the difference in quit rate was not significant at 6 months post-intervention (9% vs. 7.2%; p = 0.48). Intervention group members nevertheless reported a significant decrease in smoking intensity (OR = 3.13; 95% CI: 1.55–6.31) at 6 months post-intervention, compared to controls. The study demonstrates the feasibility of delivering an intervention through union apprentice programs. Furthermore, the notably better 1-month quit rate and the greater decrease in smoking intensity among intervention members who continued to smoke underscore the need to develop strategies to help reduce relapse among blue-collar workers who quit smoking.
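
The reported quit-rate comparison (26% vs. 16.8%, p = 0.014 at 1 month) is the kind of result a two-proportion z-test produces. A minimal sketch, using hypothetical group sizes chosen only to match the reported percentages (the actual denominators in the trial differ):

```python
import math

def two_prop_z(x1, n1, x2, n2):
    """z statistic for comparing two independent proportions,
    using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts consistent with 26.0% vs. 16.8% quit rates.
z = two_prop_z(65, 250, 42, 250)
print(round(z, 2))  # |z| > 1.96 => significant at the 5% level
```

A z around 2.5 corresponds to a two-sided p of roughly 0.01, broadly in line with the reported p = 0.014, though the study's exact test and sample split are not given in the abstract.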

    Survival of HIV-positive patients starting antiretroviral therapy between 1996 and 2013: a collaborative analysis of cohort studies

    BACKGROUND: Health care for people living with HIV has improved substantially in the past two decades. Robust estimates of how these improvements have affected prognosis and life expectancy are of utmost importance to patients, clinicians, and health-care planners. We examined changes in 3 year survival and life expectancy of patients starting combination antiretroviral therapy (ART) between 1996 and 2013. METHODS: We analysed data from 18 European and North American HIV-1 cohorts. Patients (aged ≥16 years) were eligible for this analysis if they had started ART with three or more drugs between 1996 and 2010 and had at least 3 years of potential follow-up. We estimated adjusted (for age, sex, AIDS, risk group, CD4 cell count, and HIV-1 RNA at start of ART) all-cause and cause-specific mortality hazard ratios (HRs) for the first year after ART initiation and the second and third years after ART initiation in four calendar periods (1996–99, 2000–03 [comparator], 2004–07, 2008–10). We estimated life expectancy by calendar period of initiation of ART. FINDINGS: 88 504 patients were included in our analyses, of whom 2106 died during the first year of ART and 2302 died during the second or third year of ART. Patients starting ART in 2008–10 had lower all-cause mortality in the first year after ART initiation than did patients starting ART in 2000–03 (adjusted HR 0·71, 95% CI 0·61–0·83). All-cause mortality in the second and third years after initiation of ART was also lower in patients who started ART in 2008–10 than in those who started in 2000–03 (0·57, 0·49–0·67); this decrease was not fully explained by viral load and CD4 cell count at 1 year. Rates of non-AIDS deaths were lower in patients who started ART in 2008–10 (vs 2000–03) in the first year (0·48, 0·34–0·67) and second and third years (0·29, 0·21–0·40) after initiation of ART. 
Between 1996 and 2010, life expectancy in 20-year-old patients starting ART increased by about 9 years in women and 10 years in men. INTERPRETATION: Even in the late ART era, survival during the first 3 years of ART continues to improve, which probably reflects transition to less toxic antiretroviral drugs, improved adherence, prophylactic measures, and management of comorbidity. Prognostic models and life expectancy estimates should be updated to account for these improvements.
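
Life expectancy estimates like those above are typically derived from period life tables built on age-specific mortality rates. A minimal sketch of that calculation, assuming hypothetical flat annual death probabilities and a finite horizon rather than the cohort's actual age-specific rates:

```python
def life_expectancy(qx):
    """Remaining life expectancy from a list of annual death
    probabilities qx, assuming deaths occur mid-year on average."""
    alive = 1.0        # fraction of the cohort still alive
    expectancy = 0.0   # accumulated person-years per starting person
    for q in qx:
        expectancy += alive * (1 - q / 2)  # person-years lived this year
        alive *= 1 - q
    return expectancy

# Toy comparison over a 100-year horizon: halving a flat 2% annual
# mortality rate adds roughly 20 years of expected life.
print(life_expectancy([0.02] * 100))  # ~43 years
print(life_expectancy([0.01] * 100))  # ~63 years
```

Real analyses like this one use mortality rates that vary by age and calendar period of ART initiation; the flat-rate input here is purely illustrative.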

    A Frameshift Mutation in Golden Retriever Dogs with Progressive Retinal Atrophy Endorses SLC4A3 as a Candidate Gene for Human Retinal Degenerations

    Progressive retinal atrophy (PRA) in dogs, the canine equivalent of retinitis pigmentosa (RP) in humans, is characterised by vision loss due to degeneration of the photoreceptor cells in the retina, eventually leading to complete blindness. It affects more than 100 dog breeds, and is caused by numerous mutations. RP affects 1 in 4000 people in the Western world and 70% of causal mutations remain unknown. Canine diseases are natural models for the study of human diseases and are becoming increasingly useful for the development of therapies in humans. One variant, prcd-PRA, only accounts for a small proportion of PRA cases in the Golden Retriever (GR) breed. Using genome-wide association with 27 cases and 19 controls we identified a novel PRA locus on CFA37 (p_raw = 1.94×10⁻¹⁰, p_genome = 1.0×10⁻⁵), where a 644 kb region was homozygous within cases. A frameshift mutation was identified in a solute carrier anion exchanger gene (SLC4A3) located within this region. This variant was present in 56% of PRA cases and 87% of obligate carriers, and displayed a recessive mode of inheritance with full penetrance within those lineages in which it segregated. Allele frequencies are approximately 4% in the UK, 6% in Sweden and 2% in France, but the variant has not been found in GRs from the US. A large proportion of cases (approximately 44%) remain unexplained, indicating that PRA in this breed is genetically heterogeneous and caused by at least three mutations. SLC4A3 is important for retinal function and has not previously been associated with spontaneously occurring retinal degenerations in any other species, including humans.
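
For a fully penetrant recessive variant like this, the reported allele frequencies translate into expected carrier and affected rates under Hardy-Weinberg equilibrium (the equilibrium assumption is ours, not stated in the abstract; pedigree breeding can violate it):

```python
def hardy_weinberg(q):
    """Expected genotype frequencies for a recessive allele at
    frequency q, assuming Hardy-Weinberg equilibrium."""
    p = 1.0 - q
    return {"clear": p * p, "carrier": 2 * p * q, "affected": q * q}

# ~4% UK allele frequency reported for the SLC4A3 variant:
uk = hardy_weinberg(0.04)
print(uk["carrier"])   # ~7.7% of UK GRs expected to be carriers
print(uk["affected"])  # ~0.16% expected homozygous (at risk of PRA)
```

The contrast between a modest allele frequency and a much larger carrier pool is what motivates DNA testing of breeding stock for recessive disorders.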

    Associations of modern initial antiretroviral drug regimens with all-cause mortality in adults with HIV in Europe and North America: a cohort study

    Background: Over the past decade, antiretroviral therapy (ART) regimens that include integrase strand inhibitors (INSTIs) have become the most commonly used for people with HIV starting ART. Although trials and observational studies have compared virological failure on INSTI-based with other regimens, few data are available on mortality in people with HIV treated with INSTIs in routine care. Therefore, we compared all-cause mortality between different INSTI-based and non-INSTI-based regimens in adults with HIV starting ART from 2013 to 2018. Methods: This cohort study used data on people with HIV in Europe and North America from the Antiretroviral Therapy Cohort Collaboration (ART-CC) and UK Collaborative HIV Cohort (UK CHIC). We studied the most common third antiretroviral drugs (additional to nucleoside reverse transcriptase inhibitor) used from 2013 to 2018: rilpivirine, darunavir, raltegravir, elvitegravir, dolutegravir, efavirenz, and others. Adjusted hazard ratios (aHRs; adjusted for clinical and demographic characteristics, comorbid conditions, and other drugs in the regimen) for mortality were estimated using Cox models stratified by ART start year and cohort, with multiple imputation of missing data. Findings: 62 500 ART-naive people with HIV starting ART (12 422 [19·9%] women; median age 38 [IQR 30–48]) were included in the study. 1243 (2·0%) died during 188 952 person-years of follow-up (median 3·0 years [IQR 1·6–4·4]). There was little evidence that mortality rates differed between regimens with dolutegravir, elvitegravir, rilpivirine, darunavir, or efavirenz as the third drug. However, mortality was higher for raltegravir compared with dolutegravir (aHR 1·49, 95% CI 1·15–1·94), elvitegravir (1·86, 1·43–2·42), rilpivirine (1·99, 1·49–2·66), darunavir (1·62, 1·33–1·98), and efavirenz (2·12, 1·60–2·81) regimens. Results were similar for analyses making different assumptions about missing data and consistent across the time periods 2013–15 and 2016–18. 
Rates of virological suppression were higher for dolutegravir than other third drugs. Interpretation: This large study of patients starting ART since the introduction of INSTIs found little evidence that mortality rates differed between most first-line ART regimens; however, raltegravir-based regimens were associated with higher mortality. Although unmeasured confounding cannot be excluded as an explanation for our findings, virological benefits of first-line INSTI-based ART might not translate to differences in mortality. Funding: US National Institute on Alcohol Abuse and Alcoholism and UK Medical Research Council.

    Novel Use of Surveillance Data to Detect HIV-Infected Persons with Sustained High Viral Load and Durable Virologic Suppression in New York City

    Background: Monitoring of the uptake and efficacy of ART in a population often relies on cross-sectional data, providing limited information that could be used to design specific targeted intervention programs. Using repeated measures of viral load (VL) surveillance data, we aimed to estimate and characterize the proportion of persons living with HIV/AIDS (PLWHA) in New York City (NYC) with sustained high VL (SHVL) and durably suppressed VL (DSVL). Methods/Principal Findings: Retrospective cohort study of all persons reported to the NYC HIV Surveillance Registry who were alive and ≥12 years old by the end of 2005 and who had ≥2 VL tests in 2006 and 2007. SHVL and DSVL were defined as PLWHA with ≥2 consecutive VLs ≥100,000 copies/mL and PLWHA with all VLs ≤400 copies/mL, respectively. Logistic regression models using generalized estimating equations were used to model the association between SHVL and covariates. There were 56,836 PLWHA, of whom 7% had SHVL and 38% had DSVL. Compared to those without SHVL, persons with SHVL were more likely to be younger, black and have injection drug use (IDU) risk. PLWHA with SHVL were more likely to die by 2007 and were younger by nearly ten years, on average. Conclusions/Significance: Nearly 60% of PLWHA in 2005 had multiple VLs, of whom almost 40% had DSVL, suggesting successful ART uptake. A small proportion had SHVL, representing groups known to have suboptimal engagement in care. This group should be targeted for additional outreach to reduce morbidity and secondary transmission.
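
The GEE logistic models above report associations as odds ratios. A minimal unadjusted 2×2 odds ratio with a Wald confidence interval, using made-up counts purely for illustration (the study's adjusted, correlation-aware GEE estimates cannot be reproduced from the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] (exposed with/without
    outcome on the first row) with a Wald 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts: SHVL vs. no SHVL among persons with vs. without IDU risk.
print(odds_ratio_ci(120, 880, 300, 5700))
```

GEE adds adjustment for covariates and for the within-person correlation of repeated VL measures; the crude ratio above is only the starting point of that analysis.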

    Maternal microchimerism in the livers of patients with Biliary atresia

    BACKGROUND: Biliary atresia (BA) is a neonatal cholestatic disease of unknown etiology. It is the leading cause of liver transplantation in children. Many similarities exist between BA and graft versus host disease, suggesting engraftment of maternal cells during gestation could result in immune responses that lead to BA. The aim of this study was to determine the presence and extent of maternal microchimerism (MM) in the livers of infants with BA. METHODS: Using fluorescent in situ hybridization (FISH), 11 male BA and 4 male neonatal hepatitis (NH) livers, which served as controls, were analyzed for X and Y chromosomes. To further investigate MM in BA, 3 patients with BA, and their mothers, were HLA typed. Using immunohistochemical stains, the BA livers were examined for MM. Four additional BA livers underwent analysis by polymerase chain reaction (PCR) for evidence of MM. RESULTS: By FISH, 8 BA and 2 NH livers were interpretable. Seven of eight BA specimens showed evidence of MM. The number of maternal cells ranged from 2–4 per biopsy slide. Neither NH specimen showed evidence of MM. In addition, immunohistochemical stains confirmed evidence of MM. Using PCR, a range of 1–142 copies of maternal DNA per 25,000 copies of patient DNA was found. CONCLUSIONS: Maternal microchimerism is present in the livers of patients with BA and may contribute to the pathogenesis of BA.