61 research outputs found

    Modelling the impact of improving screening and treatment of chronic hepatitis C virus infection on future hepatocellular carcinoma rates and liver-related mortality.

    BACKGROUND: The societal, clinical and economic burden imposed by the complications of chronic hepatitis C virus (HCV) infection, including cirrhosis and hepatocellular carcinoma (HCC), is expected to increase over the coming decades. However, new therapies may improve sustained virological response (SVR) rates and shorten treatment duration. This study aimed to estimate the future burden of HCV-related disease in England if current management strategies remain the same, and the impact of increasing diagnosis and treatment of HCV as new therapies become available. METHODS: A previously published model was adapted for England using published literature and government reports, and validated through an iterative process of three meetings of HCV experts. The impact of increasing diagnosis and treatment of HCV as new therapies become available was modelled and compared to the base-case scenario of continuing current management strategies. To assess the 'best case' clinical benefit of new therapies, the number of patients treated was increased by a total of 115% by 2018. RESULTS: In the base-case scenario, total viraemic (HCV RNA-positive) cases of HCV in England will decrease from 144,000 in 2013 to 76,300 in 2030. However, due to the slow progression of chronic HCV, the number of individuals with cirrhosis, decompensated cirrhosis and HCC will continue to increase over this period. The model suggests that the 'best case' substantially reduces HCV-related hepatic disease and HCV-related liver mortality by 2020 compared to the base-case scenario: the number of HCV-related HCC cases would decrease by 50% by 2020, and the number of individuals progressing from infection to decompensated cirrhosis would decline by 65%. Therefore, compared to projections of current practices, increasing treatment numbers by 115% by 2018 would reduce HCV-related mortality by 50% by 2020. CONCLUSIONS: This analysis suggests that under current treatment practices the number of patients developing HCV-related cirrhosis, decompensated cirrhosis and HCC will increase substantially, with HCV-related liver deaths likely to double by 2030. However, increasing diagnosis and treatment rates could maximise the reduction in disease burden offered by the new therapies, potentially halving HCV-related liver mortality and HCV-related HCC by 2020.

    Contrast in Edge Vegetation Structure Modifies the Predation Risk of Natural Ground Nests in an Agricultural Landscape

    Nest predation risk generally increases nearer forest-field edges in agricultural landscapes. However, few studies test whether differences in edge contrast (i.e. hard versus soft edges based on vegetation structure and height) affect edge-related predation patterns, and whether such patterns are related to changes in nest conspicuousness between incubation and nestling feeding. Using data on 923 nesting attempts, we analyse the factors influencing nest predation risk at different edge types in an agricultural landscape for a ground-cavity-breeding bird species, the Northern Wheatear (Oenanthe oenanthe). As for many other bird species, nest predation is a major determinant of reproductive success in this migratory passerine. Nest predation risk was higher closer to woodland and crop field edges, but only when these were hard edges in terms of ground vegetation structure (a clear contrast between tall and short ground vegetation). No such edge effect was observed at soft edges, where adjacent habitats both had tall ground vegetation (crop, ungrazed grassland). This edge effect on nest predation risk was evident during the incubation stage but not the nestling feeding stage. Since wheatear nests are depredated by ground-living animals, our results demonstrate (i) that edge effects depend on edge contrast and (ii) that edge-related nest predation patterns vary across the breeding period, probably resulting from changes in parental activity at the nest between the incubation and nestling feeding stages. Edge effects should be put in the context of the nest predator community, as illustrated by the elevated nest predation risk at hard but not soft habitat edges when an edge is defined in terms of ground vegetation. These results can thus potentially explain previously observed variation in edge-related nest predation risk.

    Low Variation in the Polymorphic Clock Gene Poly-Q Region Despite Population Genetic Structure across Barn Swallow (Hirundo rustica) Populations

    Recent studies of several species have reported a latitudinal cline in the circadian clock gene, Clock, which influences rhythms in both physiology and behavior. Latitudinal variation in this gene may hence reflect local adaptation to seasonal variation. In some bird populations, there is also an among-individual association between Clock poly-Q genotype and both clutch initiation date and incubation period. We examined Clock poly-Q allele variation in the Barn Swallow (Hirundo rustica), a species with a cosmopolitan geographic distribution and considerable variation in life-history traits that may be influenced by the circadian clock. We genotyped Barn Swallows from five populations (from three subspecies) and compared variation at the Clock locus to that at microsatellite loci and mitochondrial DNA (mtDNA). We found very low variation in the Clock poly-Q region: >96% of individuals were homozygous, and the two other alleles at this locus were globally rare. Genetic differentiation based on the Clock poly-Q locus was not correlated with genetic differentiation based on either microsatellite loci or mtDNA sequences. Our results show that high diversity in Clock poly-Q is not universal across avian species. The low Clock variation against a background of heterogeneity at microsatellite and mtDNA loci in Barn Swallows may be an outcome of stabilizing selection on the Clock locus.

    The adoption of pottery on Kodiak Island: Insights from organic residue analysis

    Pottery technology, originating in Northeast Asia, appeared in Alaska some 2,800 years ago. It spread swiftly along Alaska’s coastline but was not adopted on Kodiak Island until around 500 cal BP, as part of the Koniag tradition. While pottery was used extensively in the southeast, people on the northern half of the island did not adopt the technology. What drove these patterns of adoption and non-adoption on Kodiak Island? To better understand the role of ceramic technology in the Koniag tradition, we used organic residue analysis to investigate pottery function. The results indicate that pottery was used to process aquatic resources, including anadromous fish but especially marine species. Based on archaeological and ethnographic data, and on spatial analysis of pottery distributions and function, we hypothesize that Koniag pottery was a tool integral to the rendering of whale oil on the southeast coast of Kodiak Island, supporting previous suggestions by Knecht (1995) and Fitzhugh (2001). When viewed in the broader historical context of major technological and social transformations, we conclude that social identity and cultural boundaries may also have played a role in the delayed and partial adoption of pottery on Kodiak Island.

    Dairying, diseases and the evolution of lactase persistence in Europe

    Update notice. Author Correction: Dairying, diseases and the evolution of lactase persistence in Europe (Nature, (2022), 608, 7922, (336-345), 10.1038/s41586-022-05010-7). Nature, Volume 609, Issue 7927, Page E9, 15 September 2022.

    In European and many African, Middle Eastern and southern Asian populations, lactase persistence (LP) is the most strongly selected monogenic trait to have evolved over the past 10,000 years(1). Although the selection of LP and the consumption of prehistoric milk must be linked, considerable uncertainty remains concerning their spatiotemporal configuration and specific interactions(2,3). Here we provide detailed distributions of milk exploitation across Europe over the past 9,000 years using around 7,000 pottery fat residues from more than 550 archaeological sites. European milk use was widespread from the Neolithic period onwards but varied spatially and temporally in intensity. Notably, LP selection varying with levels of prehistoric milk exploitation is no better at explaining LP allele frequency trajectories than uniform selection since the Neolithic period. In the UK Biobank(4,5) cohort of 500,000 contemporary Europeans, LP genotype was only weakly associated with milk consumption and did not show consistent associations with improved fitness or health indicators. This suggests that explanations other than the everyday consumption of milk should be considered for the rapid increase in LP frequency. We propose that lactase non-persistent individuals consumed milk when it became available but that, under conditions of famine and/or increased pathogen exposure, this was disadvantageous, driving LP selection in prehistoric Europe. Comparison of model likelihoods indicates that population fluctuations, settlement density and wild animal exploitation (proxies for these drivers) provide better explanations of LP selection than the extent of milk exploitation. These findings offer new perspectives on prehistoric milk exploitation and LP evolution.

    Global prevalence and genotype distribution of hepatitis C virus infection in 2015 : A modelling study

    Background: The 69th World Health Assembly approved the Global Health Sector Strategy to eliminate hepatitis C virus (HCV) infection by 2030, which can become a reality with the recent launch of direct-acting antiviral therapies. Reliable disease burden estimates are required for national strategies. This analysis estimates the global prevalence of viraemic HCV at the end of 2015, an update of, and expansion on, the 2014 analysis, which reported 80 million (95% CI 64-103) viraemic infections in 2013. Methods: We developed country-level disease burden models following a systematic review of HCV prevalence (number of studies, n=6,754) and genotype (n=11,342) studies published after 2013. A Delphi process was used to gain country expert consensus and validate inputs. Published estimates alone were used for countries where expert panel meetings could not be scheduled. Global prevalence was estimated using regional averages for countries without data. Findings: Models were built for 100 countries, 59 of which were approved by country experts; the remaining 41 were estimated using published data alone. All other countries had insufficient data to create a model. The global prevalence of viraemic HCV is estimated to be 1.0% (95% uncertainty interval 0.8-1.1) in 2015, corresponding to 71.1 million (62.5-79.4) viraemic infections. Genotypes 1 and 3 were the most common causes of infection (44% and 25%, respectively). Interpretation: The global estimate of viraemic infections is lower than previous estimates, largely due to more recent (lower) prevalence estimates in Africa. Additionally, increased mortality due to liver-related causes and an ageing population may have contributed to a reduction in infections. Funding: John C Martin Foundation.

    The Liverpool alcohol-related liver disease algorithm identifies twice as many emergency admissions compared to standard methods when applied to Hospital Episode Statistics for England

    Background: Emergency admissions in England for alcohol-related liver disease (ArLD) have increased steadily for decades. Statistics based on administrative data typically focus on the ArLD-specific code as the primary diagnosis and are therefore at risk of excluding ArLD admissions defined by other coding combinations. Aim: To deploy the Liverpool ArLD Algorithm (LAA), which accounts for alternative coding patterns (e.g., an ArLD secondary diagnosis with an alcohol- or liver-related primary diagnosis), to national and local datasets in the context of studying trends in ArLD admissions before and during the COVID-19 pandemic. Methods: We applied the standard approach and the LAA to Hospital Episode Statistics for England (2013-21). The algorithm was also deployed at 28 hospitals to discharge coding for emergency admissions during a common 7-day period in 2019 and 2020, in which eligible patient records were reviewed manually to verify the diagnosis and extract data. Results: Nationally, the LAA identified approximately 100% more monthly emergency admissions from 2013 to 2021 than the standard method. The annual number of ArLD-specific admissions increased by 30.4%. Of 39,667 admissions in 2020/21, only 19,949 were identified with the standard approach, corresponding to an estimated £70 million in admission costs for under-recorded cases. Within the 28 local hospital datasets, 233 admissions were identified using the standard approach and a further 250 locally verified cases using the LAA (a 107% uplift). There was an 18% absolute increase in ArLD admissions in the 7-day evaluation period in 2020 versus 2019. There were no differences in disease severity or mortality, or in the proportion of admissions with decompensation of cirrhosis or alcoholic hepatitis. Conclusions: The LAA can be applied successfully to local and national datasets. It consistently identifies approximately 100% more cases than the standard coding approach and has revealed the true extent of ArLD admissions. The pandemic has compounded a long-term rise in ArLD admissions and mortality.
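
    To make the coding-combination idea concrete, the sketch below shows one way such a check could look in Python. It is an illustration only: the ICD-10 code sets (K70 for alcoholic liver disease; F10/T51 as alcohol-related; K72/K74/I85 as liver-related) are plausible examples, not the published LAA's validated lists.

        # A minimal sketch of the coding-combination idea behind the LAA.
        # The code sets below are illustrative ICD-10 blocks, NOT the
        # published algorithm's validated lists.

        ARLD_CODES = {"K70"}                 # alcoholic liver disease
        ALCOHOL_CODES = {"F10", "T51"}       # alcohol misuse / toxic effect of alcohol
        LIVER_CODES = {"K72", "K74", "I85"}  # hepatic failure, cirrhosis, varices

        def is_arld_standard(diagnoses: list[str]) -> bool:
            """Standard method: count only an ArLD-specific primary diagnosis."""
            return bool(diagnoses) and diagnoses[0][:3] in ARLD_CODES

        def is_arld_laa(diagnoses: list[str]) -> bool:
            """LAA-style method: also count an ArLD secondary diagnosis when
            the primary diagnosis is itself alcohol- or liver-related."""
            if not diagnoses:
                return False
            if is_arld_standard(diagnoses):
                return True
            primary = diagnoses[0][:3]
            secondaries = {d[:3] for d in diagnoses[1:]}
            return primary in ALCOHOL_CODES | LIVER_CODES and bool(secondaries & ARLD_CODES)

        # Missed by the standard method, captured by the broader check:
        episode = ["F10.3", "K70.3"]  # alcohol withdrawal (primary), alcoholic cirrhosis (secondary)
        assert not is_arld_standard(episode)
        assert is_arld_laa(episode)

    In this toy example, an admission with alcohol withdrawal as the primary diagnosis and alcoholic cirrhosis as a secondary diagnosis is invisible to the standard primary-code method but counted by the broader combination check, which is the kind of case the abstract attributes the roughly 100% uplift to.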

    Scenario-led habitat modelling of land use change impacts on key species

    Accurate predictions of the impacts of future land use change on species of conservation concern can help to inform policy-makers and improve conservation measures. If predictions are spatially explicit, the predicted consequences of likely land use changes could be accessible to land managers at a scale relevant to their working landscape. We introduce a method, based on open source software, which integrates habitat suitability modelling with scenario-building, and illustrate its use by investigating the effects of alternative land use change scenarios on landscape suitability for black grouse Tetrao tetrix. Expert opinion was used to construct five near-future (twenty years) scenarios for the 800 km² study site in upland Scotland. For each scenario, the cover of different land use types was altered by 5-30% from 20 random starting locations, and changes in habitat suitability were assessed by projecting a MaxEnt suitability model onto each simulated landscape. A scenario converting grazed land to moorland and open forestry was the most beneficial for black grouse, and 'increased grazing' (the opposite conversion) the most detrimental. The positioning of new landscape blocks was shown to be important in some situations. Increasing the area of open-canopy forestry caused a proportional decrease in suitability, but suitability gains for the 'reduced grazing' scenario were nonlinear. 'Scenario-led' landscape simulation models can be applied in assessments of the impacts of land use change on individual species as well as on diversity and community measures or ecosystem services. A next step would be to include landscape configuration more explicitly in the simulation models, both to make them more realistic and to examine the effects of habitat placement more thoroughly. In this example, the recommended policy would be incentives for grazing reduction to benefit black grouse.
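
    As a rough illustration of the scenario-projection loop described above, the following Python sketch perturbs a land-cover raster from random starting cells and scores each simulated landscape with a stand-in suitability model. Everything here (the class codes, the toy suitability function, the distance-based block growth) is an assumption for illustration; the study itself used MaxEnt models and expert-built scenarios.

        # Schematic of a scenario-projection loop (not the authors' code):
        # perturb land cover from random seed cells, then score each
        # simulated landscape with a pre-fitted suitability model.
        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_scenario(cover, from_class, to_class, frac, n_seeds=20):
            """Convert `frac` of `from_class` cells to `to_class`, growing the
            change outward from `n_seeds` random starting cells (flipping the
            eligible cells nearest any seed, a crude stand-in for block growth)."""
            new = cover.copy()
            eligible = np.argwhere(cover == from_class)
            n_change = int(frac * len(eligible))
            seeds = eligible[rng.choice(len(eligible), size=n_seeds, replace=False)]
            dist = np.linalg.norm(eligible[:, None, :] - seeds[None, :, :], axis=2).min(axis=1)
            for r, c in eligible[np.argsort(dist)[:n_change]]:
                new[r, c] = to_class
            return new

        # `suitability` stands in for the fitted MaxEnt model; this toy version
        # simply scores moorland (class 2) above grazed land (class 1).
        suitability = lambda cover: (cover == 2).astype(float)

        baseline = rng.integers(1, 4, size=(100, 100))  # toy raster, classes 1-3
        scenario = simulate_scenario(baseline, from_class=1, to_class=2, frac=0.15)
        print(suitability(scenario).mean() - suitability(baseline).mean())

    Repeating the conversion from many random starting locations, as the abstract describes, would turn the single difference printed here into a distribution of suitability changes per scenario.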

    Corrigendum to ‘An international genome-wide meta-analysis of primary biliary cholangitis: Novel risk loci and candidate drugs’ [J Hepatol 2021;75(3):572–581]
