
    Pandemic Influenza and Covid-19: Geographical Velocity and Control


    Spatial Growth Rate of Emerging SARS-CoV-2 Lineages in England, September 2020-December 2021

    This paper uses a robust method of spatial epidemiological analysis to assess the spatial growth rate of multiple lineages of SARS-CoV-2 in the local authority areas of England, September 2020-December 2021. Using the genomic surveillance records of the COVID-19 Genomics UK (COG-UK) Consortium, the analysis identifies a substantial (7.6-fold) difference in the average rate of spatial growth of 37 sample lineages, from the slowest (Delta AY.4.3) to the fastest (Omicron BA.1). Spatial growth of the Omicron (B.1.1.529 and BA) variant was found to be 2.81× faster than the Delta (B.1.617.2 and AY) variant and 3.76× faster than the Alpha (B.1.1.7 and Q) variant. In addition to AY.4.2 (a designated variant under investigation, VUI-21OCT-01), three Delta sublineages (AY.43, AY.98 and AY.120) were found to display a statistically faster rate of spatial growth than the parent lineage and would seem to merit further investigation. We suggest that the monitoring of spatial growth rates is a potentially valuable adjunct to outbreak response procedures for emerging SARS-CoV-2 variants in a defined population.

    Experimental analysis of the effect of taxes and subsidies on calories purchased in an online supermarket

    Taxes and subsidies are a public health approach to improving the nutrient quality of food purchases. While taxes or subsidies influence purchasing, it is unclear whether they influence total energy or overall diet quality of foods purchased. Using a within-subjects design, selected low nutrient-dense foods (e.g. sweetened beverages, candy, salty snacks) were taxed, and fruits, vegetables and bottled water were subsidized by 12.5% or 25% in comparison to a usual price condition for 199 female shoppers in an experimental store. Results showed taxes reduced calories purchased of taxed foods (coefficient = -6.61, CI = -11.94 to -1.28) and subsidies increased calories purchased of subsidized foods (coefficient = 13.74, CI = 8.51 to 18.97). However, no overall effect was observed on total calories purchased. Both taxes and subsidies were associated with a reduction in calories purchased for grains (taxes: coefficient = -6.58, CI = -11.91 to -1.24; subsidies: coefficient = -12.86, CI = -18.08 to -7.63) and subsidies were associated with a reduction in calories purchased for miscellaneous foods (coefficient = -7.40, CI = -12.62 to -2.17) (mostly fats, oils and sugars). Subsidies improved the nutrient quality of foods purchased (coefficient = 0.14, CI = 0.07 to 0.21). These results suggest that taxes and subsidies can influence energy purchased for products taxed or subsidized, but not total energy purchased. However, the improvement in nutrient quality with subsidies indicates that pricing can shift the nutritional quality of foods purchased. Research is needed to evaluate whether differential pricing strategies based on nutrient quality are associated with a reduction in calories and an improvement in the nutrient quality of foods purchased.
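    A quick way to read the regression results quoted above is to check whether each reported 95% confidence interval excludes zero, which is what makes an effect statistically significant at the 5% level. A minimal sketch, using only the interval endpoints given in the abstract:

```python
# Significance check on the 95% confidence intervals reported in the
# abstract: an interval lying entirely on one side of zero indicates a
# statistically significant effect at the 5% level.
reported = {
    "taxed foods (tax)":          (-6.61, (-11.94, -1.28)),
    "subsidized foods (subsidy)": (13.74, (8.51, 18.97)),
    "grains (tax)":               (-6.58, (-11.91, -1.24)),
    "grains (subsidy)":           (-12.86, (-18.08, -7.63)),
    "misc. foods (subsidy)":      (-7.40, (-12.62, -2.17)),
}

def excludes_zero(lo, hi):
    """True when the whole interval lies on one side of zero."""
    return lo > 0 or hi < 0

for name, (coef, (lo, hi)) in reported.items():
    print(f"{name}: coefficient {coef}, significant = {excludes_zero(lo, hi)}")
```

    Every interval above excludes zero, consistent with the abstract reporting each of these as an observed effect; the null result for total calories purchased is the one contrast whose interval would not pass this check.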

    Abrupt transition to heightened poliomyelitis epidemicity in England and Wales, 1947–1957, associated with a pronounced increase in the geographical rate of disease propagation

    The abrupt transition to heightened poliomyelitis epidemicity in England and Wales, 1947–1957, was associated with a profound change in the spatial dynamics of the disease. Drawing on the complete record of poliomyelitis notifications in England and Wales, we use a robust method of spatial epidemiological analysis (swash-backwash model) to evaluate the geographical rate of disease propagation in successive poliomyelitis seasons, 1940–1964. Comparisons with earlier and later time periods show that the period of heightened poliomyelitis epidemicity corresponded with a sudden and pronounced increase in the spatial rate of disease propagation. This change was observed for both urban and rural areas and points to an abrupt enhancement in the propensity for the geographical spread of polioviruses. Competing theories of the epidemic emergence of poliomyelitis in England and Wales should be assessed in the light of this evidence.

    Reexamining evidence-based practice in community corrections: beyond 'a confined view' of what works

    This article aims to reexamine the development and scope of evidence-based practice (EBP) in community corrections by exploring three sets of issues. Firstly, we examine the contested purposes of community supervision and their relationship to questions of evidence. Secondly, we explore the range of forms of evidence that might inform the pursuit of one purpose of supervision—the rehabilitation of offenders—making the case for a fuller engagement with “desistance” research in supporting this process. Thirdly, we examine who can and should be involved in conversations about EBP, arguing that both ex/offenders’ and practitioners’ voices need to be respected and heard in this debate.

    Improving recruitment to a study of telehealth management for long-term conditions in primary care: two embedded, randomised controlled trials of optimised patient information materials

    Background: Patient understanding of study information is fundamental to gaining informed consent to take part in a randomised controlled trial. In order to meet the requirements of research ethics committees, patient information materials can be long and need to communicate complex messages. There is concern that standard approaches to providing patient information may deter potential participants from taking part in trials. The Systematic Techniques for Assisting Recruitment to Trials (MRC-START) research programme aims to test interventions to improve trial recruitment. The aim of this study was to investigate the effect on recruitment of optimised patient information materials (with improved readability and ease of comprehension) compared with standard materials. The study was embedded within two primary care trials involving patients with long-term conditions. Methods: The Healthlines Study involves two linked trials evaluating a telehealth intervention in patients with depression (Healthlines Depression) or raised cardiovascular disease risk (Healthlines CVD). We conducted two trials of a recruitment intervention, embedded within the Healthlines host trials. Patients identified as potentially eligible in each of the Healthlines trials were randomised to receive either the original patient information materials or optimised versions of these materials. Primary outcomes were the proportion of participants randomised (Healthlines Depression) and the proportion expressing interest in taking part (Healthlines CVD). Results: In Healthlines Depression (n = 1364), 6.3 % of patients receiving the optimised patient information materials were randomised into the study compared to 4.0 % in those receiving standard materials (OR = 1.63, 95 % CI = 1.00 to 2.67). 
In Healthlines CVD (n = 671), 24.0 % of those receiving optimised patient information materials responded positively to the invitation to participate, compared to 21.9 % of those receiving standard materials (OR = 1.12, 95 % CI = 0.78 to 1.61). Conclusions: Evidence from these two embedded trials suggests limited benefits of optimised patient information materials on recruitment rates, which may be apparent only in some patient populations, with no effects on other outcomes. Further embedded trials are needed to provide a more precise estimate of effect, and to explore further how effects vary by trial context, intervention, and patient population.
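    The unadjusted odds ratio in the Healthlines Depression comparison can be approximately reconstructed from the two percentages quoted above. A minimal sketch (the paper's exact group counts would give the reported OR of 1.63; rounding of the percentages makes this approximation slightly lower):

```python
# Reconstructing the unadjusted odds ratio for Healthlines Depression
# from the percentages quoted in the abstract: 6.3% randomised with
# optimised materials vs 4.0% with standard materials.
def odds_ratio(p_treatment, p_control):
    """Odds ratio between two proportions: odds(treatment) / odds(control)."""
    return (p_treatment / (1 - p_treatment)) / (p_control / (1 - p_control))

# Approximation from rounded percentages; the exact counts yield 1.63.
print(round(odds_ratio(0.063, 0.040), 2))  # 1.61, close to the reported 1.63
```

    Note that the lower bound of the reported interval (95 % CI = 1.00 to 2.67) sits exactly at 1.00, which is why the abstract describes the benefit as limited rather than clearly established.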

    The Debrisoft® monofilament debridement pad for use in acute or chronic wounds: A NICE medical technology guidance

    As part of its Medical Technology Evaluation Programme, the National Institute for Health and Care Excellence (NICE) invited a manufacturer to provide clinical and economic evidence for the evaluation of the Debrisoft® monofilament debridement pad for use in acute or chronic wounds. The University of Birmingham and Brunel University, acting as a consortium, was commissioned to act as an External Assessment Centre (EAC) for NICE, independently appraising the submission. This article is an overview of the original evidence submitted, the EAC’s findings and the final NICE guidance issued. The sponsor submitted a simple cost analysis to estimate the costs of using Debrisoft® to debride wounds compared with saline and gauze, hydrogel and larvae. Separate analyses were conducted for application in the home and in a clinic setting. The analysis took a UK National Health Service (NHS) perspective. It incorporated the costs of the technologies and supplementary technologies (such as dressings) and the costs of their application by a district nurse. The sponsor concluded that Debrisoft® was cost saving relative to the comparators. The EAC made amendments to the sponsor analysis to correct for errors and to reflect alternative assumptions. Debrisoft® remained cost saving in most analyses and savings ranged from £77 to £222 per patient compared with hydrogel, from £97 to £347 compared with saline and gauze, and from £180 to £484 compared with larvae, depending on the assumptions included in the analysis and whether debridement took place in a home or clinic setting. All analyses were severely limited by the available data on effectiveness, in particular a lack of comparative studies and the fact that the effectiveness data for the comparators came from studies reporting different clinical endpoints from those used for Debrisoft®.
The Medical Technologies Advisory Committee made a positive recommendation for adoption of Debrisoft® and this has been published as a NICE medical technology guidance (MTG17). The Birmingham and Brunel Consortium is funded by NICE to act as an External Assessment Centre for the Medical Technologies Evaluation Programme.

    Regulation of hepatic cytochrome P450 expression in mice with intestinal or systemic infections of Citrobacter rodentium

    We reported previously that infection of C3H/HeOuJ (HeOu) mice with the murine intestinal pathogen Citrobacter rodentium caused a selective modulation of hepatic cytochrome P450 (P450) gene expression that was independent of Toll-like receptor 4. However, HeOu mice are much more sensitive to the pathogenic effects of C. rodentium infection, and the P450 down-regulation was associated with significant morbidity in the animals. Here, we report that oral infection of C57BL/6 mice with C. rodentium, which produced only mild clinical signs and symptoms, produced very similar effects on hepatic P450 expression in this strain. As in HeOu mice, CYP4A mRNAs and proteins were among the most sensitive to down-regulation, whereas CYP4F18 was induced. CYP2D9 mRNA was also induced 8- to 9-fold in the C57BL/6 mice. The time course of P450 regulation followed that of colonic inflammation and bacterial colonization, peaking at 7 to 10 days after infection and returning to normal at 15 to 24 days as the infection resolved. These changes also correlated with the time course of significant elevations in the serum of the proinflammatory cytokines interleukin (IL)-6 and tumor necrosis factor-alpha, as well as of interferon-gamma and IL-2, with serum levels of IL-6 being markedly higher than those of the other cytokines. Intraperitoneal administration of C. rodentium produced a rapid down-regulation of P450 enzymes that was quantitatively and qualitatively different from that of oral infection, although CYP2D9 was induced in both models, suggesting that the effects of oral infection on the liver are not due to bacterial translocation.

    Epithelium-off corneal cross-linking surgery compared with standard care in 10- to 16-year-olds with progressive keratoconus: the KERALINK RCT

    Background: Keratoconus is a disease of the cornea affecting vision that is usually first diagnosed in the first three decades of life. The abnormality of corneal shape and thickness tends to progress until the patient reaches approximately 30 years of age. Epithelium-off corneal cross-linking is a procedure that has been demonstrated to be effective in randomised trials in adults and observational studies in young patients. // Objectives: The KERALINK trial examined the efficacy and safety of epithelium-off corneal cross-linking, compared with standard care by spectacle or contact lens correction, for stabilisation of progressive keratoconus. // Design: In this observer-masked, randomised, controlled, parallel-group superiority trial, 60 participants aged 10–16 years with progressive keratoconus were randomised; 58 participants completed the study. Progression was defined as a 1.5 D increase in corneal power measured by maximum or mean power (K2) in the steepest corneal meridian in the study eye, measured by corneal tomography. // Setting: Referral clinics in four UK hospitals. // Interventions: Participants were randomised to corneal cross-linking plus standard care or standard care alone, with spectacle or contact lens correction as necessary for vision, and were monitored for 18 months. // Main outcome measures: The primary outcome was K2 in the study eye as a measure of the steepness of the cornea at 18 months post randomisation. Secondary outcomes included keratoconus progression, visual acuity, keratoconus apex corneal thickness and quality of life. // Results: Of the 60 participants, 30 were randomised to each of the corneal cross-linking and standard-care groups. Of these, 30 patients in the corneal cross-linking group and 28 patients in the standard-care group were analysed. The mean (standard deviation) K2 in the study eye at 18 months post randomisation was 49.7 D (3.8 D) in the corneal cross-linking group and 53.4 D (5.8 D) in the standard-care group. 
The adjusted mean difference in K2 in the study eye was –3.0 D (95% confidence interval –4.93 D to –1.08 D; p = 0.002), favouring corneal cross-linking. Uncorrected and corrected differences in logMAR vision at 18 months were better in eyes receiving corneal cross-linking: –0.31 (95% confidence interval –0.50 to –0.11; p = 0.002) and –0.30 (95% confidence interval –0.48 to –0.11; p = 0.002). Keratoconus progression in the study eye occurred in two patients (7%) randomised to corneal cross-linking compared with 12 (43%) patients randomised to standard care. The unadjusted odds ratio suggests that, on average, patients in the corneal cross-linking group had 90% (odds ratio 0.1, 95% confidence interval 0.02 to 0.48; p = 0.004) lower odds of experiencing progression than those receiving standard care. Quality-of-life outcomes were similar in both groups. No adverse events were attributable to corneal cross-linking. // Limitations: Measurements of K2 in those eyes with the most significant progression were in some cases indicated as suspect by corneal topography device software. // Conclusions: Corneal cross-linking arrests progression of keratoconus in the great majority of young patients. These data support a consideration of a change in practice, such that corneal cross-linking could be considered as first-line treatment in progressive disease. If the arrest of keratoconus progression induced by corneal cross-linking is sustained in longer follow-up, there may be particular benefit in avoiding the later requirement for contact lens wear or corneal transplantation. However, keratoconus does not continue to progress in all patients receiving standard care. For future work, the most important questions to be answered are whether or not (1) the arrest of keratoconus progression induced by corneal cross-linking is maintained in the long term and (2) the proportion of those receiving standard care who show significant progression increases with time.
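    The unadjusted odds ratio of 0.1 quoted above follows directly from the event counts in the abstract (2 of 30 analysed cross-linking patients progressed vs 12 of 28 on standard care). A minimal sketch of that arithmetic, ignoring any covariate adjustment:

```python
# Checking the KERALINK progression odds ratio from the counts in the
# abstract: 2 of 30 cross-linking patients progressed vs 12 of 28 on
# standard care (unadjusted 2x2-table calculation).
def odds_ratio_from_counts(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio: odds in group A divided by odds in group B."""
    odds_a = events_a / (n_a - events_a)   # 2 / 28
    odds_b = events_b / (n_b - events_b)   # 12 / 16
    return odds_a / odds_b

or_progression = odds_ratio_from_counts(2, 30, 12, 28)
print(round(or_progression, 2))  # 0.1, i.e. roughly 90% lower odds of progression
```

    An odds ratio of 0.1 is what licenses the abstract's statement of "90% lower odds": the cross-linking group's odds of progression are one tenth of the standard-care group's.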