
    Cost–benefit analysis of surveillance for surgical site infection following caesarean section

    Objective To estimate the economic burden to the health service of surgical site infection following caesarean section and to identify potential savings achievable through implementation of a surveillance programme. Design Economic model to evaluate the costs and benefits of surveillance from community and hospital healthcare providers' perspective. Setting England. Participants Women undergoing caesarean section in National Health Service hospitals. Main outcome measure Costs attributable to treatment and management of surgical site infection following caesarean section. Results The costs (2010) for a hospital carrying out 800 caesarean sections a year, based on an infection risk of 9.6%, were estimated at £18,914 (95% CI 11,521 to 29,499), with 28% accounted for by community care (£5,370). With inflation to 2019 prices, this equates to an estimated cost of £5.0m for all caesarean sections performed annually in England in 2018-19, approximately £1,866 and £93 per infection managed in hospital and in the community respectively. The cost of surveillance for a hospital for one calendar quarter was estimated as £3,747 (2010 costs). Modelling a decrease in infection risk of 30, 20 or 10% between successive surveillance periods indicated that a variable intermittent surveillance strategy achieved net savings higher than or similar to continuous surveillance. Breakeven was reached sooner with the variable surveillance strategy than with continuous surveillance when the baseline risk of infection was 10 or 15%, and losses were smaller with a baseline risk of 5%. Conclusion Surveillance of surgical site infections after caesarean section, with feedback of data to surgical teams, offers a potentially effective means to reduce infection risk, improve patient experience and save money for the health service. Strengths and limitations:
    • The model estimated both community (28%) and hospital (72%) costs, providing a more representative estimate of the overall economic burden to the health service.
    • Time-matching of patients with and without infection according to length of post-operative stay provided a more accurate assessment of the excess bed-days attributable to surgical site infection (2.6 days) than a comparison of average excess length of stay (median difference 5 days), by disentangling the impact of prolonged length of stay on the increased chance of detecting an infection.
    • Through capture and assessment of the costs and impact of surveillance, our model demonstrated the potential for savings through reductions in the incidence of surgical site infections.
    • Costs were obtained from NHS National Schedule Reference Costs and other sources rather than observed expenditure, and assumptions were made about the number of extra midwife and general practitioner appointments resulting from infection.
    • The study was based on healthcare utilisation and did not assess direct and indirect costs borne by patients or their carers.
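    As a rough illustration of the breakeven comparison described above, the sketch below uses the abstract's headline parameters (800 sections/year, 9.6% baseline risk, £18,914 annual burden, £3,747 surveillance cost per quarter). The linear cost model, the assumption that £18,914 is an annual total, the alternating "intermittent" schedule, and all function names are illustrative assumptions, not the published model.

```python
# Illustrative breakeven sketch for SSI surveillance after caesarean section.
# Parameters from the abstract (2010 costs); the model structure is a
# simplifying assumption, not the authors' published model.

SECTIONS_PER_QUARTER = 800 / 4               # hospital doing 800 sections/year
COST_PER_INFECTION = 18_914 / (800 * 0.096)  # crude implied average (~£246),
                                             # assuming £18,914 is annual
SURVEILLANCE_COST = 3_747                    # per hospital per quarter

def quarterly_cost(risk: float, surveilled: bool) -> float:
    """Expected SSI cost for one quarter, plus surveillance cost if active."""
    cost = SECTIONS_PER_QUARTER * risk * COST_PER_INFECTION
    return cost + (SURVEILLANCE_COST if surveilled else 0.0)

def cumulative_costs(baseline_risk: float, schedule: list[bool],
                     reduction: float = 0.30) -> list[float]:
    """Cumulative cost by quarter; risk falls by `reduction` after each
    surveilled quarter (assumption: feedback drives the decrease)."""
    risk, total, out = baseline_risk, 0.0, []
    for surveilled in schedule:
        total += quarterly_cost(risk, surveilled)
        if surveilled:
            risk *= 1 - reduction
        out.append(total)
    return out

# Compare continuous surveillance with an alternating intermittent schedule
# against a no-surveillance baseline, at 10% baseline risk.
quarters = 8
baseline = cumulative_costs(0.10, [False] * quarters)
continuous = cumulative_costs(0.10, [True] * quarters)
intermittent = cumulative_costs(0.10, [q % 2 == 0 for q in range(quarters)])
for q in range(quarters):
    print(f"Q{q + 1}: saving vs baseline — continuous "
          f"£{baseline[q] - continuous[q]:,.0f}, "
          f"intermittent £{baseline[q] - intermittent[q]:,.0f}")
```

    Under this toy model the surveillance arms run at a loss for the first quarter or two (the £3,747 cost exceeds the avoided infection costs) and then overtake the baseline as the risk reduction compounds, which is the breakeven behaviour the abstract describes.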

    Planning Framework Options for The Massachusetts Ocean Plan (DRAFT)

    The Massachusetts Ocean Partnership (MOP) Planning Frameworks Team, in consultation with the Massachusetts Executive Office of Energy and Environmental Affairs (EEA), and based on collective experience and a review of ocean, coastal and resource management programs from the US and other countries, suggests that nine elements are essential components of the framework for the Massachusetts Ocean Plan and its implementation. While management plans and programs generally have these elements in common, there are a range of options for carrying out each program component. These options were presented to structure and inform the development of the Massachusetts Ocean Plan. For the most part, the range of options represents those that were considered to be appropriate under the Commonwealth's existing legal and administrative structure and responsive to the requirements of the Massachusetts Ocean Act. However, the general concepts these options represent are likely to be transferable to other jurisdictions (especially in the United States) and can inform future ocean management and planning in Massachusetts. Additionally, options or their core elements can be combined to create additional alternatives within one of the nine planning components.

    Long-Term Continental Changes in Wing Length, but Not Bill Length, of a Long-Distance Migratory Shorebird

    We compiled a >50-year record of morphometrics for semipalmated sandpipers (Calidris pusilla), a shorebird species with a Nearctic breeding distribution and intercontinental migration to South America. Our data included >57,000 individuals captured 1972–2015 at five breeding locations and three major stopover sites, plus 139 museum specimens collected in earlier decades. Wing length increased by ca. 1.5 mm (>1%) prior to 1980, followed by a decrease of 3.85 mm (nearly 4%) over the subsequent 35 years. This can account for previously reported changes in metrics at a migratory stopover site from 1985 to 2006. Wing length decreased at a rate of 1,098 darwins, or 0.176 haldanes, within the ranges of other field studies of phenotypic change. Bill length, in contrast, showed no consistent change over the full period of our study. Decreased body size as a universal response of animal populations to climate warming, and several other potential mechanisms, are unable to account for the increasing and decreasing wing length pattern observed. We propose that the post-WWII near-extirpation of falcon populations and their post-1973 recovery driven by the widespread use and subsequent limitation on DDT in North America selected initially for greater flight efficiency and latterly for greater agility. This predation danger hypothesis accounts for many features of the morphometric data and deserves further investigation in this and other species.
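    For reference, darwins and haldanes are the two standard units for rates of phenotypic change quoted above; their usual definitions are sketched below. The trait values in the worked illustration are assumed (a wing of roughly 100 mm), not the study's raw data.

```latex
% Standard definitions of the two evolutionary-rate units.
% Darwins: proportional change per million years; haldanes: change in
% pooled standard deviations (s_p) per generation (g generations elapsed).
\[
  d \;=\; \frac{\left|\,\ln \bar{x}_2 - \ln \bar{x}_1\,\right|}{\Delta t_{\mathrm{Myr}}},
  \qquad
  h \;=\; \frac{\left|\,\bar{x}_2 - \bar{x}_1\,\right|}{s_p \, g}
\]
% Illustration with assumed values: a 3.85 mm decline from a ~100 mm wing
% over 35 years gives
%   d = ln(100/96.15) / (35 \times 10^{-6}) \approx 1.1 \times 10^{3} darwins,
% the order of magnitude reported above. Recovering the reported 0.176
% haldanes would additionally require s_p and the generation count, which
% the abstract does not give.
```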

    Dipeptidyl peptidase-1 inhibition in patients hospitalised with COVID-19: a multicentre, double-blind, randomised, parallel-group, placebo-controlled trial

    Background Neutrophil serine proteases are involved in the pathogenesis of COVID-19, and increased serine protease activity has been reported in severe and fatal infection. We investigated whether brensocatib, an inhibitor of dipeptidyl peptidase-1 (DPP-1; an enzyme responsible for the activation of neutrophil serine proteases), would improve outcomes in patients hospitalised with COVID-19. Methods In a multicentre, double-blind, randomised, parallel-group, placebo-controlled trial across 14 hospitals in the UK, patients aged 16 years and older who were hospitalised with COVID-19 and had at least one risk factor for severe disease were randomly assigned 1:1, within 96 h of hospital admission, to once-daily brensocatib 25 mg or placebo orally for 28 days. Patients were randomly assigned via a central web-based randomisation system (TruST). Randomisation was stratified by site and age (<65 years or ≥65 years), and within each stratum, blocks were of random sizes of two, four, or six patients. Participants in both groups continued to receive other therapies required to manage their condition. Participants, study staff, and investigators were masked to the study assignment. The primary outcome was the 7-point WHO ordinal scale for clinical status at day 29 after random assignment. The intention-to-treat population included all patients who were randomly assigned and met the enrolment criteria. The safety population included all participants who received at least one dose of study medication. This study was registered with the ISRCTN registry, ISRCTN30564012. Findings Between June 5, 2020, and Jan 25, 2021, 406 patients were randomly assigned to brensocatib or placebo: 192 (47·3%) to the brensocatib group and 214 (52·7%) to the placebo group. Two participants in the brensocatib group were excluded after being randomly assigned (leaving 190 patients in the brensocatib group and 214 in the placebo group in the intention-to-treat population). Primary outcome data were unavailable for six patients (three in the brensocatib group and three in the placebo group). Patients in the brensocatib group had worse clinical status at day 29 after random assignment than those in the placebo group (adjusted odds ratio 0·72 [95% CI 0·57–0·92]). Prespecified subgroup analyses of the primary outcome supported the primary results. 185 participants reported at least one adverse event: 99 (46%) in the placebo group and 86 (45%) in the brensocatib group. The most common adverse events were gastrointestinal disorders and infections. One death in the placebo group was judged as possibly related to study drug. Interpretation Brensocatib treatment did not improve clinical status at day 29 in patients hospitalised with COVID-19.
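    To make the randomisation scheme described above concrete (1:1 allocation, stratified by site and age group, permuted blocks of random size two, four, or six), here is a minimal generic sketch in Python. It is an illustration of stratified permuted-block randomisation in general; the class and the site names are invented for the example, and this is not the trial's TruST system.

```python
import random
from collections import defaultdict

# Generic stratified permuted-block randomisation (1:1): each (site, age
# group) stratum draws balanced, shuffled blocks whose size is chosen at
# random from {2, 4, 6}. Illustrative sketch only.

ARMS = ["brensocatib", "placebo"]
BLOCK_SIZES = [2, 4, 6]

class StratifiedBlockRandomiser:
    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        self.pending = defaultdict(list)  # stratum -> unused assignments

    def assign(self, site: str, age: int) -> str:
        stratum = (site, "<65" if age < 65 else ">=65")
        if not self.pending[stratum]:
            # Start a new block: equal numbers of each arm, then shuffle,
            # so allocation stays balanced within every completed block.
            size = self.rng.choice(BLOCK_SIZES)
            block = ARMS * (size // 2)
            self.rng.shuffle(block)
            self.pending[stratum] = block
        return self.pending[stratum].pop()

randomiser = StratifiedBlockRandomiser(seed=1)
print(randomiser.assign("site_01", 58))  # draws from the (site_01, <65) block
print(randomiser.assign("site_01", 72))  # new stratum, so a new block starts
```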

    Incorporating Climate Uncertainty into Conservation Planning for Wildlife Managers

    The U.S. Fish and Wildlife Service (USFWS) is one of the oldest conservation organizations in the United States and is the only federal agency solely charged with conserving fish, wildlife, plants and their habitats. The agency leads numerous conservation initiatives, such as protecting and recovering endangered species, managing almost 600 wildlife refuges throughout all states and territories, enforcing federal wildlife laws, and regulating international wildlife trade. In the past, these activities have not accounted for climate change. The accelerating biodiversity crisis, in combination with climate uncertainty, adds to the existing complexity associated with responding to multiple anthropogenic stressors. Here we describe current practice and thinking related to climate uncertainty and management of USFWS resources. We focus on three agency domains which represent various conservation planning responsibilities: evaluating species to be listed as threatened or endangered, Habitat Conservation Plans for listed species, and land management techniques on wildlife refuges. Integrating climate considerations into agency planning documents is complex, and we highlight effective current applications and suggest future improvements. Additionally, we identify outstanding research needs or management applications, and updates to existing policy that will aid in developing improved conservation strategies. Our synthesis contributes to ongoing efforts to incorporate climate uncertainty into conservation planning, natural resource management, and related policy revisions.

    Reply to Muzzioli et al.: Communicating nutrition and environmental information to food system stakeholders

    This article replies to the letter of April 17, 2023, "How to communicate the healthiness and sustainability of foods to consumers?" by Luca Muzzioli, Eleonora Poggiogalle [...] Alessandro Pinto.

    Epigenetic age acceleration as a biomarker for impaired cognitive abilities in adulthood following early life adversity and psychiatric disorders

    Background: Early life adversity and psychiatric disorders are associated with earlier declines in neurocognitive abilities during adulthood. These declines may be preceded by changes in biological aging, specifically epigenetic age acceleration, providing an opportunity to uncover genome-wide biomarkers that identify individuals most likely to benefit from early screening and prevention. Methods: Five unique epigenetic age acceleration clocks derived from peripheral blood were examined in relation to latent variables of general and speeded cognitive abilities across two independent cohorts: 1) the Female Growth and Development Study (FGDS; n = 86), a 30-year prospective cohort study of substantiated child sexual abuse and non-abused controls, and 2) the Biological Classification of Mental Disorders study (BeCOME; n = 313), an adult community cohort established based on psychiatric disorders. Results: A faster pace of biological aging (DunedinPoAm) was associated with lower general cognitive abilities in both cohorts and slower speeded abilities in the BeCOME cohort. Acceleration in the Horvath clock was significantly associated with slower speeded abilities in the BeCOME cohort but not the FGDS. Acceleration in the Hannum clock and the GrimAge clock were not significantly associated with either cognitive ability. Accelerated PhenoAge was associated with slower speeded abilities in the FGDS but not the BeCOME cohort. Conclusions: The present results suggest that epigenetic age acceleration has the potential to serve as a biomarker for neurocognitive decline in adults with a history of early life adversity or psychiatric disorders. Estimates of epigenetic aging may identify adults at risk of cognitive decline that could benefit from early neurocognitive screening.