23 research outputs found

    Achieving biodiversity net gain by addressing governance gaps underpinning ecological compensation policies

    Biodiversity compensation policies have emerged around the world to address the ecological harms of infrastructure expansion, but compliance has historically been weak. The Westminster government is introducing a requirement that new infrastructure developments in England demonstrate they achieve a biodiversity net gain (BNG). We sought to determine the magnitude of the effects of governance gaps and regulator capacity constraints on the policy's potential biodiversity impacts. We collated BNG information from all new major developments across six early‐adopter councils from 2020 to 2022. We quantified the proportion of the biodiversity outcomes promised under BNG at risk of noncompliance, explored the variation in strategies used to meet developers’ biodiversity liabilities, and quantified the occurrence of simple errors in the biodiversity metric calculations. For large developments and energy infrastructure, biodiversity liabilities were frequently met within the projects’ development footprint. For small developments, the purchase of offsets was most common. We estimated that 27% of all biodiversity units fell into governance gaps that exposed them to a high risk of noncompliance because they were associated with better‐condition habitats delivered on‐site that were unlikely to be monitored or enforced. More robust governance mechanisms (e.g., practical mechanisms for monitoring and enforcement) would help ensure the delivery of this biodiversity on‐site. Alternatively, more biodiversity gains could be delivered through off‐site biodiversity offsetting. In the latter case, we estimated that the demand for offsets could rise by a factor of 4; this would substantially increase the financial contributions from developers for conservation activities on private land. Twenty‐one percent of development applications contained a simple recurring error in their BNG calculations. One‐half of these applications were approved by councils, which may indicate under‐resourcing in council development assessments. Our findings demonstrate that resourcing and governance shortfalls risk undermining the policy's effectiveness.
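
    To illustrate the kind of arithmetic the biodiversity metric involves, here is a simplified sketch of a habitat-unit calculation. It omits the statutory metric's temporal and spatial risk multipliers, and the example scores and areas are hypothetical, not drawn from the study:

```python
def biodiversity_units(area_ha, distinctiveness, condition, strategic_significance=1.0):
    """Simplified habitat-unit calculation: area multiplied by scored
    habitat attributes. The full metric also applies risk multipliers
    for habitat creation and enhancement, omitted here."""
    return area_ha * distinctiveness * condition * strategic_significance

def meets_net_gain(baseline_units, post_development_units, gain_fraction=0.10):
    """BNG requires post-development units to exceed the baseline by a
    set percentage (10% under the English policy)."""
    return post_development_units >= baseline_units * (1 + gain_fraction)

# Hypothetical example: 2 ha of habitat with distinctiveness score 4 in
# condition 2 gives 16 baseline units, so at least 17.6 units are needed.
baseline = biodiversity_units(2.0, 4, 2)
print(baseline)                         # 16.0
print(meets_net_gain(baseline, 17.6))   # True
```

    Errors of the kind the study counts arise when, for example, a wrong condition score or area is carried through this multiplication.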

    The Subaru/XMM-Newton Deep Survey (SXDS) - V. Optically Faint Variable Object Survey

    We present our survey for optically faint variable objects using multi-epoch (8-10 epochs over 2-4 years) i′-band imaging data obtained with Subaru Suprime-Cam over 0.918 deg² in the Subaru/XMM-Newton Deep Field (SXDF). We found 1040 optically variable objects by image subtraction for all the combinations of images at different epochs. This is the first statistical sample of variable objects at depths achieved with 8-10 m class telescopes or HST. The detection limit for variable components is i′_vari ~ 25.5 mag. These variable objects were classified into variable stars, supernovae (SNe), and active galactic nuclei (AGN), based on the optical morphologies, magnitudes, colors, and optical-mid-infrared colors of the host objects, spatial offsets of variable components from the host objects, and light curves. Detection completeness was examined by simulating light curves for periodic and irregular variability. We detected optical variability for 36 ± 2% (51 ± 3% for a bright sample with i′ < 24.4 mag) of X-ray sources in the field. Number densities of variable objects as functions of time intervals Δt and variable component magnitudes i′_vari are obtained. Number densities of variable stars, SNe, and AGN are 120, 489, and 579 objects deg⁻², respectively. Bimodal distributions of variable stars in the color-magnitude diagrams indicate that the variable star sample consists of bright (V ~ 22 mag) blue variable stars of the halo population and faint (V ~ 23.5 mag) red variable stars of the disk population. There are a few candidates of RR Lyrae providing a possible number density of ~10⁻² kpc⁻³ at a distance of >150 kpc from the Galactic center. Comment: 18 pages, 17 figures, accepted for publication in Ap
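
    The image-subtraction step can be sketched with a toy difference-imaging routine. This is a minimal illustration assuming already-aligned, flux-matched epoch images; real pipelines such as the one used here also match the point-spread functions of the two epochs before subtracting:

```python
import numpy as np

def find_variables(epoch1, epoch2, nsigma=5.0):
    """Subtract two aligned epoch images and flag pixels whose residual
    exceeds nsigma times a robust noise estimate of the difference image."""
    diff = epoch2 - epoch1
    # Robust noise estimate from the median absolute deviation (MAD).
    med = np.median(diff)
    sigma = 1.4826 * np.median(np.abs(diff - med))
    return np.argwhere(np.abs(diff - med) > nsigma * sigma)

rng = np.random.default_rng(0)
img1 = rng.normal(100.0, 1.0, (64, 64))
img2 = img1 + rng.normal(0.0, 0.1, (64, 64))
img2[32, 32] += 50.0  # inject a transient at pixel (32, 32)
print(find_variables(img1, img2))  # flags the injected transient
```

    Running the detection over all pairs of epochs, as the survey does, recovers variability on every sampled baseline Δt.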

    Trends, patterns and psychological influences on COVID-19 vaccination intention: Findings from a large prospective community cohort study in England and Wales (Virus Watch).

    BACKGROUND: Vaccination intention is key to the success of any vaccination programme, alongside vaccine availability and access. Public intention to take a COVID-19 vaccine is high in England and Wales compared to other countries, but vaccination rate disparities between ethnic, social and age groups have led to concern. METHODS: Online survey of prospective household community cohort study participants across England and Wales (Virus Watch). Vaccination intention was measured by individual participant responses to 'Would you accept a COVID-19 vaccine if offered?', collected in December 2020 and February 2021. Responses to a 13-item questionnaire collected in January 2021 were analysed using factor analysis to investigate psychological influences on vaccination intention. RESULTS: Survey response rate was 56% (20,785/36,998) in December 2020 and 53% (20,590/38,727) in February 2021, with 14,880 adults reporting across both time points. In December 2020, 1,469 (10%) participants responded 'No' or 'Unsure'. Of these people, 1,266 (86%) changed their mind and responded 'Yes' or 'Already had a COVID-19 vaccine' by February 2021. Vaccination intention increased across all ethnic groups and levels of social deprivation. Age was most strongly associated with vaccination intention, with 16-24-year-olds more likely to respond "Unsure" or "No" versus "Yes" than 65-74-year-olds in December 2020 (OR: 4.63, 95% CI: 3.42, 6.27 and OR: 7.17, 95% CI: 4.26, 12.07, respectively) and February 2021 (OR: 27.92, 95% CI: 13.79, 56.51 and OR: 17.16, 95% CI: 4.12, 71.55). The association between ethnicity and vaccination intention weakened, but did not disappear, over time. Both vaccine- and illness-related psychological factors were shown to influence vaccination intention. CONCLUSIONS: More than four in five adults (86%) who were reluctant or intending to refuse a COVID-19 vaccine in December 2020 had changed their mind by February 2021 and planned to accept, or had already accepted, a vaccine.
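
    Odds ratios of the kind reported here come from 2×2 contingency tables of group versus response. A minimal sketch with a Wald 95% confidence interval, using made-up counts rather than the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a/b = outcome yes/no in group 1, c/d = outcome yes/no in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 40/160 younger vs 10/190 older respondents unsure.
or_, lo, hi = odds_ratio(40, 160, 10, 190)
print(round(or_, 2))  # 4.75
```

    The study's ORs are adjusted estimates from regression models, so they would not be reproduced by this raw-table calculation, but the interpretation of the interval is the same.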

    Climate change drives migratory range shift via individual plasticity in shearwaters.

    How individual animals respond to climate change is key to whether populations will persist or go extinct. Yet, few studies investigate how changes in individual behavior underpin these population-level phenomena. Shifts in the distributions of migratory animals can occur through adaptation in migratory behaviors, but there is little understanding of how selection and plasticity contribute to population range shift. Here, we use long-term geolocator tracking of Balearic shearwaters (Puffinus mauretanicus) to investigate how year-to-year changes in individual birds' migrations underpin a range shift in the post-breeding migration. We demonstrate a northward shift in the post-breeding range and show that this is brought about by individual plasticity in migratory destination, with individuals migrating further north in response to changes in sea-surface temperature. Furthermore, we find that when individuals migrate further, they return faster, perhaps minimizing delays in return to the breeding area. Birds apparently judge the increased distance that they will need to migrate via memory of the migration route, suggesting that spatial cognitive mechanisms may contribute to this plasticity and the resulting range shift. Our study exemplifies the role that individual behavior plays in populations' responses to environmental change and highlights some of the behavioral mechanisms that might be key to understanding and predicting species persistence in response to climate change

    Fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin with gemtuzumab ozogamicin improves event-free survival in younger patients with newly diagnosed AML and overall survival in patients with NPM1 and FLT3 mutations

    Purpose To determine the optimal induction chemotherapy regimen for younger adults with newly diagnosed AML without known adverse risk cytogenetics. Patients and Methods One thousand thirty-three patients were randomly assigned to intensified (fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin [FLAG-Ida]) or standard (daunorubicin and Ara-C [DA]) induction chemotherapy, with one or two doses of gemtuzumab ozogamicin (GO). The primary end point was overall survival (OS). Results There was no difference in remission rate after two courses between FLAG-Ida + GO and DA + GO (complete remission [CR] + CR with incomplete hematologic recovery 93% v 91%) or in day 60 mortality (4.3% v 4.6%). There was no difference in OS (66% v 63%; P = .41); however, the risk of relapse was lower with FLAG-Ida + GO (24% v 41%; P < .001) and 3-year event-free survival was higher (57% v 45%; P < .001). In patients with an NPM1 mutation (30%), 3-year OS was significantly higher with FLAG-Ida + GO (82% v 64%; P = .005). NPM1 measurable residual disease (MRD) clearance was also greater, with 88% versus 77% becoming MRD-negative in peripheral blood after cycle 2 (P = .02). Three-year OS was also higher in patients with a FLT3 mutation (64% v 54%; P = .047). Fewer transplants were performed in patients receiving FLAG-Ida + GO (238 v 278; P = .02). There was no difference in outcome according to the number of GO doses, although NPM1 MRD clearance was higher with two doses in the DA arm. Patients with core binding factor AML treated with DA and one dose of GO had a 3-year OS of 96% with no survival benefit from FLAG-Ida + GO. Conclusion Overall, FLAG-Ida + GO significantly reduced relapse without improving OS. However, exploratory analyses show that patients with NPM1 and FLT3 mutations had substantial improvements in OS. By contrast, in patients with core binding factor AML, outcomes were excellent with DA + GO with no FLAG-Ida benefit
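
    Survival percentages like the 3-year OS and event-free survival figures above are conventionally Kaplan-Meier estimates. As an illustration of how the estimator works (a minimal pure-Python sketch with toy data, not the trial's analysis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time per patient; events: 1 = event, 0 = censored.
    Returns (event_time, S(t)) pairs at each observed event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = ties = 0
        # Group all patients sharing this follow-up time.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            ties += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= ties
    return curve

# Toy data: events at t=1 and t=3, one patient censored at t=2.
print(kaplan_meier([1, 2, 3], [1, 0, 1]))  # -> [(1, 0.666...), (3, 0.0)]
```

    Censored patients (for example, those still alive at last follow-up) leave the risk set without forcing the curve down, which is what lets the trial report 3-year rates before every patient has three years of follow-up.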

    PANC Study (Pancreatitis: A National Cohort Study): national cohort study examining the first 30 days from presentation of acute pancreatitis in the UK

    Background Acute pancreatitis is a common, yet complex, emergency surgical presentation. Multiple guidelines exist and management can vary significantly. The aim of this first UK, multicentre, prospective cohort study was to assess the variation in management of acute pancreatitis to guide resource planning and optimize treatment. Methods All patients aged greater than or equal to 18 years presenting with acute pancreatitis, as per the Atlanta criteria, from March to April 2021 were eligible for inclusion and followed up for 30 days. Anonymized data were uploaded to a secure electronic database in line with local governance approvals. Results A total of 113 hospitals contributed data on 2580 patients, with an equal sex distribution and a mean age of 57 years. The aetiology was gallstones in 50.6 per cent, with idiopathic the next most common (22.4 per cent). In addition to the 7.6 per cent with a diagnosis of chronic pancreatitis, 20.1 per cent of patients had a previous episode of acute pancreatitis. One in 20 patients were classed as having severe pancreatitis, as per the Atlanta criteria. The overall mortality rate was 2.3 per cent at 30 days, but rose to one in three in the severe group. Predictors of death included male sex, increased age, and frailty; previous acute pancreatitis and gallstones as aetiologies were protective. Smoking status and body mass index did not affect death. Conclusion Most patients presenting with acute pancreatitis have a mild, self-limiting disease. Rates of patients with idiopathic pancreatitis are high. Recurrent attacks of pancreatitis are common, but are likely to have reduced risk of death on subsequent admissions.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
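
    The reported posterior probabilities of harm follow directly from the posterior distribution of the treatment odds ratio: given draws from that posterior, the probability is just the fraction of draws with OR < 1. A schematic sketch using a made-up normal posterior for log(OR), tuned to roughly match the reported OR of 0.77 (95% credible interval 0.58-1.06), not the trial's actual model:

```python
import numpy as np

# Hypothetical normal posterior for log(OR); the location and scale are
# assumptions chosen to approximate the reported interval.
rng = np.random.default_rng(42)
log_or_draws = rng.normal(loc=np.log(0.77), scale=0.155, size=100_000)

# Probability the treatment worsened outcomes, i.e. OR < 1.
p_harm = np.mean(log_or_draws < 0.0)
print(round(p_harm, 2))
```

    Under these assumed values the probability comes out near 95%, in line with the figures quoted for organ support–free days.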

    Improving the ecological outcomes of compensatory conservation by addressing governance gaps: a case study of Biodiversity Net Gain in England

    No full text
    Biodiversity compensation policies have emerged around the world to address the ecological harms of infrastructure expansion, but they have historically experienced weak compliance. The English government is introducing a requirement that all new infrastructure developments demonstrate they achieve a Biodiversity Net Gain (BNG). Previous research has highlighted governance gaps that risk undermining the policy’s ecological outcomes and has explored the risks posed by fundamental capacity constraints in regulators, but the magnitude of their effects on the policy’s potential impacts on biodiversity remains unexplored. We collated BNG information from all new major developments across six early adopter councils from 2020 to 2022. We quantified the proportion of the biodiversity outcomes promised under BNG which are at risk of non-compliance, explored the variation in strategies that developments use to meet their biodiversity liabilities, and quantified the occurrence of simple errors in the biodiversity metric calculations submitted by project proponents. Large developments and energy infrastructure were more likely to meet their liability within their own development footprint, and small developments were more likely to purchase offsets. We estimate that 27% of all biodiversity units fall within governance gaps that expose them to a high risk of non-compliance. Should these units instead be delivered through the off-site biodiversity offsetting market – which is associated with relatively more robust governance mechanisms – we estimate that the demand for offsets could rise by a factor of four, increasing the financial contributions generated by BNG for conservation activities on private land. Lastly, we find that 21% of applications contained conspicuous errors in their BNG calculations, half of which have already been accepted by councils, hinting at under-resourcing in councils assessing developments. Our findings demonstrate that resourcing and governance shortfalls risk undermining the policy’s effectiveness at halting biodiversity loss and must be addressed to ensure the policy benefits nature.
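
    The factor-of-four rise in offset demand is a simple ratio: (existing off-site demand + rerouted at-risk units) divided by existing off-site demand. A back-of-envelope sketch; the unit shares below are hypothetical, chosen only so the ratio comes out at four, since the abstract does not report the underlying shares:

```python
def offset_demand_multiplier(current_offsite_share, rerouted_share):
    """Factor by which off-site offset demand grows if at-risk on-site
    units are instead delivered through the off-site market."""
    return (current_offsite_share + rerouted_share) / current_offsite_share

# Hypothetical shares of all biodiversity units: 9% currently delivered
# off-site, plus the 27% estimated to sit in on-site governance gaps.
print(offset_demand_multiplier(0.09, 0.27))  # -> 4.0
```

    The same arithmetic explains why shifting even a modest at-risk fraction off-site can multiply demand severalfold when the existing off-site market is small.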