23 research outputs found
Achieving biodiversity net gain by addressing governance gaps underpinning ecological compensation policies
Biodiversity compensation policies have emerged around the world to address the ecological harms of infrastructure expansion, but compliance has historically been weak. The Westminster government is introducing a requirement that new infrastructure developments in England demonstrate they achieve a biodiversity net gain (BNG). We sought to determine the magnitude of the effects of governance gaps and regulator capacity constraints on the policy's potential biodiversity impacts. We collated BNG information from all new major developments across six early-adopter councils from 2020 to 2022. We quantified the proportion of the biodiversity outcomes promised under BNG at risk of noncompliance, explored the variation in strategies used to meet developers' biodiversity liabilities, and quantified the occurrence of simple errors in the biodiversity metric calculations. For large developments and energy infrastructure, biodiversity liabilities were frequently met within the projects' development footprint. For small developments, the purchase of offsets was most common. We estimated that 27% of all biodiversity units fell into governance gaps that exposed them to a high risk of noncompliance because they were associated with better-condition habitats delivered on-site that were unlikely to be monitored or enforced. More robust governance mechanisms (e.g., practical mechanisms for monitoring and enforcement) would help ensure the delivery of this biodiversity on-site. Alternatively, more biodiversity gains could be delivered through off-site biodiversity offsetting. For the latter case, we estimated that the demand for offsets could rise by a factor of 4; this would substantially increase the financial contributions from developers for conservation activities on private land. Twenty-one percent of development applications contained a simple recurring error in their BNG calculations. One-half of these applications were approved by councils, which may indicate under-resourcing in council development assessments. Our findings demonstrate that resourcing and governance shortfalls risk undermining the policy's effectiveness.
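As a rough illustration of the offset-demand arithmetic in this abstract, the sketch below shows how moving the 27% of at-risk on-site units into the off-site market could quadruple offset demand. The 9% baseline off-site share is a hypothetical figure chosen only so the numbers reproduce the reported factor of four; it is not a value taken from the study.

```python
# Hypothetical illustration: how shifting at-risk on-site biodiversity units
# into the off-site offset market scales offset demand.
# The 27% at-risk share comes from the abstract; the 9% baseline off-site
# share is an assumed value, chosen so the result matches the reported ~4x.

total_units = 100.0                       # normalise all biodiversity units to 100
at_risk_onsite = 0.27 * total_units       # units in governance gaps (27%)
baseline_offsite = 0.09 * total_units     # assumed current off-site demand

new_offsite_demand = baseline_offsite + at_risk_onsite
print(new_offsite_demand / baseline_offsite)  # -> 4.0, i.e. a fourfold rise
```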
The Subaru/XMM-Newton Deep Survey (SXDS) - V. Optically Faint Variable Object Survey
We present our survey for optically faint variable objects using multi-epoch (8-10 epochs over 2-4 years) -band imaging data obtained with Subaru Suprime-Cam over 0.918 deg² in the Subaru/XMM-Newton Deep Field (SXDF). We found 1040 optically variable objects by image subtraction for all the combinations of images at different epochs. This is the first statistical sample of variable objects at depths achieved with 8-10 m class telescopes or HST. The detection limit for variable components is mag. These variable objects were classified into variable stars, supernovae (SNe), and active galactic nuclei (AGN) based on the optical morphologies, magnitudes, colors, and optical-mid-infrared colors of the host objects, spatial offsets of the variable components from the host objects, and light curves. Detection completeness was examined by simulating light curves for periodic and irregular variability. We detected optical variability for ( for a bright sample with mag) of the X-ray sources in the field. Number densities of variable objects as functions of time interval and variable component magnitude are obtained. Number densities of variable stars, SNe, and AGN are 120, 489, and 579 objects deg⁻², respectively. Bimodal distributions of variable stars in the color-magnitude diagrams indicate that the variable star sample consists of bright ( mag) blue variable stars of the halo population and faint ( mag) red variable stars of the disk population. There are a few candidates of RR Lyrae, providing a possible number density of kpc⁻³ at a distance of kpc from the Galactic center. Comment: 18 pages, 17 figures, accepted for publication in Ap
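The survey detects variables by differencing images from every pair of epochs. A minimal sketch of that pair-wise bookkeeping (not the authors' pipeline; real difference imaging also needs astrometric registration and PSF matching) is:

```python
# Minimal sketch of pairwise image differencing over multiple epochs.
# Only the epoch-pair bookkeeping described in the abstract is shown;
# registration and PSF matching are assumed to have been done already.
from itertools import combinations
import numpy as np

def difference_images(epoch_images, threshold=5.0):
    """Return candidate variability detections for every epoch pair.

    epoch_images : list of 2-D float arrays, one per epoch, on a common grid.
    threshold    : detection threshold in units of the difference image's
                   robust scatter.
    """
    candidates = []
    for (i, img_a), (j, img_b) in combinations(enumerate(epoch_images), 2):
        diff = img_b - img_a
        sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # robust std
        ys, xs = np.where(np.abs(diff) > threshold * sigma)
        candidates.append({"epochs": (i, j), "pixels": list(zip(ys, xs))})
    return candidates
```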
Trends, patterns and psychological influences on COVID-19 vaccination intention: Findings from a large prospective community cohort study in England and Wales (Virus Watch).
BACKGROUND: Vaccination intention is key to the success of any vaccination programme, alongside vaccine availability and access. Public intention to take a COVID-19 vaccine is high in England and Wales compared to other countries, but vaccination rate disparities between ethnic, social and age groups have led to concern. METHODS: Online survey of prospective household community cohort study participants across England and Wales (Virus Watch). Vaccination intention was measured by individual participant responses to 'Would you accept a COVID-19 vaccine if offered?', collected in December 2020 and February 2021. Responses to a 13-item questionnaire collected in January 2021 were analysed using factor analysis to investigate psychological influences on vaccination intention. RESULTS: Survey response rate was 56% (20,785/36,998) in December 2020 and 53% (20,590/38,727) in February 2021, with 14,880 adults reporting across both time points. In December 2020, 1,469 (10%) participants responded 'No' or 'Unsure'. Of these people, 1,266 (86%) changed their mind and responded 'Yes' or 'Already had a COVID-19 vaccine' by February 2021. Vaccination intention increased across all ethnic groups and levels of social deprivation. Age was most strongly associated with vaccination intention, with 16-24-year-olds more likely to respond "Unsure" or "No" versus "Yes" than 65-74-year-olds in December 2020 (OR 4.63, 95% CI 3.42-6.27 and OR 7.17, 95% CI 4.26-12.07, respectively) and February 2021 (OR 27.92, 95% CI 13.79-56.51 and OR 17.16, 95% CI 4.12-71.55). The association between ethnicity and vaccination intention weakened, but did not disappear, over time. Both vaccine- and illness-related psychological factors were shown to influence vaccination intention. CONCLUSIONS: Four in five adults (86%) who were reluctant or intending to refuse a COVID-19 vaccine in December 2020 had changed their mind by February 2021 and planned to accept, or had already accepted, a vaccine.
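The psychological-influence analysis above is a factor analysis of a 13-item questionnaire. A minimal sketch of that kind of analysis, using scikit-learn on simulated Likert-style responses (the study's actual items, factor count, and software are not specified here), could look like:

```python
# Illustrative factor analysis of a 13-item Likert-style questionnaire.
# Responses are simulated; the two-factor choice mirrors the vaccine- vs
# illness-related factors mentioned in the abstract but is an assumption here.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 1000, 13
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

fa = FactorAnalysis(n_components=2)
scores = fa.fit_transform(responses)   # per-respondent factor scores
loadings = fa.components_              # (2, 13) item loadings on each factor
print(loadings.round(2))
```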
Climate change drives migratory range shift via individual plasticity in shearwaters.
How individual animals respond to climate change is key to whether populations will persist or go extinct. Yet, few studies investigate how changes in individual behavior underpin these population-level phenomena. Shifts in the distributions of migratory animals can occur through adaptation in migratory behaviors, but there is little understanding of how selection and plasticity contribute to population range shift. Here, we use long-term geolocator tracking of Balearic shearwaters (Puffinus mauretanicus) to investigate how year-to-year changes in individual birds' migrations underpin a range shift in the post-breeding migration. We demonstrate a northward shift in the post-breeding range and show that this is brought about by individual plasticity in migratory destination, with individuals migrating further north in response to changes in sea-surface temperature. Furthermore, we find that when individuals migrate further, they return faster, perhaps minimizing delays in return to the breeding area. Birds apparently judge the increased distance that they will need to migrate via memory of the migration route, suggesting that spatial cognitive mechanisms may contribute to this plasticity and the resulting range shift. Our study exemplifies the role that individual behavior plays in populations' responses to environmental change and highlights some of the behavioral mechanisms that might be key to understanding and predicting species persistence in response to climate change
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions.
Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
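The trial's Bayesian analysis reports a median adjusted odds ratio and a posterior probability of superiority, i.e. Pr(OR > 1). Given posterior draws of the log odds ratio from a fitted cumulative logistic model, those summaries reduce to simple arithmetic; the sketch below uses simulated draws rather than the trial's actual posterior.

```python
# Illustrative summary of a posterior for a treatment odds ratio.
# The draws here are simulated stand-ins, not the trial's posterior samples.
import numpy as np

rng = np.random.default_rng(1)
log_or_draws = rng.normal(loc=np.log(1.4), scale=0.23, size=10_000)
or_draws = np.exp(log_or_draws)

median_or = np.median(or_draws)
cri = np.percentile(or_draws, [2.5, 97.5])   # 95% credible interval
p_superiority = np.mean(or_draws > 1.0)      # Pr(OR > 1)
print(f"OR {median_or:.2f} (95% CrI {cri[0]:.2f}-{cri[1]:.2f}), "
      f"P(superiority) = {p_superiority:.0%}")
```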
Fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin with gemtuzumab ozogamicin improves event-free survival in younger patients with newly diagnosed AML and overall survival in patients with NPM1 and FLT3 mutations
Purpose
To determine the optimal induction chemotherapy regimen for younger adults with newly diagnosed AML without known adverse risk cytogenetics.
Patients and Methods
One thousand thirty-three patients were randomly assigned to intensified (fludarabine, cytarabine, granulocyte colony-stimulating factor, and idarubicin [FLAG-Ida]) or standard (daunorubicin and Ara-C [DA]) induction chemotherapy, with one or two doses of gemtuzumab ozogamicin (GO). The primary end point was overall survival (OS).
Results
There was no difference in remission rate after two courses between FLAG-Ida + GO and DA + GO (complete remission [CR] + CR with incomplete hematologic recovery 93% v 91%) or in day 60 mortality (4.3% v 4.6%). There was no difference in OS (66% v 63%; P = .41); however, the risk of relapse was lower with FLAG-Ida + GO (24% v 41%; P < .001) and 3-year event-free survival was higher (57% v 45%; P < .001). In patients with an NPM1 mutation (30%), 3-year OS was significantly higher with FLAG-Ida + GO (82% v 64%; P = .005). NPM1 measurable residual disease (MRD) clearance was also greater, with 88% versus 77% becoming MRD-negative in peripheral blood after cycle 2 (P = .02). Three-year OS was also higher in patients with a FLT3 mutation (64% v 54%; P = .047). Fewer transplants were performed in patients receiving FLAG-Ida + GO (238 v 278; P = .02). There was no difference in outcome according to the number of GO doses, although NPM1 MRD clearance was higher with two doses in the DA arm. Patients with core binding factor AML treated with DA and one dose of GO had a 3-year OS of 96% with no survival benefit from FLAG-Ida + GO.
Conclusion
Overall, FLAG-Ida + GO significantly reduced relapse without improving OS. However, exploratory analyses show that patients with NPM1 and FLT3 mutations had substantial improvements in OS. By contrast, in patients with core binding factor AML, outcomes were excellent with DA + GO, with no additional benefit from FLAG-Ida.
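Outcome comparisons such as the 3-year OS and event-free survival figures above are conventionally made with Kaplan-Meier estimates and log-rank tests. A generic sketch of such a two-arm comparison, using the lifelines package on simulated survival times rather than the trial's data:

```python
# Generic two-arm survival comparison with Kaplan-Meier curves and a
# log-rank test; times and events are simulated, not trial data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_a = rng.exponential(scale=60, size=500)   # arm A event/censoring times (months)
t_b = rng.exponential(scale=45, size=500)   # arm B
e_a = rng.random(500) < 0.6                 # True = event observed, False = censored
e_b = rng.random(500) < 0.6

km_a = KaplanMeierFitter().fit(t_a, event_observed=e_a, label="arm A")
km_b = KaplanMeierFitter().fit(t_b, event_observed=e_b, label="arm B")
print(km_a.predict(36), km_b.predict(36))   # survival probabilities at 36 months

result = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print(result.p_value)
```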
PANC Study (Pancreatitis: A National Cohort Study): national cohort study examining the first 30 days from presentation of acute pancreatitis in the UK
Abstract
Background
Acute pancreatitis is a common, yet complex, emergency surgical presentation. Multiple guidelines exist and management can vary significantly. The aim of this first UK, multicentre, prospective cohort study was to assess the variation in management of acute pancreatitis to guide resource planning and optimize treatment.
Methods
All patients aged greater than or equal to 18 years presenting with acute pancreatitis, as per the Atlanta criteria, from March to April 2021 were eligible for inclusion and followed up for 30 days. Anonymized data were uploaded to a secure electronic database in line with local governance approvals.
Results
A total of 113 hospitals contributed data on 2580 patients, with an equal sex distribution and a mean age of 57 years. The aetiology was gallstones in 50.6 per cent, with idiopathic the next most common (22.4 per cent). In addition to the 7.6 per cent with a diagnosis of chronic pancreatitis, 20.1 per cent of patients had a previous episode of acute pancreatitis. One in 20 patients was classed as having severe pancreatitis, as per the Atlanta criteria. The overall mortality rate was 2.3 per cent at 30 days, but rose to one in three in the severe group. Predictors of death included male sex, increased age, and frailty; a previous episode of acute pancreatitis and a gallstone aetiology were protective. Smoking status and body mass index were not associated with mortality.
Conclusion
Most patients presenting with acute pancreatitis have mild, self-limiting disease. The proportion of cases classed as idiopathic is high. Recurrent attacks of pancreatitis are common, but the risk of death appears to be lower on subsequent admissions.
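The mortality predictors reported above are the kind of result a multivariable logistic regression produces. A hedged sketch of such a model, fitted to simulated data with assumed variable coding (not the PANC dataset):

```python
# Illustrative logistic regression for 30-day mortality on predictors like
# those reported (sex, age, frailty, aetiology); data are simulated and the
# coding of variables is assumed, not taken from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2580
df = pd.DataFrame({
    "died_30d": (rng.random(n) < 0.023).astype(int),
    "male": rng.integers(0, 2, n),
    "age": rng.normal(57, 18, n),
    "frail": rng.integers(0, 2, n),
    "gallstones": (rng.random(n) < 0.506).astype(int),
})
model = smf.logit("died_30d ~ male + age + frail + gallstones", data=df).fit()
print(np.exp(model.params))  # odds ratios for each predictor
```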
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non-critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support-free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support-free days among critically ill patients was 10 (-1 to 16) in the ACE inhibitor group (n = 231), 8 (-1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support-free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT02735707
Improving the ecological outcomes of compensatory conservation by addressing governance gaps: a case study of Biodiversity Net Gain in England
Biodiversity compensation policies have emerged around the world to address the ecological harms of infrastructure expansion, but they have historically experienced weak compliance. The English government is introducing a requirement that all new infrastructure developments demonstrate they achieve a Biodiversity Net Gain (BNG). Previous research has highlighted governance gaps that risk undermining the policy's ecological outcomes, as well as the risks caused by fundamental capacity constraints in regulators, but the magnitude of their effects on the policy's potential impacts on biodiversity remains unexplored. We collated BNG information from all new major developments across six early-adopter councils from 2020 to 2022. We quantified the proportion of the biodiversity outcomes promised under BNG which are at risk of non-compliance, explored the variation in strategies that developments use to meet their biodiversity liabilities, and quantified the occurrence of simple errors in the biodiversity metric calculations submitted by project proponents. Large developments and energy infrastructure were more likely to meet their liability within their own development footprint, and small developments more likely to purchase offsets. We estimate that 27% of all biodiversity units fall within governance gaps that expose them to a high risk of non-compliance. Should these units instead be delivered through the off-site biodiversity offsetting market, which is associated with relatively more robust governance mechanisms, we estimate that the demand for offsets could rise by a factor of four, increasing the financial contributions generated by BNG for conservation activities on private land. Lastly, we find that 21% of applications contained conspicuous errors in their BNG calculations, half of which have already been accepted by councils, hinting at under-resourcing in councils assessing developments. Our findings demonstrate that resourcing and governance shortfalls risk undermining the policy's effectiveness at halting biodiversity loss and require addressing to ensure the policy benefits nature.