Mammal responses to global changes in human activity vary by trophic group and landscape
Wildlife must adapt to human presence to survive in the Anthropocene, so it is critical to understand species responses to humans in different contexts. We used camera trapping as a lens to view mammal responses to changes in human activity during the COVID-19 pandemic. Across 163 species sampled in 102 projects around the world, changes in the amount and timing of animal activity varied widely. Under higher human activity, mammals were less active in undeveloped areas but unexpectedly more active in developed areas while exhibiting greater nocturnality. Carnivores were most sensitive, showing the strongest decreases in activity and greatest increases in nocturnality. Wildlife managers must consider how habituation and uneven sensitivity across species may cause fundamental differences in human–wildlife interactions along gradients of human influence.
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%).
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
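The organ support-free days end point described above can be sketched as a simple scoring rule. This is a minimal illustration of the scale, not the trial's analysis code; the function name and inputs are hypothetical:

```python
def organ_support_free_days(survived: bool, days_on_support: int, horizon: int = 21) -> int:
    """Score a patient on the organ support-free days (OSFD) scale.

    Patients who die within the horizon are assigned -1, a value worse
    than surviving on organ support for every day (score 0).
    """
    if not survived:
        return -1
    # Days alive and free of ICU-based respiratory/cardiovascular support
    return max(0, horizon - days_on_support)

print(organ_support_free_days(False, 10))  # -1: death outranks any survivor
print(organ_support_free_days(True, 21))   # 0: survived but supported throughout
print(organ_support_free_days(True, 6))    # 15: survived with 6 days of support
```

Assigning -1 to deaths makes the scale an ordinal composite in which any survivor, even one supported for all 21 days, scores better than a patient who died.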
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non-critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support-free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support-free days among critically ill patients was 10 (-1 to 16) in the ACE inhibitor group (n = 231), 8 (-1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support-free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
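As a rough plausibility check on these posterior probabilities, a normal approximation to the posterior on the log-odds scale recovers values close to those reported. This is an assumption for illustration only; the actual analysis was a bayesian cumulative logistic model, not this approximation:

```python
import math

def prob_or_below_1(median_or: float, ci_low: float, ci_high: float) -> float:
    """Approximate P(OR < 1) assuming a normal posterior on log(OR).

    The 95% credible interval width gives the posterior sd:
    sd ~ (log(hi) - log(lo)) / (2 * 1.96).
    """
    mu = math.log(median_or)
    sd = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
    z = (0.0 - mu) / sd  # standardized distance of OR = 1 from the median
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# ACE inhibitor vs control: OR 0.77 (95% CrI, 0.58-1.06); reported P(harm) 94.9%
p = prob_or_below_1(0.77, 0.58, 1.06)
print(round(p, 2))  # approximately 0.95 under this normal approximation
```

That the crude approximation lands near the reported 94.9% suggests the credible interval and posterior probability are internally consistent.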
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT02735707
Bioenergetic Balance of Continuous Venovenous Hemofiltration, a Retrospective Analysis
(1) Background: Nutrition therapy guided by indirect calorimetry (IC) is the gold standard and is associated with lower morbidity and mortality in critically ill patients. When performing IC during continuous venovenous hemofiltration (CVVH), the measured VCO2 should be corrected for the exchanged CO2 to calculate the "true" Resting Energy Expenditure (REE). After the determination of the true REE, the caloric prescription should be adapted to the removal and addition of non-intentional calories due to citrate, glucose, and lactate in dialysis fluids to avoid over- and underfeeding. We aimed to evaluate this bioenergetic balance during CVVH and how nutrition therapy should be adapted. (2) Methods: This post hoc analysis evaluated citrate, glucose, and lactate exchange. Bioenergetic balances were calculated based on these values during three different CVVH settings: low dose with citrate, high dose with citrate, and low dose without citrate. The caloric load of these non-intentional calories during a CVVH-run was compared to the true REE. (3) Results: We included 19 CVVH-runs. The bioenergetic balance during the low dose with citrate was 498 ± 110 kcal/day (range 339 to 681 kcal/day) or 26 ± 9% (range 14 to 42%) of the true REE. During the high dose with citrate, it was 262 ± 222 kcal/day (range 56 to 262 kcal/day) or 17 ± 11% (range 7 to 32%) of the true REE. During the low dose without citrate, the bioenergetic balance was -189 ± 77 kcal/day (range -298 to -92 kcal/day) or -13 ± 8% (range -28 to -5%) of the true REE. (4) Conclusions: Different CVVH settings resulted in different bioenergetic balances ranging from -28% up to +42% of the true REE depending on the CVVH fluids chosen. When formulating a caloric prescription during CVVH, an individual approach considering the impact of these non-intentional calories is warranted.
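The adjustment described above amounts to simple arithmetic: express the non-intentional caloric balance from the CVVH fluids as a fraction of the true (CO2-corrected) REE, then subtract it from the prescription. A minimal sketch, with hypothetical function names and an illustrative REE back-calculated from the reported means (498 kcal/day at ~26% implies a true REE near 1900 kcal/day):

```python
def balance_pct_of_ree(balance_kcal_per_day: float, true_ree_kcal_per_day: float) -> float:
    """Non-intentional caloric balance from CVVH fluids (citrate, glucose,
    lactate) as a percentage of the true resting energy expenditure."""
    return 100.0 * balance_kcal_per_day / true_ree_kcal_per_day

def adjusted_prescription(true_ree_kcal_per_day: float, balance_kcal_per_day: float) -> float:
    """Caloric target after accounting for the CVVH circuit: a positive
    balance (calories added) lowers the feed; a negative one raises it."""
    return true_ree_kcal_per_day - balance_kcal_per_day

ree = 1915.0  # illustrative true REE, kcal/day (assumed, not reported)
print(round(balance_pct_of_ree(498.0, ree)))    # 26: low dose with citrate
print(adjusted_prescription(ree, 498.0))        # feed less than REE
print(adjusted_prescription(ree, -189.0))       # low dose without citrate: feed more
```

The sign convention mirrors the abstract: citrate-based settings add calories (risk of overfeeding), while the citrate-free setting removes them (risk of underfeeding).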
Risk assessment of nitrate leaching in autumn after incorporation of catch crops in spring
In Flanders, the current environmental policy allows farmers to apply manure after harvest of winter cereals at a maximum rate of 60 kg N ha-1, if a catch crop is sown before September. The N uptake by the catch crop is assumed to prevent nitrate leaching during autumn and winter. After incorporation of the catch crop, its easily decomposable N is mineralized and is expected to become available for the next spring crop. N mineralization was measured through incubation experiments at 15°C for white mustard (Sinapis alba), Italian ryegrass (Lolium multiflorum), black oat (Avena strigosa) and a grass-clover mixture (Lolium perenne, Trifolium repens, Trifolium pratense). The relative amount of N mineralized after 3 months was negatively correlated with the C:N ratio of the catch crops. Immobilization was important and long lasting for black oat and, to a lesser extent, for white mustard. N release from these catch crops could therefore occur late in the season and thus delay the risk of nitrate leaching. To assess effects of sowing date and manure application rates, soil mineral N content in summer and autumn will be monitored for corn grown after catch crop incorporation in spring. Both black oat and white mustard were sown on 2 dates and pig slurry was applied at 3 different rates. Fallow treatments were included. Soil mineral N content and aboveground N were determined before incorporation in April. Mineral N profiles will be used to assess whether delayed risks of nitrate leaching exist with these specific catch crops.
Evolution of soil mineral nitrogen under different catch crops fertilized with pig slurry after harvest of winter cereals in the north of Belgium
In temperate humid climates, catch crops have been proven to be a useful tool in the abatement of soil erosion, nutrient leaching and soil organic carbon losses. In Flanders (northern Belgium), the environmental policy allows farmers to apply manure after harvest at a rate of 60 kg N ha-1, if they sow a catch crop before the 1st of September (on light textures) or 15th of October (on heavy clay). This will only remain possible if nitrate leaching losses are not larger than when no additional manure is applied. To this end, field experiments were set up on 4 locations with different soil textures. Winter cereals were followed by 4 different catch crops: white mustard (Sinapis alba), Italian ryegrass (Lolium multiflorum), black oat (Avena strigosa) and a grass-clover mixture (Lolium perenne, Trifolium repens, Trifolium pratense), sown at 2 different dates, 2-4 weeks apart. Before sowing, pig slurry was applied in 2 rates, corresponding to about 60 and 120 kg N ha-1 (60N, 120N), and a control without manure application was included (0N). Weather conditions were exceptionally favorable for catch crop growth. Soil mineral nitrogen content in the 0-90 cm layer (Nmin) was monitored during autumn 2011, winter and early spring 2012. From October to January, Nmin was significantly larger (p < 0.05) on the fallow plots compared to the plots with early sown catch crops in all cases; this was not always the case for late sown catch crops: in particular, Italian ryegrass and the grass-clover mixture had relatively high Nmin contents. The effect of manure application also depended strongly on sowing date: under early sown catch crops, Nmin showed no significant differences between 0N, 60N and 120N, with the exception of 1 single case. For late sown catch crops, Nmin was in some cases significantly different between the 0N and 120N treatments, but never for white mustard.
In early spring, Nmin was only significantly different between the fallow plots and the plots with frost-resistant catch crops (Italian ryegrass and grass-clover mixture). In conclusion, an additional manure application of 60 kg N ha-1 after harvest of winter cereals did not significantly increase Nmin under catch crops, suggesting that it did not increase nitrate leaching losses either.