
    Oxygen uptake and denitrification in soil aggregates

    A mathematical model of oxygen uptake by bacteria in agricultural soils is presented with the goal of predicting the anaerobic regions in which denitrification occurs. In an environment with a plentiful supply of oxygen, microorganisms consume oxygen through normal respiration. When the local oxygen concentration falls below a threshold level, denitrification may take place, leading to the release of nitrous oxide, a potent greenhouse gas. A two-dimensional model is presented in which one or more circular soil aggregates are located at a distance below ground level, where the prevailing oxygen concentration is prescribed. The level of denitrification is estimated by computing the area of any anaerobic cores which may develop in the interior of the aggregates. The oxygen distribution throughout the model soil is calculated first, via an asymptotic analysis, for an aggregated soil in which the ratio of the oxygen diffusivities between an aggregate and its surroundings is small. Second, the case of a non-aggregated soil featuring one or more microbial hotspots, for which the diffusion ratio is arbitrary, is examined numerically using the boundary-element method. Calculations with multiple aggregates demonstrate a sheltering effect whereby some aggregates receive less oxygen than their neighbours. In the case of an infinite regular triangular network representing an aggregated soil, it is shown that there is an optimal inter-aggregate spacing which minimises the total anaerobic core area.
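    The single-aggregate picture described above can be illustrated with a back-of-the-envelope calculation. The sketch below (all parameter values are invented for illustration, not taken from the study) solves the steady diffusion-uptake balance in a circular aggregate with an anaerobic core, and bisects for the core radius at which the oxygen concentration reaches the denitrification threshold.

    ```python
    import math

    # Illustrative parameters (assumed, not from the study)
    D = 1e-9       # oxygen diffusivity inside the aggregate, m^2/s
    q = 1e-5       # volumetric oxygen uptake rate, mol m^-3 s^-1
    a = 0.01       # aggregate radius, m
    C_surf = 0.25  # oxygen concentration at the aggregate surface, mol m^-3
    C_crit = 0.01  # threshold below which denitrification begins, mol m^-3

    def surface_mismatch(rc):
        """Steady 2D solution with an anaerobic core of radius rc:
        C(r) = C_crit + (q/4D)(r^2 - rc^2) - (q rc^2 / 2D) ln(r/rc),
        evaluated at r = a, minus the imposed surface value C_surf."""
        return (C_crit + q / (4 * D) * (a**2 - rc**2)
                - q * rc**2 / (2 * D) * math.log(a / rc) - C_surf)

    # A core exists only if the core-free centre concentration
    # C(0) = C_surf - q a^2 / 4D drops below C_crit; with the values
    # above C(0) = 0 < C_crit, so bisect for the core radius.
    lo, hi = 1e-9 * a, a * (1 - 1e-12)   # f(lo) > 0, f(hi) < 0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if surface_mismatch(mid) > 0:
            lo = mid
        else:
            hi = mid
    rc = 0.5 * (lo + hi)
    print(f"anaerobic core radius ~ {rc * 1e3:.2f} mm, "
          f"core area ~ {math.pi * rc**2 * 1e6:.2f} mm^2")
    ```

    With these assumed values the anaerobic core occupies only a small fraction of the aggregate; raising the uptake rate or the aggregate radius grows the core, which is the trade-off behind the optimal spacing result.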

    Simple Ways to Measure Behavioral Responses of Drosophila to Stimuli and Use of These Methods to Characterize a Novel Mutant

    The behavioral responses of adult Drosophila fruit flies to a variety of sensory stimuli (light, volatile and non-volatile chemicals, temperature, humidity, gravity, and sound) have previously been measured by others. Some of those assays are rather complex; a review of them is presented in the Discussion. Our objective here has been to find out how to measure the behavior of adult Drosophila fruit flies by methods that are inexpensive and easy to carry out. These new assays have now been used here to characterize a novel mutant that fails to be attracted or repelled by a variety of sensory stimuli even though it is motile.

    Egg removal and intraspecific brood parasitism in the European starling (Sturnus vulgaris)

    From 1983 to 1986 we monitored 284 European starling (Sturnus vulgaris) nests in New Jersey for evidence of intraspecific brood parasitism and egg removal during the laying period. Egg removal occurred significantly more often at nests where intraspecific brood parasitism was detected (12 of 35 nests, 34%) than at unparasitized nests (23 of 249 nests, 9%). Brood parasitism (92% of parasitized nests) and egg removal (74% of nests with egg removal) were most common at nests where egg laying began in April of each year (i.e., early nests). Egg removal occurred at 26 (19%) and brood parasitism at 32 (23%) of 138 early nests. Both brood parasitism and egg removal were concentrated during the first four days of the laying period, when brood parasitism is most likely to be successful and when host nests are most vulnerable to parasitism (Romagnano 1987). Both parasitism and removal usually involved a single egg at each nest. We detected brood parasitism and egg removal on the same day at five of 12 nests (42%) where both were observed. Because starlings do not remove foreign eggs from their nests once they begin laying (Stouffer et al. 1987), we hypothesize that parasite females sometimes removed host eggs while parasitizing nests.
    Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/46889/1/265_2004_Article_BF00295201.pd

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. 
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
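    The reported posterior probabilities are consistent with the reported credible intervals. As a crude sketch (a normal approximation on the log-odds scale, not the trial's actual bayesian cumulative logistic model), the 94.9% figure for the ACE-inhibitor arm can be roughly reproduced from the published OR and interval:

    ```python
    import math

    def norm_cdf(z):
        # Standard normal CDF via the error function
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    # Reported for the ACE-inhibitor arm: adjusted OR 0.77, 95% CrI 0.58-1.06.
    # Approximate the posterior of log(OR) as normal with that median and
    # interval width (an illustrative shortcut, not the trial's model).
    mu = math.log(0.77)
    sigma = (math.log(1.06) - math.log(0.58)) / (2 * 1.96)

    # Posterior probability that OR < 1, i.e. that initiation worsened
    # organ support-free days relative to control.
    p_harm = norm_cdf((0 - mu) / sigma)
    print(f"P(OR < 1) ~ {p_harm:.3f}")  # close to the reported 94.9%
    ```

    The small residual discrepancy is expected, since the trial's posterior need not be exactly log-normal.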

    Exploring diversity in soil fertility management of smallholder farms in western Kenya. II. Within-farm variability in resource allocation, nutrient flows and soil fertility status

    Strong gradients of decreasing soil fertility are found with increasing distance from the homestead within smallholder African farms, due to differential resource allocation. As nutrient use efficiency varies strongly along these gradients, such heterogeneity must be considered when designing soil management strategies aimed at improved overall resource use efficiency at farm scale. Here, we quantify the magnitude and study the origin of farmer-induced, within-farm soil fertility gradients as affected by biophysical and socio-economic conditions, and investigate farmers' perceptions of such heterogeneity. Farm transects, participatory resource flow mapping, farmers' classification of land qualities, and soil sampling for both chemical and spectral reflectance analyses were performed across 60 farms in three sub-locations (Emuhaia, Shinyalu, Aludeka) representing the variability found in the highlands of western Kenya. Differences between the various field types of a farm were observed for input use (e.g. 0.7–104 kg N ha⁻¹), food production (e.g. 0.6–2.9 t DM ha⁻¹), partial C (e.g. −570 to 1480 kg ha⁻¹) and N (e.g. −92 to 57 kg ha⁻¹) balances, and general soil fertility status, despite strong differences across sub-locations. Concentration of nutrients in the home fields compared with the remote fields was verified for extractable P (e.g. 2.1–19.8 mg kg⁻¹) and secondarily for exchangeable K (e.g. 0.14–0.54 cmol(+) kg⁻¹), on average, whereas differences for soil C and N were only important when considering each individual farm separately. Farmers managed their fields according to their perceived land quality, varying the timing and intensity of management practices along soil fertility gradients. Fields classified by them as poor were planted later (up to 33.6 days of delay), with sparser crops (ca. 30% fewer plants m⁻²), and had higher weed infestation levels than those classified as fertile, leading to important differences in maize yield (e.g. 0.9 versus 2.4 t ha⁻¹). The internal heterogeneity in resource allocation also varied between farms of different social classes, according to their objectives and factor constraints. Additionally, the interaction of sub-location-specific socio-economic (population, markets) and biophysical factors (soilscape variability) determined the patterns of resource allocation to different activities. Such interactions need to be considered in the characterisation of farming systems to facilitate targeting research and development interventions to address the problem of poor soil fertility.
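    A partial nutrient balance of the kind reported above is, in essence, nutrient inputs minus harvest removals per field. A minimal sketch (the field data and removal coefficient below are invented for illustration; the study derived its balances from participatory resource flow mapping):

    ```python
    # Partial N balance per field: applied N minus N exported in harvest.
    # All numbers below are invented for illustration only.
    fields = {
        # field type: N applied (kg N ha^-1), harvested DM (t ha^-1)
        "home field":   {"n_in": 60.0, "dm_out": 2.9},
        "mid field":    {"n_in": 15.0, "dm_out": 1.8},
        "remote field": {"n_in": 0.7,  "dm_out": 0.6},
    }
    N_PER_T_DM = 15.0  # assumed kg N removed per tonne of harvested DM

    def partial_n_balance(n_in, dm_out):
        """Partial balance: applied N minus N removed in the harvest."""
        return n_in - dm_out * N_PER_T_DM

    balances = {name: partial_n_balance(**f) for name, f in fields.items()}
    for name, b in balances.items():
        print(f"{name}: {b:+.1f} kg N ha^-1")
    ```

    Even with invented numbers the sketch reproduces the qualitative gradient in the abstract: home fields, which receive most inputs, can run a positive balance while remote fields are steadily mined.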