Nitrous Oxide Emissions
End of project report. Nitrous oxide (N2O) is one of the three most important greenhouse gases (GHG). Nitrous oxide emissions currently account for approximately one third of GHG emissions from agriculture in Ireland. Emissions of N2O arise naturally from soil sources and from the application of nitrogen (N) in the form of N fertilizers and of N in dung and urine deposited by grazing animals at pasture.
Nitrous oxide emission measurements were conducted at three different scales. Firstly, a large-scale field experiment was undertaken to compare emission rates from a pasture receiving three different rates of N fertilizer application and to identify the effects of controlling variables over a two-year period. Variation in emission rates was large both within and between years.
Two contrasting climatic years were identified. The cooler and wetter conditions in year 1 gave rise to considerably lower emission levels than the warmer and drier year 2. However, in both years, peak emissions were associated with fertilizer N applications coincident with rainfall events in the summer months.
Secondly, a small-plot study was conducted to identify the individual and combined effects of fertilizer, dung and urine applications to grassland. Treatment effects were, however, difficult to detect due to the overriding effects of environmental variables.
Thirdly, through the use of a small-scale mini-lysimeter study, the diurnal nature of N2O emission rates was identified for two distinct periods during the year. The occurrence of a diurnal pattern has important implications for the identification of a measurement period during the day which is representative of the true daily flux.
The research presented aims to identify the nature and magnitude of N2O emissions, and the factors which affect emission rates, from a grassland in Ireland. Further work is required to integrate the effects of different soil types and contrasting climatic regimes on N2O emissions.
Environmental Protection Agency
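To illustrate why the diurnal pattern matters when choosing a measurement window, consider a minimal sketch in which the daily flux cycle is modelled as a sinusoid peaking in the early afternoon. The shape, peak time, and magnitudes below are hypothetical, chosen only to show how a single spot measurement can misstate the true daily mean flux:

```python
import numpy as np

# Hypothetical diurnal N2O flux cycle peaking at 14:00; the shape and
# magnitudes are illustrative only, not measured values from the study.
hours = np.arange(0, 24, 0.5)
flux = 10 + 4 * np.cos(2 * np.pi * (hours - 14) / 24)

daily_mean = flux.mean()                 # the "true" daily flux
spot_14h = flux[hours == 14][0]          # spot sample at the afternoon peak
spot_09h = flux[hours == 9][0]           # mid-morning spot sample

print(f"true daily mean : {daily_mean:.2f}")
print(f"14:00 spot value: {spot_14h:.2f} (bias {100 * (spot_14h / daily_mean - 1):+.0f}%)")
print(f"09:00 spot value: {spot_09h:.2f} (bias {100 * (spot_09h / daily_mean - 1):+.0f}%)")
```

Under this illustrative cycle a peak-time sample overstates the daily mean by roughly 40%, while samples near 08:00 or 20:00 happen to match it, which is why a representative measurement period must be identified empirically.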
Maximin optimal cluster randomized designs for assessing treatment effect heterogeneity
Cluster randomized trials (CRTs) are studies where treatment is randomized at
the cluster level but outcomes are typically collected at the individual level.
When CRTs are employed in pragmatic settings, baseline population
characteristics may moderate treatment effects, leading to what is known as
heterogeneous treatment effects (HTEs). Pre-specified, hypothesis-driven HTE
analyses in CRTs can enable an understanding of how interventions may impact
subpopulation outcomes. While closed-form sample size formulas have recently
been proposed, assuming known intracluster correlation coefficients (ICCs) for
both the covariate and outcome, guidance on optimal cluster randomized designs
to ensure maximum power with pre-specified HTE analyses has not yet been
developed. We derive new design formulas to determine the cluster size and
number of clusters to achieve the locally optimal design (LOD) that minimizes
variance for estimating the HTE parameter given a budget constraint. Because the
LOD is based on covariate- and outcome-ICC values that are usually unknown, we
further develop the maximin design for assessing HTE, identifying the
combination of design resources that maximizes the relative efficiency of the
HTE analysis in the worst-case scenario. In addition, because the analysis of the
average treatment effect is often of primary interest, we also establish
optimal designs to accommodate multiple objectives by combining considerations
for studying both the average and heterogeneous treatment effects. We
illustrate our methods using the context of the Kerala Diabetes Prevention
Program CRT, and provide an R Shiny app to facilitate calculation of optimal
designs under a wide range of design parameters.
Comment: 25 pages, 6 figures, 5 tables, 3 appendices; clarified phrasing, typos corrected
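A minimal sketch of the budget-constrained search that such design formulas formalize is given below. The variance expression, the linear cost model (a per-cluster cost plus a per-subject cost), and the worst-case relative-efficiency criterion are assumptions for illustration; the paper's exact formulas and the accompanying R Shiny app should be consulted for real designs:

```python
import numpy as np

def hte_variance(n, m, rho_y, rho_x, sigma2_y=1.0, sigma2_x=1.0):
    """Variance of the HTE (covariate-by-treatment interaction) estimator
    for an individual-level covariate in a CRT with n clusters of size m.
    This follows the form used in the HTE sample-size literature; treat it
    as a stand-in for the paper's exact expression."""
    num = sigma2_y * (1.0 - rho_y) * (1.0 + (m - 1.0) * rho_y)
    den = n * m * sigma2_x * (1.0 + (m - 2.0) * rho_y - (m - 1.0) * rho_x * rho_y)
    return num / den

def locally_optimal_design(budget, cost_cluster, cost_subject,
                           rho_y, rho_x, m_max=200):
    """LOD: the (n, m) minimizing HTE variance subject to the assumed
    linear budget constraint n * (cost_cluster + cost_subject * m) <= budget."""
    best = None
    for m in range(2, m_max + 1):
        n = int(budget // (cost_cluster + cost_subject * m))  # largest affordable n
        if n < 2:
            break
        v = hte_variance(n, m, rho_y, rho_x)
        if best is None or v < best[2]:
            best = (n, m, v)
    return best

def maximin_design(budget, cost_cluster, cost_subject,
                   rho_y_grid, rho_x_grid, m_max=200):
    """Maximin design: maximize, over feasible (n, m), the worst-case
    relative efficiency Var_LOD / Var_design across a plausible ICC region."""
    iccs = [(ry, rx) for ry in rho_y_grid for rx in rho_x_grid]
    lod_var = {icc: locally_optimal_design(budget, cost_cluster, cost_subject,
                                           *icc, m_max=m_max)[2] for icc in iccs}
    best = None
    for m in range(2, m_max + 1):
        n = int(budget // (cost_cluster + cost_subject * m))
        if n < 2:
            break
        worst_re = min(lod_var[icc] / hte_variance(n, m, *icc) for icc in iccs)
        if best is None or worst_re > best[2]:
            best = (n, m, worst_re)
    return best

# Example: $100k budget, $500 per cluster, $50 per subject (all hypothetical).
print(locally_optimal_design(100_000, 500, 50, rho_y=0.05, rho_x=0.1))
print(maximin_design(100_000, 500, 50,
                     rho_y_grid=np.linspace(0.01, 0.10, 5),
                     rho_x_grid=np.linspace(0.0, 0.50, 5)))
```

Because the variance falls monotonically in n for fixed m, the search only needs to scan cluster sizes and take the largest affordable number of clusters at each; the maximin step then trades a little efficiency at the best-guess ICCs for protection at the worst ones.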
Teaching reading through imaginative play situations in grades one and two
Thesis (Ed.M.)--Boston University
Group sequential two-stage preference designs
The two-stage preference design (TSPD) enables the inference for treatment
efficacy while allowing for incorporation of patient preference to treatment.
It can provide unbiased estimates for selection and preference effects, where a
selection effect occurs when patients who prefer one treatment respond
differently than those who prefer another, and a preference effect is the
difference in response caused by an interaction between the patient's
preference and the actual treatment they receive. One potential barrier to
adopting TSPD in practice, however, is the relatively large sample size
required to estimate selection and preference effects with sufficient power. To
address this concern, we propose a group sequential two-stage preference design
(GS-TSPD), which combines TSPD with sequential monitoring for early stopping.
In the GS-TSPD, pre-planned sequential monitoring allows investigators to
conduct repeated hypothesis tests on accumulated data prior to full enrollment
to assess whether the trial can be terminated early without inflating the type
I error rate. Thus, the procedure allows investigators to terminate the study
when an interim analysis shows sufficient evidence of treatment, selection, or
preference effects, thereby reducing the expected design resources. To
formalize such a procedure, we verify the independent
increments assumption for testing the selection and preference effects and
apply group sequential stopping boundaries from the approximate sequential
density functions. Simulations are then conducted to investigate the operating
characteristics of our proposed GS-TSPD compared to the traditional TSPD. We
demonstrate the applicability of the design using a study of Hepatitis C
treatment modality.
Comment: 27 pages, 7 tables, 5 figures, 4 appendices; under review at Statistics in Medicine
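The boundary calculation the abstract refers to can be sketched generically. Under the independent-increments assumption, the null sub-density of the score statistic for paths that have not yet stopped can be propagated recursively, and each boundary chosen so the tail mass spends a pre-specified slice of the type I error. The sketch below uses the Lan-DeMets O'Brien-Fleming-type spending function as an example; the paper's specific selection- and preference-effect test statistics would slot into the same recursion once their information fractions are specified:

```python
import numpy as np
from scipy.stats import norm

def obf_spending(t, alpha=0.05):
    """Lan-DeMets O'Brien-Fleming-type alpha-spending function (two-sided)."""
    return 2.0 * (1.0 - norm.cdf(norm.ppf(1.0 - alpha / 2.0) / np.sqrt(t)))

def gs_boundaries(info_fracs, alpha=0.05, grid=2001, span=10.0):
    """Two-sided group sequential z-boundaries from the recursive sub-density
    of the score statistic, assuming independent increments under the null."""
    t = np.asarray(info_fracs, dtype=float)
    incr = np.diff(np.concatenate(([0.0], obf_spending(t, alpha))))
    s = np.linspace(-span, span, grid)          # grid for the score statistic
    ds = s[1] - s[0]
    f = norm.pdf(s, scale=np.sqrt(t[0]))        # stage-1 density: N(0, t_1)
    bounds = []
    for k in range(len(t)):
        if k > 0:
            # Convolve the surviving sub-density with the independent
            # increment N(0, t_k - t_{k-1}).
            step = norm.pdf(s[:, None] - s[None, :],
                            scale=np.sqrt(t[k] - t[k - 1]))
            f = (step @ f) * ds
        # Bisection: find b so the tail mass of the sub-density equals incr[k].
        lo, hi = 0.0, span
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if f[np.abs(s) > mid].sum() * ds > incr[k]:
                lo = mid
            else:
                hi = mid
        b = 0.5 * (lo + hi)
        bounds.append(b / np.sqrt(t[k]))        # report on the z scale
        f = np.where(np.abs(s) <= b, f, 0.0)    # drop paths that stopped here
    return np.array(bounds)

# Four equally spaced looks; the first boundary is ~3.92 on the z scale.
print(gs_boundaries([0.25, 0.50, 0.75, 1.00]))
```

Zeroing out the density beyond each boundary is what makes the later tail probabilities conditional on the trial still running, so the boundaries jointly control the overall type I error rate.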
Perch, Perca fluviatilis show a directional preference for, but do not increase attacks toward, prey in response to water-borne cortisol
In freshwater environments, chemosensory cues play an important role in predator-prey interactions. Prey use a variety of chemosensory cues to detect and avoid predators. However, whether predators use the chemical cues released by disturbed or stressed prey has received less attention. Here we tested the hypothesis that the disturbance cue cortisol, in conjunction with visual cues of prey, elevates predatory behavior. We presented predators (perch, Perca fluviatilis) with three chemosensory choice tests and recorded their location, orientation, and aggressive behavior. We compared the responses of predators when provided with (i) visual cues of prey only (two adjacent tanks containing sticklebacks); (ii) visual and natural chemical cues of prey vs. visual cues only; and (iii) visual cues of prey with cortisol vs. visual cues only. Perch spent a significantly higher proportion of time in proximity to prey, and orientated toward prey more, when presented with a cortisol stimulus plus visual cues, relative to presentations of visual and natural chemical cues of prey, or visual cues of prey only. There was a trend for perch to direct a higher proportion of predatory behaviors (number of lunges) toward sticklebacks when presented with a cortisol stimulus plus visual cues, relative to the other chemosensory conditions, but they did not show a significant increase in total predatory behavior in response to cortisol. Therefore, it is not clear whether water-borne cortisol, in conjunction with visual cues of prey, affects predatory behavior. Our results provide evidence that cortisol could be a source of public information about prey state and/or disturbance, but further work is required to confirm this.
Weight Gain and Decreased Sleep Duration in First-Year College Students: A Longitudinal Study
Poster from the 2017 Food & Nutrition Conference & Expo. Poster Session: Professional Skills; Nutrition Assessment & Diagnosis; Medical Nutrition Therapy
Diagnostic test interpretation and referral delay in patients with interstitial lung disease.
BACKGROUND: Diagnostic delays are common in patients with interstitial lung disease (ILD). A substantial percentage of patients experience a diagnostic delay in the primary care setting, but the factors underpinning this observation remain unclear. In this multi-center investigation, we assessed ILD reporting on diagnostic test interpretation and its association with subsequent pulmonology referral by a primary care physician (PCP).
METHODS: A retrospective cohort analysis of patients referred to the ILD programs at UC-Davis and University of Chicago by a PCP within each institution was performed. Computed tomography (CT) of the chest and abdomen and pulmonary function tests (PFT) were reviewed to identify the date ILD features were first present and to determine the time from diagnostic test to pulmonology referral. The association between ILD reporting on diagnostic test interpretation and pulmonology referral was assessed, as was the association between years of diagnostic delay and changes in fibrotic features on longitudinal chest CT.
RESULTS: One hundred and forty-six patients were included in the final analysis. Prior to pulmonology referral, 66% (n = 97) of patients underwent chest CT, 15% (n = 21) underwent PFT and 15% (n = 21) underwent abdominal CT. ILD features were reported on 84, 62 and 33% of chest CT, PFT and abdominal CT interpretations, respectively. ILD reporting was associated with shorter time to pulmonology referral when undergoing chest CT (1.3 vs 15.1 months, respectively; p = 0.02), but not PFT or abdominal CT. ILD reporting was associated with increased likelihood of pulmonology referral within 6 months of diagnostic test when undergoing chest CT (rate ratio 2.17, 95% CI 1.03-4.56; p = 0.04), but not PFT or abdominal CT. Each year of diagnostic delay was associated with a 1.8% increase in percent fibrosis on chest CT. Patients with documented dyspnea had shorter time to chest CT acquisition and pulmonology referral than patients with documented cough and lung crackles.
CONCLUSIONS: Determinants of ILD diagnostic delays in the primary care setting include underreporting of ILD features on diagnostic testing and prolonged time to pulmonology referral even when ILD is reported. Interventions to modulate these factors may reduce ILD diagnostic delays in the primary care setting.
Suturing Workshop for Third-Year Medical Students: A Modern Educational Approach
Background: This study sought to determine if developing suturing workshops based on modern educational theory would lead to a significant increase in third-year medical students’ confidence and preparedness as compared to before the workshop.
Methods: A group of pre-clinical, third-year medical students (n = 20) was voluntarily recruited. The workshop consisted of an interactive didactic session, a hands-on suturing session, and a question-and-answer session with surgeons. Nine-point Likert scale surveys were given pre- and post-workshop to 17 participants. Total scores for "confidence" and "preparedness" were analyzed using the Student t-test. Q-Q plots and normality tests were used to validate the normality assumption. All analyses were conducted using SAS software 9.4 (SAS Institute, Cary, North Carolina).
Results: A statistically significant increase in both confidence and preparedness was found between results of pre- and post-workshop surveys. Average total scores in confidence increased by 19.7 points, from 19.3 to 39.0 (95% CI: 15.0-24.4; P value < 0.001). For scores in preparedness, the total score increased by an average of 18.4 points, from 22.8 to 41.2 (95% CI: 14.1-22.8; P value < 0.001).
Conclusions: These findings suggest that a structured course based on modern educational theory can increase both the confidence and preparedness of third-year medical students who are matriculating into their hospital-based clerkships.
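As a purely illustrative companion to the analysis described above, the sketch below runs a paired Student t-test on hypothetical pre/post confidence totals for 17 respondents. The numbers are fabricated to mirror only the reported means (19.3 to 39.0); the calls are standard SciPy, but nothing here reproduces the study's actual data or SAS output:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post "confidence" totals for 17 paired respondents,
# fabricated to mirror the reported means (19.3 -> 39.0); not study data.
pre  = np.array([18, 20, 15, 22, 19, 17, 21, 24, 16, 20, 19, 18, 23, 21, 17, 19, 19])
post = np.array([38, 41, 35, 42, 39, 36, 40, 44, 37, 39, 38, 37, 43, 41, 36, 39, 38])

diff = post - pre
t_stat, p_value = stats.ttest_rel(post, pre)      # paired Student t-test
ci = stats.t.interval(0.95, len(diff) - 1,        # 95% CI for the mean gain
                      loc=diff.mean(), scale=stats.sem(diff))
print(f"mean gain = {diff.mean():.1f}, t = {t_stat:.2f}, p = {p_value:.3g}")
print(f"95% CI: ({ci[0]:.1f}, {ci[1]:.1f})")
```

A paired test is one reasonable choice here because the pre- and post-workshop scores come from the same respondents; the abstract itself says only "Student t-test".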