2,467 research outputs found
Experimental In-Field Transfer and Survival of Escherichia coli from Animal Feces to Romaine Lettuce in Salinas Valley, California.
This randomized controlled trial characterized the transfer of E. coli from animal feces and/or furrow water onto adjacent heads of lettuce during foliar irrigation, and the subsequent survival of bacteria on the adaxial surface of lettuce leaves. Two experiments were conducted in Salinas Valley, California: (1) to quantify the transfer of indicator E. coli from chicken and rabbit fecal deposits placed in furrows to surrounding lettuce heads on raised beds, and (2) to quantify the survival of inoculated E. coli on Romaine lettuce over 10 days. E. coli was recovered from 97% (174/180) of lettuce heads to a maximal distance of 162.56 cm (5.33 ft) from feces. Distance from sprinklers to feces, cumulative foliar irrigation, and lettuce being located downwind of the fecal deposit were positively associated with E. coli transfer, while distance from the fecal deposit to the lettuce was negatively associated. E. coli exhibited decimal reduction times of 2.2 and 2.5 days when applied on the adaxial surface of leaves within a chicken or rabbit fecal slurry, respectively. Foliar irrigation can transfer E. coli from feces located in a furrow onto adjacent heads of lettuce, likely due to the kinetic energy of irrigation droplets impacting the fecal surface and/or furrow water contaminated with feces. The magnitude of E. coli enumerated per head of lettuce was influenced by the distance between the lettuce and the fecal deposit, cumulative application of foliar irrigation, the wind aspect of the lettuce relative to the feces, and the time since final irrigation. Extending the time period between foliar irrigation and harvest, along with a 152.4 cm (5 ft) no-harvest buffer zone when animal fecal material is present, may substantially reduce the level of bacterial contamination on harvested lettuce.
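As a quick illustration of what the reported decimal reduction times imply, the sketch below applies standard log-linear die-off arithmetic to the 2.2-day and 2.5-day D-values; the helper function is hypothetical, and the 10-day window simply mirrors the survival experiment's duration.

# Minimal sketch of the log-linear die-off implied by a decimal reduction time
# (D-value): D is the time required for a 1-log10 (90%) decline in the population.
def surviving_fraction(t_days: float, d_value_days: float) -> float:
    """Fraction of the initial E. coli population remaining after t_days."""
    return 10 ** (-t_days / d_value_days)

# D-values reported in the abstract: 2.2 d (chicken slurry), 2.5 d (rabbit slurry).
for label, d in [("chicken fecal slurry", 2.2), ("rabbit fecal slurry", 2.5)]:
    print(f"{label}: {surviving_fraction(10, d):.1e} of the inoculum remains after 10 days")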
Impaired glucose tolerance or newly diagnosed diabetes mellitus diagnosed during admission adversely affects prognosis after myocardial infarction: An observational study
Objective: To investigate the prognostic effect of newly diagnosed diabetes mellitus (NDM) and impaired glucose tolerance (IGT) after myocardial infarction (MI).
Research Design and Methods: Retrospective cohort study of 768 patients without preexisting diabetes mellitus post-MI at one centre in Yorkshire between November 2005 and October 2008. Patients were categorised as normal glucose tolerance (NGT, n = 337), IGT (n = 279) and NDM (n = 152) on a predischarge oral glucose tolerance test (OGTT). The primary end-point was the first occurrence of major adverse cardiovascular events (MACE), including cardiovascular death, non-fatal MI, severe heart failure (HF) or non-haemorrhagic stroke. Secondary end-points were all-cause mortality and the individual components of MACE.
Results: Prevalence of NGT, impaired fasting glucose (IFG), IGT and NDM changed from 90%, 6%, 0% and 4% on fasting plasma glucose (FPG) to 43%, 1%, 36% and 20%, respectively, after OGTT. 102 deaths from all causes (79 as first events, of which 46 were cardiovascular), 95 non-fatal MIs, 18 HF events and 9 non-haemorrhagic strokes occurred during 47.2 ± 9.4 months of follow-up. Event-free survival was lower in the IGT and NDM groups. IGT (HR 1.54, 95% CI: 1.06–2.24, p = 0.024) and NDM (HR 2.15, 95% CI: 1.42–3.24, p = 0.003) independently predicted MACE-free survival. IGT and NDM also independently predicted the incidence of MACE. NDM, but not IGT, increased the risk of secondary end-points.
Conclusion: Presence of IGT and NDM in patients presenting post-MI, identified using OGTT, is associated with increased incidence of MACE and with adverse outcomes despite adequate secondary prevention.
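To give a rough sense of what the reported hazard ratios mean, the sketch below uses the standard proportional-hazards relation S_group(t) = S_reference(t) ** HR; the 0.85 reference survival is a hypothetical value for illustration, not a figure from the study.

# Rough illustration (not part of the study): under a proportional-hazards
# assumption, a group's event-free survival relates to the reference group's as
# S_group(t) = S_reference(t) ** HR.
def group_survival(s_reference: float, hazard_ratio: float) -> float:
    return s_reference ** hazard_ratio

s_ngt = 0.85  # hypothetical MACE-free survival in the NGT group at some time t
for label, hr in [("IGT", 1.54), ("NDM", 2.15)]:  # HRs reported in the abstract
    print(f"{label}: MACE-free survival ~ {group_survival(s_ngt, hr):.2f} when NGT survival is {s_ngt}")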
A Claims Analysis of the Utilization of Tramadol for Acute Pain in Patients Prescribed Buprenorphine/Naloxone for Opioid Use Disorder
Objective: To determine the prevalence of tramadol prescribing among commercially insured adults receiving medication assisted therapy (MAT) with buprenorphine/naloxone.
Design: We conducted a cross-sectional descriptive study to evaluate the use of tramadol among patients prescribed buprenorphine/naloxone for MAT.
Setting: This study utilized data from the 2010 to 2013 Optum Clinformatics Data Mart (OptumInsight, Eden Prairie, MN), an administrative health claims database from a large national insurer. These data included pharmacy and medical care utilization and information describing patient enrollment.
Patients, Participants: Patients were 12 to 64 years of age and had complete and available medical, pharmacy, and administrative records in the Optum Clinformatics Data Mart during the study period.
Main Outcome Measures: Patients who received at least one paid claim for buprenorphine/naloxone from 2010 to 2013 and also received at least one overlapping pharmacy dispensing for tramadol were identified for analysis. We determined if the concurrent buprenorphine/naloxone and tramadol dispensings were from the same or a different prescriber.
Results: In this analysis of 18,734 U.S. commercially insured patients receiving MAT with buprenorphine/naloxone, we identified 1,198 (6.4%) patients who received at least one overlapping dispensing for tramadol during the four-year period spanning 2010 through 2013. Among these patients, 266 (1.42% of the overall cohort) were co-prescribed buprenorphine/naloxone and tramadol by the same provider.
Conclusions: These results suggest that the use of tramadol among patients receiving buprenorphine/naloxone is not uncommon. Further study is warranted to determine the benefits and risks associated with the use of tramadol for pain management among patients prescribed buprenorphine/naloxone.
Environmental Evaluation Report on Various Completed Channel Improvement Projects in Eastern Arkansas
The objective of this report is to evaluate the beneficial and adverse effects that certain channel improvement projects have had on the natural or man-made environments of selected areas in eastern Arkansas. This evaluation will be used as a baseline for determining the immediate and long-term effects that a project may have on the existing environment of the Village Creek Basin.
Short-Eared Owl Land-Use Associations During the Breeding Season in the Western United States
The Short-eared Owl (Asio flammeus) is a species of conservation concern in the western USA, with evidence for declining population sizes. Monitoring of Short-eared Owls is complicated because of their low site fidelity and nomadic movements. We recruited community-science participants to implement a multi-year survey of Short-eared Owls across eight states in the western USA, resulting in a program of sufficient temporal and spatial dimensions to overcome many of the challenges in monitoring this species. We implemented both multi-scale occupancy and colonization/extinction modeling to provide insights into land-cover use, and to identify which cover types supported higher occurrence. Short-eared Owls were associated with native and anthropogenic land-cover types, but site occupancy varied among these categories and at different scales. Native grasslands, marsh/riparian, hay/fallow agriculture, and cultivated croplands were occupied most consistently across years. Occupancy rates differed at different scales (e.g., marsh/riparian was the only land-cover type positively associated with occupancy at both transect and point scales). Contrary to expectations, native shrubland was negatively associated with occupancy at the point scale, and exhibited low colonization and high extinction rates. Our results suggest that conserving native landscapes in general, and grasslands, marsh, and riparian areas specifically, would benefit Short-eared Owls. Furthermore, Short-eared Owl occupancy was positively associated with hay/fallow land-cover types, suggesting that some nonnative land-cover types can function as Short-eared Owl habitat. Lastly, our results highlight how developing a broad-scale community-science survey can inform conservation for a species not well monitored by existing survey programs.
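For readers unfamiliar with occupancy modeling, the sketch below shows the basic idea the multi-scale analysis builds on: occupancy probability is estimated separately from imperfect detection. The model here is a deliberately simplified single-scale version, and the numbers are hypothetical, not estimates from the survey.

# Minimal sketch (assumed, not the authors' multi-scale model): in a basic
# occupancy model a point is occupied with probability psi and, if occupied,
# the species is detected on each of k independent visits with probability p.
def prob_detected_at_least_once(psi: float, p: float, k: int) -> float:
    """Unconditional probability of detecting the species at least once in k visits."""
    return psi * (1.0 - (1.0 - p) ** k)

# Hypothetical values for illustration only.
print(prob_detected_at_least_once(psi=0.4, p=0.3, k=3))  # ~0.26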
Assessing the potential for Salmonella growth in rehydrated dry dog food
A substantial percentage of dog owners add water to dry dog food to increase its palatability. The recent association of Salmonella contamination of dry pet foods with salmonellosis cases in both dogs and their owners has generated a need to determine the ability of Salmonella to grow in rehydrated dry dog food. Eight brands of commercial dry dog food were rehydrated to 20, 35 and 50% added moisture, inoculated with two S. enterica strains (~10^5 CFU/g) and incubated for 72 h at 18 °C, 22 °C, or 28 °C. Dog food brand, moisture content, and temperature affected pathogen growth/survival patterns. Rehydration to 20% moisture did not support growth of S. enterica, and in general there was a 0.5–2.0 Log decline. At 35% moisture and 28 °C, 4 of 8 brands supported up to 3.4 Log(CFU/g) of growth, while Salmonella levels declined in three brands and remained unchanged in one. Rehydration to 50% moisture at 28 °C supported increases of up to 4.6 Log(CFU/g) in 5 of 8 brands. Growth kinetics determined for two of the brands that supported growth yielded calculated lag times of 4.4 and 2.2 h, generation times of 1.4 and 10.8 h, and maximum population densities of 7.3 and 6.9 Log(CFU/g) when rehydrated to 35% moisture and held at 30 °C. Results of this study establish that rehydration of dry dog food with sufficient amounts of water may support the growth of S. enterica. Based on the most rapid observed lag times, growth of Salmonella, if present, in rehydrated dog food could be avoided by discarding or refrigerating uneaten portions within 2–3 h of rehydration. These data allow accidental or intentional rehydration of dry dog food to be factored into predictive microbiology models and exposure assessments. https://doi.org/10.1186/s40550-016-0043-
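The sketch below shows how the reported lag times, generation times, and maximum densities translate into population levels under a simple three-phase growth model; the model form, the pairing of values into "brand A"/"brand B", and the 5-log starting level are assumptions for illustration, not the authors' fitted kinetics.

import math

# Illustrative three-phase growth sketch (an assumed simplification): the
# population holds at the inoculum level during the lag phase, then doubles
# every generation time until it reaches the maximum population density.
def log_cfu(t_h, lag_h, gen_h, log_n0, log_nmax):
    if t_h <= lag_h:
        return log_n0
    grown = log_n0 + (t_h - lag_h) / gen_h * math.log10(2)
    return min(grown, log_nmax)

# Lag times, generation times and maximum densities are from the abstract
# (35% moisture, 30 °C); the ~10^5 CFU/g (5-log) inoculum is from the study design.
for brand, lag, gen, nmax in [("brand A", 4.4, 1.4, 7.3), ("brand B", 2.2, 10.8, 6.9)]:
    print(brand, round(log_cfu(24, lag, gen, 5.0, nmax), 1), "log CFU/g after 24 h")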
Classification and Ranking of Selected Arkansas Lakes
Trophic-state related problems associated with waters in the United States have generated tremendous public interest and concern, particularly during the past decade. These interests and concerns led to Public Law 92-500, the mandate by Congress known as the Federal Water Pollution Control Act. Various sections of PL 92-500 directly address the need for trophic-state analyses, particularly Section 314, referred to as the Clean Lakes Program, which assigns states the responsibility for classifying their lakes according to water quality, identifying methods of pollution control, and restoring those lakes which have become degraded.
Status of Oregon's Bull Trout.
Limited historical references indicate that bull trout Salvelinus confluentus in Oregon were once widely distributed throughout at least 12 basins in the Klamath River and Columbia River systems. No bull trout have been observed in Oregon's coastal systems. A total of 69 bull trout populations in 12 basins are currently identified in Oregon. A comparison of the 1991 bull trout status (Ratliff and Howell 1992) to the revised 1996 status found that 7 populations were newly discovered and 1 population showed a positive or upgraded status, while 22 populations showed a negative or downgraded status. The general downgrading of 32% of Oregon's bull trout populations appears largely due to increased survey efforts and increased survey accuracy rather than reduced numbers or distribution. However, three populations in the upper Klamath Basin, two in the Walla Walla Basin, and one in the Willamette Basin showed decreases in estimated population abundance or distribution.