
    Pathways for nutrient loss to water with emphasis on phosphorus

    Teagasc wishes to acknowledge the support of the Environmental Research Technological Development and Innovation (ERTDI) Programme under the Productive Sector Operational Programme, which was financed by the Irish Government under the National Development Plan 2000-2006. End of project report. The main objective of this project was to study phosphorus (P) loss from agricultural land under a range of conditions in Ireland, to quantify the main factors influencing losses, and to make recommendations on ways to reduce these losses. This report is a synthesis of the main conclusions and recommendations from the results of the studies. The final reports from the individual sub-projects in this project are available from the EPA (www.epa.ie). Environmental Protection Agency

    Comparison of proteomic profiles of serum, plasma, and modified media supplements used for cell culture and expansion

    BACKGROUND: The culture and expansion of human cells for clinical use requires the presence of human serum or plasma in culture media. Although these supplements have been extensively characterized in their chemical composition, only recently has high-throughput protein analysis made it possible to provide a comprehensive profile of the soluble factors contributing to cell survival. This study analyzed and compared the presence of 100 proteins, including chemokines, cytokines and soluble factors, in six different types of media supplements: serum, plasma, recalcified plasma, heat-inactivated serum, heat-inactivated plasma and heat-inactivated recalcified plasma. METHODS: Serum, plasma, recalcified plasma, and heat-inactivated supplements were prepared from ten healthy subjects. The levels of 100 soluble factors were measured in each sample using a multiplexed ELISA assay and compared by Eisen hierarchical clustering analysis. RESULTS: A comparison of serum and plasma levels of soluble factors found that 2 were greater in plasma but 18 were greater in serum, including 11 chemokines. The levels of only four factors differed between recalcified plasma and plasma. Heat inactivation had the greatest effect on soluble factors. Supervised Eisen hierarchical clustering indicated that the differences between heat-inactivated supplements and those that were not were greater than the differences within these two groups. The levels of 36 factors differed between heat-inactivated plasma and plasma. Thirty-one of these factors had a lower concentration in heat-inactivated plasma, including 12 chemokines, 4 growth factors, 4 matrix metalloproteases, and 3 adhesion molecules. Heat-inactivated recalcified plasma is often used in place of heat-inactivated serum, and the levels of 19 soluble factors differed between these two supplements. CONCLUSION: Our report provides a comprehensive protein profile of serum, plasma, recalcified plasma, and heat-inactivated supplements. This profile represents a qualitative and quantitative database that can aid in the selection of the appropriate blood-derived supplement for human cell cultures with special requirements.
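The Eisen-style hierarchical clustering used to compare supplements can be illustrated with a minimal single-linkage sketch in pure Python. The factor values below are hypothetical placeholders (the study measured 100 soluble factors across six supplement types); the point is only that heat-inactivated and untreated supplements separate first, mirroring the clustering result described above.

```python
import math

# Hypothetical mean concentrations (pg/mL) of three soluble factors per
# supplement; the real study measured 100 factors in six supplement types.
profiles = {
    "serum":             [120.0, 45.0, 8.0],
    "plasma":            [110.0, 40.0, 7.5],
    "heat_inact_serum":  [30.0, 12.0, 2.0],
    "heat_inact_plasma": [28.0, 10.0, 1.8],
}

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(profiles):
    """Agglomerative single-linkage clustering; returns the merge order,
    which reflects how similar the supplement profiles are."""
    clusters = [[name] for name in profiles]
    merges = []
    while len(clusters) > 1:
        # Find the pair of clusters with the smallest minimum member distance.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclidean(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        merged = clusters[i] + clusters[j]
        merges.append((sorted(merged), round(d, 2)))
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return merges

for members, dist in single_linkage(profiles):
    print(members, dist)
```

With these toy values the two heat-inactivated supplements merge first at a small distance, then serum and plasma, and the two groups join only at a much larger distance, the same qualitative pattern the abstract reports.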

    Patient Surgical Outcomes When Surgery Residents Are the Primary Surgeon by Intensity of Surgical Attending Supervision in Veterans Affairs Medical Centers

    OBJECTIVE: Using health records from the Department of Veterans Affairs (VA), the largest healthcare training platform in the United States, we estimated independent associations between the intensity of attending supervision of surgical residents and 30-day postoperation patient outcomes. BACKGROUND: Academic leaders do not agree on the level of autonomy from supervision to grant surgery residents that best prepares them to enter independent practice without risking patient outcomes. METHODS: Secondary data came from a national, systematic 1:8 sample of n = 862,425 teaching encounters in which residents were listed as primary surgeon at 122 VA medical centers from July 1, 2004, through September 30, 2019. Independent associations of whether attendings had scrubbed or not scrubbed with patient 30-day all-cause mortality, complications, and 30-day readmission were estimated using generalized linear mixed models. Estimates were tested for residual confounding biases, robustness to different regression models, and stability over time, and validated using moderator and secondary factor analyses. RESULTS: After accounting for potential confounding factors, residents supervised by scrubbed attendings in 733,997 nonemergency surgery encounters had fewer deaths within 30 days of the operation by 14.2% [0.3%, 29.9%], fewer case complications by 7.9% [2.0%, 14.0%], and fewer readmissions by 17.5% [11.2%, 24.2%] than residents supervised by attendings who had not scrubbed. Over the 15 study years, scrubbed surgery attendings may have averted an estimated 13,700 deaths, 43,600 cases with complications, and 73,800 readmissions. CONCLUSIONS: VA policies on attending surgeon supervision have protected patient safety while allowing residents in selected teaching encounters limited autonomy from supervision.
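The relative reductions quoted above come from generalized linear mixed models that adjust for confounders and clustering. As a simplified sketch of the underlying quantity only, the snippet below computes a crude relative risk reduction with a Wald confidence interval from raw counts; the counts are hypothetical, not the study's data.

```python
import math

def relative_risk_reduction(events_a, n_a, events_b, n_b, z=1.96):
    """Crude relative risk reduction of group A vs group B, with a Wald
    95% CI built on the log relative risk.

    This ignores confounder adjustment and clustering, so it is only a
    sketch of the quantity the study estimated with mixed models."""
    ra, rb = events_a / n_a, events_b / n_b
    log_rr = math.log(ra / rb)
    # Standard error of log RR for two independent proportions.
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    rr_lo = math.exp(log_rr - z * se)
    rr_hi = math.exp(log_rr + z * se)
    # RRR = 1 - RR; the CI bounds swap when mapped through 1 - x.
    return 1 - ra / rb, (1 - rr_hi, 1 - rr_lo)

# Hypothetical counts: 600 deaths in 500,000 scrubbed-attending encounters
# vs 175 deaths in 125,000 not-scrubbed encounters.
rrr, (ci_lo, ci_hi) = relative_risk_reduction(600, 500_000, 175, 125_000)
print(f"RRR = {rrr:.1%}, 95% CI [{ci_lo:.1%}, {ci_hi:.1%}]")
```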

    Current communication practices between obstetrics and gynecology residency applicants and program directors

    Importance: In order to equitably improve the residency application process, it is essential to understand the problems that need to be addressed. Objective: To determine how obstetrics and gynecology (OBGYN) applicants and faculty communicate applicants' interest to residency programs, and how program directors report being influenced by these communications. Design, Setting, and Participants: This survey study was conducted via email surveys of OBGYN application stakeholders in 2022. Included participants were OBGYN applicants, clerkship directors, and residency program directors on medical education associations' email listservs. Exposures: Surveys sent by the American Association of Medical Colleges, Association of Professors of Gynecology and Obstetrics, and Council on Resident Education in Obstetrics and Gynecology. Main Outcomes and Measures: Whether applicants themselves, or faculty on their behalf, communicated with residency programs, and the influence program directors reported placing on these communications in their decision-making. Descriptive statistics and χ2 tests were used to analyze differences. Results: A total of 726 of 2781 applicants (26.1%) responded to the survey and were included in the analysis, along with 79 of 249 clerkship directors (31.7%) and 200 of 280 program directors (71.4%). The self-reported racial and ethnic demographics of the 726 applicant respondents were 86 Asian (11.8%), 54 Black (7.4%), 41 Latinx (5.6%), 1 Native Hawaiian or Pacific Islander (0.1%), 369 White (52.2%), 45 with multiple racial identities (6.2%), and 91 (21.5%) preferring not to answer. The majority of applicants (590 [82.9%]) sent communications at some point in the application process. Applicants who identified as White (336 [88.7%]) or Asian (75 [87.2%]) were more likely than those who identified as Black (40 [74.1%]) or Latinx (33 [80.5%]) to reach out to programs (P = .02). There were also differences by type of medical school, with 377 of 427 MD applicants (88.3%), 109 of 125 DO applicants (87.2%), and 67 of 87 International Medical Graduate applicants (77.7%) reporting sending communications (P = .02). Approximately one-third (254 applicants [35.7%]) had faculty reach out to programs on their behalf. White (152 [40.1%]) and Asian (37 [43.0%]) applicants were more likely to have faculty reach out compared with Black (6 [11.1%]) and Latinx (12 [29.3%]) applicants (P = .01). Program directors reported that preinterview communications from faculty they knew (64 [32.2%]) and from other program directors (25 [12.6%]) strongly influenced their decisions; they otherwise rarely reported that communications strongly influenced their decisions. Conclusions and Relevance: The current state of communications may increase inequities in residency application processes; differences in faculty communications for applicants from different racial and ethnic backgrounds are particularly concerning given that program directors are more likely to weigh communications from faculty in their decision-making. A centralized, equitable means for applicants to signal their interest to programs is urgently needed.
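The group differences above were assessed with χ2 tests. A minimal pure-Python version for a 2×2 table is shown below; the denominators are back-calculated from the abstract's percentages (336 of ~379 White applicants, 40 of ~54 Black applicants reaching out), so treat them as approximate illustration rather than the study's exact computation.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]] (df = 1).

    Returns (statistic, p-value). For 1 degree of freedom the chi-square
    survival function is erfc(sqrt(x / 2))."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        exp = r * col / n
        stat += (obs - exp) ** 2 / exp
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Reached out vs did not, by racial identity (approximate denominators
# derived from the reported percentages).
stat, p = chi2_2x2(336, 379 - 336, 40, 54 - 40)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```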

    Environmental behaviour of iron and steel slags in coastal settings

    Iron and steel slags have a long history of both disposal and beneficial use in the coastal zone. Despite the large volumes of slag deposited, comprehensive assessments of the potential risks associated with metal(loid) leaching from iron and steel by-products are rare for coastal systems. This study provides a national-scale overview of the 14 known slag deposits in the coastal environment of Great Britain (those within 100 m of the mean high-water mark), comprising geochemical characterisation and leaching test data (using both low and high ionic strength waters) to assess potential leaching risks. The seaward-facing length of slag deposits totals at least 76 km; the deposits are predominantly composed of blast furnace (ironmaking) slags from the early to mid-20th century. Some form tidal barriers and formal coastal defence structures, but the larger deposits are associated with historical coastal disposal in many former areas of iron and steel production, notably the Cumbrian coast of England. Slag deposits are dominated by melilite phases (e.g. gehlenite), with evidence of secondary mineral formation (e.g. gypsum, calcite) indicative of weathering. Leaching tests typically show lower element (e.g. Ba, V, Cr, Fe) release under seawater leaching scenarios compared to deionised water, largely ascribable to the pH buffering provided by the former. Only Mn and Mo showed elevated leaching concentrations in seawater treatments, though at modest levels (<3 mg/L and 0.01 mg/L, respectively). No significant leaching of potentially ecotoxic elements such as Cr and V (mean leachate concentrations <0.006 mg/L for both) was apparent in seawater; micro-X-ray Absorption Near Edge Structure (μXANES) analysis shows that both are present in slags in low-valence (and low-toxicity) forms. Although extensive erosion of deposits on high-energy coastlines may pose physical hazards, the data suggest that seawater leaching of coastal iron and steel slags in the UK is likely to pose minimal environmental risk.

    Enteric bacterial pathogen detection in southern sea otters (Enhydra lutris nereis) is associated with coastal urbanization and freshwater runoff

    Although protected for nearly a century, California’s sea otters have been slow to recover, in part due to exposure to fecally-associated protozoal pathogens like Toxoplasma gondii and Sarcocystis neurona. However, potential impacts from exposure to fecal bacteria have not been systematically explored. Using selective media, we examined feces from live and dead sea otters from California for specific enteric bacterial pathogens (Campylobacter, Salmonella, Clostridium perfringens, C. difficile and Escherichia coli O157:H7), and for pathogens endemic to the marine environment (Vibrio cholerae, V. parahaemolyticus and Plesiomonas shigelloides). We evaluated statistical associations between detection of these pathogens in otter feces and demographic or environmental risk factors for otter exposure, and found that dead otters were more likely to test positive for C. perfringens, Campylobacter and V. parahaemolyticus than were live otters. Otters from more urbanized coastlines and areas with high freshwater runoff (near outflows of rivers or streams) were more likely to test positive for one or more of these bacterial pathogens. Other risk factors for bacterial detection in otters included male sex and sample collection during the rainy season, when surface runoff is maximal. Similar risk factors were reported in prior studies of pathogen exposure for California otters and their invertebrate prey, suggesting that land-sea transfer and/or facilitation of pathogen survival in degraded coastal marine habitat may be impacting sea otter recovery. Because otters and humans share many of the same foods, our findings may also have implications for human health.
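Associations like "dead otters were more likely to test positive" are commonly summarized as odds ratios. The abstract does not report raw counts or name the exact statistic used, so the sketch below uses entirely hypothetical numbers to show the standard calculation with a Wald confidence interval on the log scale.

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio for an exposed group (a positive, b negative) vs an
    unexposed group (c positive, d negative), with a Wald 95% CI
    computed on the log-odds-ratio scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30 of 80 dead otters vs 10 of 90 live otters
# testing positive for C. perfringens.
or_, lo, hi = odds_ratio(30, 50, 10, 80)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```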

    Urinary biomarker concentrations of captan, chlormequat, chlorpyrifos and cypermethrin in UK adults and children living near agricultural land

    There is limited information on the pesticide exposure experienced by UK residents living near agricultural land. This study aimed to investigate their exposure in relation to spray events. Farmers treating crops with captan, chlormequat, chlorpyrifos or cypermethrin provided spray event information. Adults and children residing ≤100 m from sprayed fields provided first-morning void urine samples during and outwith the spray season. Selected samples, collected 1–2 days after a spray event or at other times (background samples), were analysed and creatinine-adjusted. Generalised Linear Mixed Models were used to investigate whether urinary biomarkers of these pesticides were elevated after spray events. The final data set for statistical analysis contained 1518 urine samples from 140 participants, consisting of 523 spray event and 995 background samples analysed for pesticide urinary biomarkers. For captan and cypermethrin, the proportion of values below the limit of detection was greater than 80%, with no difference between spray event and background samples. For chlormequat and chlorpyrifos, the geometric mean urinary biomarker concentrations following spray events were 15.4 μg/g creatinine and 2.5 μg/g creatinine, respectively, compared with 16.5 μg/g creatinine and 3.0 μg/g creatinine for background samples within the spraying season. Outwith the spraying season, chlorpyrifos concentrations were the same as within-season backgrounds, but chlormequat concentrations were lower (12.3 μg/g creatinine). Overall, we observed no evidence of additional urinary pesticide biomarker excretion as a result of spray events, suggesting that sources other than local spraying are responsible for the relatively low urinary pesticide biomarker concentrations detected in the study population.
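The creatinine adjustment and geometric means reported above (μg analyte per g creatinine) can be sketched as follows. The sample values are hypothetical placeholders; the study analysed 1518 real urine samples.

```python
import math

def creatinine_adjust(analyte_ug_per_L, creatinine_g_per_L):
    """Express a urinary biomarker as ug analyte per g creatinine,
    which corrects for urine dilution between spot samples."""
    return analyte_ug_per_L / creatinine_g_per_L

def geometric_mean(values):
    """Geometric mean; conventional for right-skewed biomarker data."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical chlormequat results: (concentration ug/L, creatinine g/L).
samples = [(18.0, 1.2), (9.5, 0.8), (22.0, 1.5), (14.0, 1.0)]
adjusted = [creatinine_adjust(c, cr) for c, cr in samples]
print([round(a, 1) for a in adjusted])     # ug/g creatinine per sample
print(round(geometric_mean(adjusted), 1))  # summary for the group
```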

    Advancing DNA barcoding and metabarcoding applications for plants requires systematic analysis of herbarium collections-an Australian perspective

    Building DNA barcode databases for plants has historically been ad hoc and often narrow in taxonomic focus. To realize the full potential of DNA barcoding for plants, and particularly its application to metabarcoding of mixed-species environmental samples, systematic sequencing of reference collections is required using an augmented set of DNA barcode loci, applied according to agreed data generation and analysis standards. The largest and most complete reference collections of plants are held in herbaria. Australia has a globally significant flora that is well sampled and expertly curated by its herbaria, coordinated through the Council of Heads of Australasian Herbaria. There is a tremendous opportunity to provide a comprehensive and taxonomically robust reference database for plant DNA barcoding applications by undertaking coordinated and systematic sequencing of the entire flora of Australia using existing herbarium material. In this paper, we review the development of DNA barcoding and metabarcoding and consider the requirements for a robust and comprehensive system. We analyze the current availability of DNA barcode reference data for Australian plants, recommend priority taxa for database inclusion, and highlight future applications of a comprehensive metabarcoding system. We urge that large-scale and coordinated analysis of herbarium collections be undertaken to realize the promise of DNA barcoding and metabarcoding, and propose that the generation and curation of reference data should become a national investment priority.