
    Protein and amino acid intakes in relation to prostate cancer risk and mortality—A prospective study in the European Prospective Investigation into Cancer and Nutrition

    This work was supported by Cancer Research UK (C8221/A29017). The coordination of the European Prospective Investigation into Cancer and Nutrition (EPIC) is financially supported by the International Agency for Research on Cancer and has been supported by the European Commission (DG-SANCO). The national cohorts are supported by the Danish Cancer Society (Denmark); German Cancer Aid, German Cancer Research Center (DKFZ), Federal Ministry of Education and Research (BMBF) (Germany); Associazione Italiana per la Ricerca sul Cancro (AIRC) and National Research Council (Italy); Dutch Ministry of Public Health, Welfare and Sports (VWS), Netherlands Cancer Registry (NKR), LK Research Funds, Dutch Prevention Funds, Dutch ZON (Zorg Onderzoek Nederland), World Cancer Research Fund (WCRF), Statistics Netherlands (The Netherlands); Health Research Fund (FIS) - Instituto de Salud Carlos III (ISCIII), Regional Governments of Andalucía, Asturias, Basque Country, Murcia and Navarra, and the Catalan Institute of Oncology (ICO) (Spain); Swedish Cancer Society, Swedish Scientific Council, and Regional Governments of Skåne and Västerbotten (Sweden); Cancer Research UK (14136 to EPIC-Norfolk; C570/A16491 to EPIC-Oxford) and Medical Research Council (1000143 to EPIC-Norfolk, MR/M012190/1 to EPIC-Oxford) (UK).

    Background: The association between protein intake and prostate cancer risk remains unclear. Aims: To prospectively investigate the associations of dietary intakes of total protein, protein from different dietary sources, and amino acids with prostate cancer risk and mortality. Methods: In 131,425 men from the European Prospective Investigation into Cancer and Nutrition, protein and amino acid intakes were estimated using validated dietary questionnaires. Multivariable-adjusted Cox regression models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs). Results: During a mean follow-up of 14.2 years, 6939 men were diagnosed with prostate cancer and 914 died of the disease. Dairy protein intake was positively associated with overall prostate cancer risk in the three highest fifths compared with the lowest (HR for Q3 = 1.14 (95% CI 1.05-1.23); Q4 = 1.09 (1.01-1.18); Q5 = 1.10 (1.02-1.19)); similar results were observed for yogurt protein (Q3 = 1.14 (1.05-1.24); Q4 = 1.09 (1.01-1.18); Q5 = 1.12 (1.04-1.21)). For egg protein intake and prostate cancer mortality, no association was observed by fifths, but there was suggestive evidence of a positive association in the analysis per standard deviation increment. There was no strong evidence of associations with different tumour subtypes. Discussion: Given the weak associations and the many tests performed, the results must be interpreted with caution. Conclusion: This study does not provide strong evidence for an association of intakes of total protein, protein from different dietary sources, or amino acids with prostate cancer risk or mortality. However, our results may suggest some weak positive associations, which need to be confirmed in large-scale, pooled analyses of prospective data.
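
    As a rough illustration of the Methods, the sketch below shows how hazard ratios by fifths of intake might be estimated with a multivariable-adjusted Cox model in Python, using the lifelines library. The file name, column names, and covariates are hypothetical placeholders, not the actual EPIC variables or adjustment set.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical cohort table: one row per participant, with follow-up
        # time in years, an event indicator (1 = prostate cancer diagnosis),
        # the exposure fifth, and adjustment covariates.
        df = pd.read_csv("cohort.csv")

        # Dummy-code the exposure fifths so each hazard ratio is estimated
        # relative to the lowest fifth (Q1), as in the Results.
        fifths = pd.get_dummies(df["dairy_protein_fifth"], prefix="Q", drop_first=True)
        model_df = pd.concat([df[["followup_years", "event", "age", "bmi"]], fifths], axis=1)

        cph = CoxPHFitter()
        cph.fit(model_df, duration_col="followup_years", event_col="event")

        # exp(coef) is the hazard ratio; the adjacent columns give the 95% CI.
        print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])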

    The role of the school counselor and Internet predators

    Children and adolescents are vulnerable in person and have now become vulnerable through technology. The Internet is growing, and so are the opportunities for predators to contact children. Using the Internet and online chat rooms, sexual predators groom their victims and move the relationship forward, involving children in sexual photographs, videos, and telephone conversations. Eventually, a meeting is arranged between the child and the predator. Information on keeping children safe needs to be widely available, and the issue needs to be addressed at all governmental and educational levels. School counselors can take charge and provide students with individual, group, and classroom counseling on Internet safety issues.

    Dietary supplementation with pollen enhances survival and Collembola boosts fitness of a web-building spider

    Uncertainties exist about the value of non-prey food for predators that are commonly food-limited, and about the dietary conditions under which non-prey foods benefit carnivorous species. Prior studies show that large quantities of pollen grains are intercepted in the webs of web-building spiders. We examined the nutritional benefits of pollen as a non-prey food for a common ground-dwelling, sheet web-building spider, Mermessus fradeorum (Berland) (Araneae: Linyphiidae). These predators were provided diets of prey or no prey in the presence and absence of pollen. Treatment effects were quantified by measuring predator body nutrient composition, survival, body size, and offspring production. Per unit dry weight, pollen contained less nitrogen and lipid than prey, although the relative quantities of these nutrients per meal were not measured. Dietary treatments altered the body tissue composition of the spiders, with the highest nitrogen content and lipid reserves found in spiders provided with Collembola. Supplementing diets with pollen increased both juvenile and adult survival, and the greatest survivorship and offspring production were observed when spiders were provided diets of Collembola supplemented with pollen. Our results show that Collembola are high-quality prey for spiders and that pollen has positive effects on the nutritional status and survival of a carnivorous species. Foraging on plant material potentially promotes population growth at early and late developmental stages by supplementing diets of poor-quality prey and preventing starvation when prey are scarce.

    A Multi-Factorial Risk Prioritization Framework for Food-Borne Pathogens

    To lower the incidence of human food-borne disease, experts and stakeholders have urged the development of a science- and risk-based management system in which food-borne hazards are analyzed and prioritized. A literature review shows that most approaches to risk prioritization developed to date are based on measures of health outcomes and do not systematically account for other factors that may be important to decision making. The Multi-Factorial Risk Prioritization Framework developed here considers four factors that may be important to risk managers: public health, consumer risk perceptions and acceptance, market-level impacts, and social sensitivity.

    The framework is based on the systematic organization and analysis of data on these multiple factors. The basic building block of the information structure is a three-dimensional cube based on pathogen-food-factor relationships. Each cell of the cube has an information card associated with it, and data from the cube can be aggregated along different dimensions. The framework is operationalized in three stages, with each stage adding another dimension to decision-making capacity. The first stage is the information cards themselves, which provide systematic information that is not pre-processed or aggregated across factors. The second stage maps the information on the various information cards into cobweb diagrams that create a graphical profile of, for example, a food-pathogen combination with respect to each of the four risk prioritization factors. The third stage is formal multi-criteria decision analysis, in which decision makers place explicit values on different criteria in order to develop risk priorities.

    The process outlined above produces a 'List A' of priority food-pathogen combinations according to some aggregate of the four risk prioritization factors. This list is further vetted to produce 'List B', which brings in feasibility analysis by ranking those combinations where practical actions that have a significant impact are feasible. Food-pathogen combinations where not enough is known to identify any or few feasible interventions are included in 'List C'. 'List C' highlights areas with significant uncertainty where further research may be needed to enhance the precision of the risk prioritization process. The separation of feasibility and uncertainty issues through the use of 'Lists A, B, and C' allows risk managers to focus separately on distinct dimensions of the overall prioritization.

    The Multi-Factorial Risk Prioritization Framework provides a flexible instrument that compares and contrasts risks along four dimensions. Use of the framework is an iterative process. It can be used to establish priorities across pathogens for a particular food, across foods for a particular pathogen, and/or across specific food-pathogen combinations. This report provides a comprehensive conceptual paper that forms the basis for a wider process of consultation and for case studies applying the framework.

    Keywords: risk analysis, risk prioritization, food-borne pathogens, benefits and costs
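
    As a rough sketch of the data structure described above, the Python fragment below models the pathogen-food-factor cube as a mapping from (pathogen, food, factor) triples to simplified information cards, with aggregation along one dimension. All pathogen, food, and score values are hypothetical illustrations, not data from the report.

        from collections import defaultdict

        FACTORS = ["public_health", "risk_perception", "market_impact", "social_sensitivity"]

        # cube[(pathogen, food, factor)] -> information card, simplified here
        # to a single 0-10 score; real cards would hold richer data.
        cube = {
            ("Salmonella", "poultry", "public_health"): 8,
            ("Salmonella", "poultry", "market_impact"): 6,
            ("Listeria", "soft_cheese", "public_health"): 7,
            ("Listeria", "soft_cheese", "social_sensitivity"): 5,
        }

        def profile(pathogen, food):
            """Stage-two style profile of one food-pathogen combination
            across all four factors (the data behind a cobweb diagram)."""
            return {f: cube.get((pathogen, food, f)) for f in FACTORS}

        def aggregate_by_pathogen(factor):
            """Collapse the food dimension: total score per pathogen for one factor."""
            totals = defaultdict(int)
            for (pathogen, food, f), score in cube.items():
                if f == factor:
                    totals[pathogen] += score
            return dict(totals)

        print(profile("Salmonella", "poultry"))
        print(aggregate_by_pathogen("public_health"))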

    A Comparison of the Use of the Antisocial and Borderline Personality Disorder Scales in the MCMI-III and Personality Assessment Inventory with a Criminal Justice Population

    The present study compared outcome measurements on the Antisocial and Borderline scales of the Personality Assessment Inventory (PAI) with those on the Millon Clinical Multiaxial Inventory (MCMI-III) when both were used with a criminal justice population. Significant positive correlations were found between the Antisocial scales of the PAI and MCMI-III, as well as between the Borderline scales of both assessments, indicating that in an evaluation process it would be sufficient to use only one of the instruments. It is suggested that the MCMI-III is the better option for saving costs and time while preserving the clinical accuracy of the testing protocol, allowing appropriate treatment recommendations for a criminal justice population.
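
    As a minimal illustration of the correlational comparison described above, the sketch below computes a Pearson correlation between matching scales with SciPy; the score arrays are hypothetical, not study data.

        from scipy.stats import pearsonr

        # Hypothetical per-offender scores on the two Antisocial scales.
        pai_antisocial = [62, 71, 55, 80, 68]
        mcmi_antisocial = [58, 75, 50, 82, 66]

        r, p = pearsonr(pai_antisocial, mcmi_antisocial)
        print(f"Antisocial scales: r = {r:.2f}, p = {p:.3f}")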

    Spectroscopic confirmation of an ultra-faint galaxy at the epoch of reionization

    Within one billion years of the Big Bang, intergalactic hydrogen was ionized by sources emitting ultraviolet and higher energy photons. This was the final phenomenon to globally affect all the baryons (visible matter) in the Universe. It is referred to as cosmic reionization and is an integral component of cosmology. It is broadly expected that intrinsically faint galaxies were the primary ionizing sources due to their abundance in this epoch. However, at the highest redshifts (z > 7.5; lookback time 13.1 Gyr), all galaxies with spectroscopic confirmations to date are intrinsically bright and, therefore, not necessarily representative of the general population. Here, we report the unequivocal spectroscopic detection of a low-luminosity galaxy at z > 7.5. We detected the Lyman-α emission line at ~10504 Å in two separate observations with MOSFIRE on the Keck I Telescope and independently with the Hubble Space Telescope's slitless grism spectrograph, implying a source redshift of z = 7.640 ± 0.001. The galaxy is gravitationally magnified by the massive galaxy cluster MACS J1423.8+2404 (z = 0.545), with an estimated intrinsic luminosity of M_AB = -19.6 ± 0.2 mag and a stellar mass of M⋆ = 3.0(+1.5/−0.8) × 10^8 solar masses. Both are an order of magnitude lower than for the four other Lyman-α emitters currently known at z > 7.5, making it probably the most distant representative source of reionization found to date.
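
    As a consistency check on the quoted values, the redshift follows directly from the observed wavelength of the line and the Lyman-α rest-frame wavelength of 1215.67 Å:

        z = λ_obs / λ_Lyα − 1 = 10504 Å / 1215.67 Å − 1 ≈ 7.640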

    The Putative Cerean Exosphere

    The ice-rich crust of dwarf planet 1 Ceres is the source of a tenuous water exosphere, and the behavior of this putative exosphere is investigated with model calculations. Outgassing water molecules seasonally condense around the winter pole in an optically thin layer.

    Cloud System Evolution in the Trades (CSET): Following the Evolution of Boundary Layer Cloud Systems with the NSF/NCAR GV

    The Cloud System Evolution in the Trades (CSET) study was designed to describe and explain the evolution of the boundary layer aerosol, cloud, and thermodynamic structures along trajectories within the North Pacific trade winds. The study centered on seven round trips of the National Science Foundation/National Center for Atmospheric Research (NSF/NCAR) Gulfstream V (GV) between Sacramento, California, and Kona, Hawaii, between 7 July and 9 August 2015. The CSET observing strategy was to sample aerosol, cloud, and boundary layer properties upwind from the transition zone over the North Pacific and to resample these areas two days later. Global Forecast System forecast trajectories were used to plan the outbound flight to Hawaii, with updated forecast trajectories setting the return flight plan two days later. Two key elements of the CSET observing system were the newly developed High-Performance Instrumented Airborne Platform for Environmental Research (HIAPER) Cloud Radar (HCR) and the high-spectral-resolution lidar (HSRL). Together they provided unprecedented characterizations of aerosol, cloud, and precipitation structures that were combined with in situ measurements of aerosol, cloud, precipitation, and turbulence properties. The cloud systems sampled included solid stratocumulus infused with smoke from Canadian wildfires, mesoscale cloud-precipitation complexes, and patches of shallow cumuli in very clean environments. Ultraclean layers observed frequently near the top of the boundary layer were often associated with shallow, optically thin, layered veil clouds. The extensive aerosol, cloud, drizzle, and boundary layer sampling made over open areas of the northeast Pacific along 2-day trajectories during CSET will be an invaluable resource for modeling studies of boundary layer cloud system evolution and its governing physical processes.

    Supplemental Ascorbate Diminishes DNA Damage Yet Depletes Glutathione and Increases Acute Liver Failure in a Mouse Model of Hepatic Antioxidant System Disruption

    Cellular oxidants are primarily managed by the thioredoxin reductase-1 (TrxR1)- and glutathione reductase (Gsr)-driven antioxidant systems. In mice having hepatocyte-specific codisruption of TrxR1 and Gsr (TrxR1/Gsr-null livers), methionine catabolism sustains hepatic levels of reduced glutathione (GSH). Although most mice with TrxR1/Gsr-null livers exhibit long-term survival, ~25% die from spontaneous liver failure between 4 and 7 weeks of age. Here we tested whether liver failure was ameliorated by ascorbate supplementation. Following ascorbate, dehydroascorbate, or mock treatment, we assessed survival, liver histology, and hepatic redox markers, including GSH and GSSG levels, redox enzyme activities, and oxidative damage markers. Unexpectedly, rather than providing protection, ascorbate (5 mg/mL, drinking water) increased the death rate to 43%. In adults, ascorbate (4 mg/g × 3 days i.p.) caused hepatocyte necrosis and loss of hepatic GSH in TrxR1/Gsr-null livers but not in wildtype controls. Dehydroascorbate (0.3 mg/g i.p.) also depleted hepatic GSH in TrxR1/Gsr-null livers, whereas GSH levels were not significantly affected by either treatment in wildtype livers. Curiously, however, despite depleting GSH, ascorbate treatment diminished basal DNA damage and oxidative stress markers in TrxR1/Gsr-null livers. This suggests that, although ascorbate supplementation can prevent oxidative damage, it can also deplete GSH and compromise already stressed livers.