
    Selection of Bread Wheat for Low Grain Cadmium Concentration at the Seedling Stage Using Hydroponics versus Molecular Markers

    The excessive accumulation of Cd in crops grown on high-Cd soils has raised public concern about food safety. Because of the high per-capita consumption of bread wheat (Triticum aestivum L.), high Cd concentrations in wheat grain can significantly affect human health. Breeding is a promising way to reduce grain Cd concentration, but a lack of efficient selection methods impedes breeding for low grain Cd concentration in bread wheat. In this study, a recombinant inbred population segregating for grain Cd concentration was used to assess the efficacy of two selection methods for decreasing grain Cd concentration: hydroponic selection, based on shoot Cd concentration in 2-wk-old seedlings grown in Cd-containing medium, and marker-based selection, using markers linked to heavy metal transporting P1B-ATPase 3 (HMA3), the gene underlying Cdu1. Both methods effectively selected low-Cd lines, but HMA3-linked marker-based selection was superior to hydroponic selection in both simplicity and response to selection. The HMA3-linked markers explained 20% of the phenotypic variation in grain Cd concentration, with an additive effect of 0.014 mg kg−1. Hydroponic selection and marker-based selection may target two different and independent processes controlling grain Cd accumulation, and neither affected grain Zn or Fe concentrations. The ALMT1-UPS4 marker associated with Al tolerance was not associated with grain Cd concentration but was associated with increased grain Zn and Fe concentrations. The 193-bp allele of the Rht8-associated marker, GWM261, was associated with increased grain Cd concentration.
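The abstract reports a marker explaining 20% of phenotypic variance with an additive effect of 0.014 mg kg−1. A minimal sketch of how these two quantities are typically computed for a biallelic marker in an inbred population; all genotype labels and Cd values below are invented for illustration, not data from the study:

```python
# Hypothetical sketch: share of phenotypic variance explained by a biallelic
# marker (R^2 of a genotype-class model) and the additive effect (half the
# difference between homozygote class means). Values are invented.

def variance_explained(genotypes, phenotypes):
    """R^2 = 1 - SS_within / SS_total for phenotypes grouped by marker class."""
    mean_all = sum(phenotypes) / len(phenotypes)
    ss_total = sum((y - mean_all) ** 2 for y in phenotypes)
    ss_within = 0.0
    for g in set(genotypes):
        group = [y for gt, y in zip(genotypes, phenotypes) if gt == g]
        m = sum(group) / len(group)
        ss_within += sum((y - m) ** 2 for y in group)
    return 1 - ss_within / ss_total

# 'AA' carries the low-Cd allele, 'aa' the high-Cd allele (labels hypothetical)
genotypes  = ['AA', 'AA', 'AA', 'aa', 'aa', 'aa']
phenotypes = [0.040, 0.045, 0.050, 0.065, 0.070, 0.080]  # grain Cd, mg kg^-1

r2 = variance_explained(genotypes, phenotypes)
additive = (sum(phenotypes[3:]) / 3 - sum(phenotypes[:3]) / 3) / 2
```

With real data the lines would differ only in the input vectors; the additive effect here is half the difference between the two homozygote class means, matching the convention behind the reported 0.014 mg kg−1.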

    Use of the Michigan Neuropathy Screening Instrument as a measure of distal symmetrical peripheral neuropathy in Type 1 diabetes: results from the Diabetes Control and Complications Trial/Epidemiology of Diabetes Interventions and Complications

    Aims: The Michigan Neuropathy Screening Instrument (MNSI) is used to assess distal symmetrical peripheral neuropathy in diabetes. It includes two separate assessments: a 15-item self-administered questionnaire and a lower extremity examination that includes inspection and assessment of vibratory sensation and ankle reflexes. The purpose of this study was to evaluate the performance of the MNSI in detecting distal symmetrical peripheral neuropathy in patients with Type 1 diabetes and to develop new scoring algorithms. Methods: The MNSI was performed by trained personnel at each of the 28 Diabetes Control and Complications Trial/Epidemiology of Diabetes Interventions and Complications clinical sites. Neurologic examinations and nerve conduction studies were performed during the same year. Confirmed clinical neuropathy was defined by symptoms and signs of distal symmetrical peripheral neuropathy based on the examination of a neurologist and abnormal nerve conduction findings in ≥2 anatomically distinct nerves among the sural, peroneal and median nerves. Results: We studied 1184 subjects with Type 1 diabetes. Mean age was 47 years and mean duration of diabetes was 26 years. Thirty per cent of participants had confirmed clinical neuropathy; 18% had ≥4 and 5% had ≥7 abnormal responses on the MNSI questionnaire, and 33% had abnormal scores (≥2.5) on the MNSI examination. New scoring algorithms were developed and cut points defined to improve the performance of the MNSI questionnaire, the examination, and the combination of the two. Conclusions: Lowering the cut point that defines an abnormal test from ≥7 to ≥4 abnormal items improves the performance of the MNSI questionnaire. The MNSI is a simple, non-invasive and valid measure of distal symmetrical peripheral neuropathy in Type 1 diabetes.
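Re-scoring a screening instrument comes down to comparing sensitivity and specificity of candidate cut points against the reference diagnosis. A minimal sketch of that comparison; the scores and neuropathy labels are invented, not DCCT/EDIC data:

```python
# Hypothetical sketch: evaluating questionnaire cut points (score >= cutoff)
# against a reference standard. All scores and labels below are invented.

def sens_spec(scores, has_condition, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff'."""
    tp = sum(1 for s, d in zip(scores, has_condition) if d and s >= cutoff)
    fn = sum(1 for s, d in zip(scores, has_condition) if d and s < cutoff)
    tn = sum(1 for s, d in zip(scores, has_condition) if not d and s < cutoff)
    fp = sum(1 for s, d in zip(scores, has_condition) if not d and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

scores = [1, 2, 4, 5, 7, 8, 3, 6, 2, 9]                # questionnaire scores (invented)
truth  = [False, False, False, True, True, True,       # confirmed clinical neuropathy
          False, True, True, True]

se4, sp4 = sens_spec(scores, truth, 4)   # candidate cut point >= 4
se7, sp7 = sens_spec(scores, truth, 7)   # traditional cut point >= 7
# Youden index J = sensitivity + specificity - 1 summarises each cut point
```

In this toy data the ≥4 rule has the higher Youden index, which is the kind of evidence behind the abstract's recommendation to lower the cut point; the actual study used ROC-style performance on 1184 subjects.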

    Diet of children under the government-funded meal support program in Korea

    The purpose of this study was to investigate the diet of children under the government-funded meal support program. A total of 143 children (67 boys and 76 girls) in grades 4-6 of elementary school, receiving free lunches during the summer vacation of 2007 and living in Gwanak-gu, Seoul, Korea, participated in this study. The subjects consisted of four groups supported by Meal Box Delivery (n = 26), Institutional Foodservice (n = 53), Restaurant Foodservice (n = 27), or Food Delivery (n = 37). A three-day 24-hour dietary recall and a self-administered survey were conducted, and the children's heights and weights were measured. The average energy intake of the children was 1,400 kcal per day, much lower than the Estimated Energy Requirements for the pertinent age groups. The results also showed inadequate intake of all examined nutrients; of particular concern was the extremely low intake of calcium. On average, the children consumed eight dishes and 25 food items per day. The children supported by Meal Box Delivery consumed a greater variety of dishes and food items than the other groups. The percentage of children preferring their current meal support method was highest among those supported by Meal Box Delivery and lowest among those supported by Food Delivery. We asked 15 of the 143 children to draw a scene of their lunch time. The drawings of the children supported by Institutional Foodservice showed more positive scenes than those of the other groups, especially in terms of human aspects. In conclusion, the overall diet of children under the government-funded meal support program was nutritionally inadequate, although the magnitude of the problems tended to differ by meal support method. These results could serve as basic data for policy and programs regarding the government-funded meal support program for children from low-income families.

    Rural to Urban Migration and Changes in Cardiovascular risk Factors in Tanzania: A Prospective Cohort Study.

    High levels of rural to urban migration are a feature of most African countries. Our aim was to investigate changes, and their determinants, in cardiovascular risk factors on rural to urban migration in Tanzania. Men and women (15 to 59 years) intending to migrate from the Morogoro rural region to Dar es Salaam for at least 6 months were identified. Measurements were made at least one week but no more than one month before migration, and at 1- to 3-month intervals after migration. Outcome measures included body mass index, blood pressure, fasting lipids, and self-reported physical activity and diet. One hundred and three men and 106 women (mean age 29 years) were recruited, and 132 (63.2%) were followed to 12 months. All figures presented here refer to the difference between baseline and 12 months in these 132 individuals. Vigorous physical activity declined (from 79.4% to 26.5% in men and from 37.8% to 15.6% in women, p < 0.001), and weight increased (2.30 kg in men, 2.35 kg in women, p < 0.001). Intake of red meat increased, but so did intake of fresh fruit and vegetables. HDL cholesterol increased in men and women (0.24 and 0.25 mmol l−1 respectively, p < 0.001); in men, but not women, total cholesterol increased (0.42 mmol l−1, p = 0.01) and triglycerides fell (0.31 mmol l−1, p = 0.034). Blood pressure appeared to fall in both men and women: in men, systolic blood pressure fell by 5.4 mmHg (p = 0.007), and in women by 8.6 mmHg (p = 0.001). The lower level of physical activity and increasing weight will increase the risk of diabetes and cardiovascular disease. However, changes in diet were mixed and may have contributed to the mixed changes in lipid profiles and the lack of a rise in blood pressure. A better understanding of the changes occurring on rural to urban migration is needed to guide preventive measures.

    Risk factors for acute chemical releases with public health consequences: Hazardous Substances Emergency Events Surveillance in the U.S., 1996–2001

    BACKGROUND: Releases of hazardous materials can cause substantial morbidity and mortality. To reduce and prevent the public health consequences (victims or evacuations) of uncontrolled or illegally released hazardous substances, a more comprehensive analysis is needed to determine risk factors for hazardous materials incidents. METHODS: Hazardous Substances Emergency Events Surveillance (HSEES) data from 1996 through 2001 were analyzed using bivariate and multiple logistic regression. Fixed-facility and transportation-related events were analyzed separately. RESULTS: Among fixed-facility events, 2,327 (8%) resulted in at least one victim and 2,844 (10%) involved ordered evacuations. Among transportation-related events, 759 (8%) resulted in at least one victim and 405 (4%) caused evacuation orders. Fire and/or explosion were the strongest risk factors for events involving either victims or evacuations. Stratified analysis of fixed-facility events involving victims showed a strong association for acid releases in the agriculture, forestry, and fisheries industry. Chlorine releases in fixed-facility events resulted in victims and evacuations in more industry categories than any other substance. CONCLUSIONS: Outreach efforts should focus on preventing and preparing for fires and explosions, acid releases in the agricultural industry, and chlorine releases in fixed facilities.
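The bivariate step in an analysis like this reduces to odds ratios from 2×2 tables of exposure (e.g. fire/explosion) against outcome (victims). A minimal sketch with a Wald confidence interval; the counts below are invented for illustration, not HSEES figures:

```python
import math

# Hypothetical sketch: odds ratio and Wald 95% CI from a 2x2 table
# [[a, b], [c, d]] with rows = risk factor present/absent and
# columns = victims yes/no. Counts are invented, not HSEES data.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = ad/bc, with a log-scale Wald interval."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# e.g. 40 of 100 fire/explosion events had victims vs 200 of 2000 other events
or_, lo, hi = odds_ratio_ci(40, 60, 200, 1800)
```

Multiple logistic regression then adjusts each such factor for the others; the 2×2 version above is the unadjusted building block.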

    Usual choline and betaine dietary intake and incident coronary heart disease: the Atherosclerosis Risk in Communities (ARIC) Study

    Background: Low dietary intake of the essential nutrient choline and its metabolite betaine may increase atherogenesis both through effects on homocysteine methylation pathways and through choline's antioxidant properties. Nutrient values for choline and betaine in many common foods have recently become available in the U.S. nutrient composition database. Our objective was to assess the association of dietary intake of choline and betaine with incident coronary heart disease (CHD), adjusting for dietary intake measurement error. Methods: We conducted a prospective investigation of the relation between usual intake of choline and betaine and the risk of CHD in 14,430 middle-aged men and women of the biethnic Atherosclerosis Risk in Communities study. A semi-quantitative food frequency questionnaire was used to assess nutrient intake. Proportional hazards regression models were used to calculate the risk of incident CHD, and a regression calibration method was used to adjust for measurement error. Results: During an average 14 years of follow-up (1987–2002), 1,072 incident CHD events were documented. Compared with the lowest quartile of intake, incident CHD risk was slightly and non-significantly higher in the highest quartile of choline and of choline plus betaine, HR = 1.22 (0.91, 1.64) and HR = 1.14 (0.85, 1.53), controlling for age, sex, education, total energy intake, and dietary intakes of folate, methionine and vitamin B6. No association was found between dietary choline intake and incident CHD when correcting for measurement error. Conclusion: Higher intakes of choline and betaine were not protective against incident CHD. Similar investigations in other populations are of interest.
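The regression calibration idea mentioned in the Methods can be sketched in a few lines: the naive exposure coefficient is de-attenuated by the calibration slope, estimated by regressing a reference measurement on the error-prone questionnaire intake. All numbers below are invented for illustration:

```python
# Hypothetical sketch of regression calibration for dietary measurement error:
# beta_corrected = beta_naive / lambda, where lambda is the slope from
# regressing a reference measurement on the error-prone FFQ intake.
# All values are invented, not ARIC data.

def ols_slope(x, y):
    """Least-squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

ffq       = [200, 250, 300, 350, 400]   # choline intake from FFQ, mg/day (invented)
reference = [230, 255, 290, 320, 355]   # reference instrument, same subjects

lam = ols_slope(ffq, reference)         # calibration slope (attenuation factor)
beta_naive = 0.0010                     # log-hazard per mg/day from naive model (invented)
beta_corrected = beta_naive / lam       # de-attenuated estimate
```

Because measurement error attenuates the naive slope (λ < 1 here), the corrected coefficient is larger in magnitude; in the actual study the corrected analysis still showed no association.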

    Cardiovascular risk among Aboriginal and non-Aboriginal smoking male prisoners: inequalities compared to the wider community

    Background: Cardiovascular risk factors (CVRF) were collected as part of a randomised controlled trial of a multi-component intervention to reduce smoking among male prisoners. Cross-sectional baseline data on CVRF were compared between smoking male prisoners and males of similar age in the general population. Methods: 425 smoking prisoners were recruited (n = 407 in New South Wales; 18 in Queensland), including 15% of Aboriginal descent (mean age 33 years; median sentence length 3.6 years). We measured CVRF such as smoking, physical activity, blood pressure, risky alcohol use, symptoms of depression, and low socioeconomic status. Results: We found that 39% of prisoners had 3+ CVRF, compared with 10% in a general community sample of the most disadvantaged men of a similar age. Significantly more Aboriginal prisoners had 3+ CVRF than non-Aboriginal prisoners (55% vs 36%, p < 0.01), and they were twice as likely to have 4+ CVRF (27% vs 12%). In addition to all prisoners in this study being current smokers (70% smoking 20+ cigarettes per day), the prevalence of other CVRF was very high: insufficient physical activity (23%), hypertension (4%), risky drinking (52%), symptoms of depression (14%) and low socioeconomic status (SES) (44%). Aboriginal prisoners had higher levels of risky alcohol use and symptoms of depression, and were more likely to be of low SES. Conclusion: Prisoners are at high risk of developing cardiovascular disease compared with even the most disadvantaged in their community and should be the focus of specific public health interventions. Trial registration: This trial is registered with the Australian New Zealand Clinical Trials Registry, ACTRN12606000229572 (http://www.anzctr.org.au/ACTRN12606000229572.aspx).
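The "3+ CVRF" summary used throughout the abstract is just a per-participant count of risk factors, tabulated as a proportion over the cohort. A minimal sketch with invented records, not trial data:

```python
# Hypothetical sketch: counting cardiovascular risk factors per participant
# and the proportion with >= 3. Records below are invented, not trial data.

RISK_FACTORS = ("smoker", "inactive", "hypertension",
                "risky_alcohol", "depression", "low_ses")

def cvrf_count(person):
    """Number of risk factors flagged True in a participant record."""
    return sum(1 for f in RISK_FACTORS if person.get(f, False))

cohort = [
    {"smoker": True, "risky_alcohol": True, "low_ses": True},       # 3 CVRF
    {"smoker": True, "inactive": True},                             # 2 CVRF
    {"smoker": True, "risky_alcohol": True,
     "depression": True, "low_ses": True},                          # 4 CVRF
    {"smoker": True},                                               # 1 CVRF
]

prop_3plus = sum(1 for p in cohort if cvrf_count(p) >= 3) / len(cohort)
```

The study applied this kind of count separately to Aboriginal and non-Aboriginal prisoners and compared the resulting proportions (55% vs 36% with 3+ CVRF).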

    Assessment of Cultivar Distinctness in Alfalfa: A Comparison of Genotyping‐by‐Sequencing, Simple‐Sequence Repeat Marker, and Morphophysiological Observations

    Cultivar registration agencies typically require morphophysiological trait-based distinctness of candidate cultivars. This requirement is difficult to meet for cultivars of major perennial forages because of their genetic structure and the ever-increasing number of registered materials, leading to possible rejection of agronomically valuable cultivars. This study aimed to explore the value of molecular markers applied to replicated bulked plants (three bulks of 100 independent plants each per cultivar) for assessing alfalfa (Medicago sativa L. subsp. sativa) cultivar distinctness. We compared genotyping-by-sequencing information based on 2902 polymorphic single-nucleotide polymorphism (SNP) markers (>30 reads per DNA sample) with morphophysiological information based on 11 traits and with simple-sequence repeat (SSR) marker information from 41 polymorphic markers for their ability to distinguish 11 alfalfa landraces representative of the germplasm from northern Italy. Three molecular criteria, one based on cultivar differences for individual SSR bands and two based on overall SNP marker variation assessed either by statistically significant cultivar differences on principal component axes or by discriminant analysis, distinctly outperformed the morphophysiological criterion. Combining the morphophysiological criterion with either molecular marker method increased discrimination among cultivars, since morphophysiological diversity was unrelated to SSR marker-based diversity (r = 0.04) and poorly related to SNP marker-based diversity (r = 0.23, P < 0.15). The criterion based on statistically significant SNP allele frequency differences was less discriminating than morphophysiological variation. Marker-based distinctness, which can be assessed at low cost and without interactions with testing conditions, could validly substitute for (or complement) morphophysiological distinctness in alfalfa cultivar registration schemes. It could also be of interest for sui generis registration systems aimed at marketing alfalfa landraces.
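One of the SNP-based criteria scores replicated cultivar bulks on principal component axes and tests for cultivar differences. A minimal sketch of the scoring step, with an invented allele-frequency matrix (rows = bulks, columns = SNPs), not the study's data:

```python
import numpy as np

# Hypothetical sketch: principal component scores of replicated cultivar bulks
# computed from SNP allele frequencies. The matrix below is invented; real data
# would have 33 bulks (11 cultivars x 3 bulks) and 2902 SNP columns.

freq = np.array([
    [0.10, 0.80, 0.30, 0.60],   # cultivar A, bulk 1
    [0.12, 0.78, 0.28, 0.62],   # cultivar A, bulk 2
    [0.55, 0.20, 0.70, 0.15],   # cultivar B, bulk 1
    [0.53, 0.22, 0.72, 0.17],   # cultivar B, bulk 2
])

centered = freq - freq.mean(axis=0)          # center each SNP column
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u * s                               # bulk scores on the principal axes
explained = s**2 / (s**2).sum()              # share of variance per component
```

In this toy matrix nearly all variance lies on PC1, which separates the two cultivars while keeping replicate bulks together; the study then tested whether cultivars differed significantly on such axes.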

    The contribution of demographic and morbidity factors to self-reported visit frequency of patients: a cross-sectional study of general practice patients in Australia

    BACKGROUND: Understanding the factors that affect patients' utilisation of health services is important for health service provision and effective patient management. This study aimed to investigate the specific morbidity and demographic factors related to the frequency with which general practice patients visit a general practitioner/family physician (GP) in Australia. METHODS: A sub-study was undertaken as part of an ongoing national study of general practice activity in Australia. A cluster sample of 10,755 general practice patients was surveyed through a random sample of 379 general practitioners. Each patient reported the number of times he/she had visited a general practitioner in the previous twelve months, and the GP recorded all the patient's major health problems, including those managed at the current consultation. RESULTS: Patients reported an average of 8.8 visits to a general practitioner per year. After adjusting for other patient demographics and number of health problems, concession health care card holders made on average 2.6 more visits per year to a general practitioner than did non-card holders (p < 0.001). After adjustment, patients from remote/very remote locations made 2.3 fewer visits per year than patients from locations where services were highly accessible (p < 0.001). After adjustment for patient demographics, patients with diagnosed anxiety made on average 2.7 more visits per year (p = 0.003), those with diagnosed depression 2.2 more visits (p < 0.0001), and those with back problems 2.4 more visits (p = 0.009) than patients without the respective disorders. CONCLUSIONS: Anxiety, back pain and depression are associated with greater patient demand for general practice services than other health problems. The effect of sociodemographic factors on patient utilisation of general practice services is complex. Equity of access to general practice services remains an issue for patients from remote areas, while concession health care card holders attend general practice more frequently than other patients relative to their number of health problems.