Dietitians’ perceptions and experience of blenderised feeds for paediatric tube-feeding
Objective: There is an emerging interest in the use of blenderised food for tube-feeding (BFTF). This survey explored paediatric dietitians' perceptions and experiences of BFTF use.
Design: A web-based questionnaire was distributed to the Paediatric group of the British Dietetic Association. The survey captured dietitians' personal opinions and experience supporting children on BFTF, and the perceptions of carers.
Results: Of the 77 respondents, 19 were aware of professional guidelines and 63 had never received training on BFTF. Thirty-four would not recommend BFTF and 11 would advise against its use; yet 43 would recommend it to supplement commercial feeds. Fifty-seven would change their perception about BFTF if there were evidence-based guidelines. Forty-four would feel confident to support a patient using BFTF. Forty-three had previous experience supporting a patient with BFTF. The main concerns dietitians perceived about the use of BFTF were nutritional inadequacy (n=71), tube blockages (n=64) and increased infection risk (n=59), but these were reported significantly more often as perceived concerns than as issues they had actually encountered in their own clinical practice (p<0.001 for all three). A reduction in reflux and vomiting and increased carer involvement were the main perceived and observed benefits by both dietitians and carers.
Conclusions: The use of these feeds for tube-fed children is increasingly being seen as a viable choice. Dietitians experienced significantly fewer issues with the use of BFTF in clinical practice compared with their self-reported apprehensions in the survey. Well-controlled studies are now needed to objectively assess the benefits, risks, costs and practicality of BFTF
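The paired comparison reported in this abstract (each dietitian's perceived concerns versus the issues they actually encountered) can be illustrated with a short, hedged sketch. The abstract does not name the test behind p<0.001; a McNemar-style test on a 2x2 table of perceived versus experienced responses is one plausible choice, and the counts below are illustrative only (the "perceived" margin is set to the reported n=71 for nutritional inadequacy, the remaining cells are invented).

```python
# A McNemar-style paired comparison of "concern perceived" vs. "issue experienced"
# within the same respondents. The 2x2 counts are illustrative, not survey data.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# rows: concern perceived (yes/no); columns: issue experienced (yes/no)
table = np.array([[20, 51],   # perceived & experienced, perceived only
                  [3,   3]])  # experienced only, neither
result = mcnemar(table, exact=True)
print(f"McNemar p-value: {result.pvalue:.4f}")
```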
Fortification and health: challenges and opportunities.
Fortification is the process of adding nutrients or non-nutrient bioactive components to edible products (e.g., food, food constituents, or supplements). Fortification can be used to correct or prevent widespread nutrient intake shortfalls and associated deficiencies, to balance the total nutrient profile of a diet, to restore nutrients lost in processing, or to appeal to consumers looking to supplement their diet. Food fortification could be considered as a public health strategy to enhance nutrient intakes of a population. Over the past century, fortification has been effective at reducing the risk of nutrient deficiency diseases such as beriberi, goiter, pellagra, and rickets. However, the world today is very different from when fortification emerged in the 1920s. Although early fortification programs were designed to eliminate deficiency diseases, current fortification programs are based on low dietary intakes rather than a diagnosable condition. Moving forward, we must be diligent in our approach to achieving effective and responsible fortification practices and policies, including responsible marketing of fortified products. Fortification must be applied prudently, its effects monitored diligently, and the public informed effectively about its benefits through consumer education efforts. Clear lines of authority for establishing fortification guidelines should be developed and should take into account changing population demographics, changes in the food supply, and advances in technology. This article is a summary of a symposium presented at the ASN Scientific Sessions and Annual Meeting at Experimental Biology 2014 on current issues involving fortification focusing primarily on the United States and Canada and recommendations for the development of responsible fortification practices to ensure their safety and effectiveness
Estimation of Dietary Iron Bioavailability from Food Iron Intake and Iron Status
Currently there are no satisfactory methods for estimating dietary iron absorption (bioavailability) at a population level, but this is essential for deriving dietary reference values using the factorial approach. The aim of this work was to develop a novel approach for estimating dietary iron absorption using a population sample from a sub-section of the UK National Diet and Nutrition Survey (NDNS). Data were analyzed in 873 subjects from the 2000–2001 adult cohort of the NDNS, for whom both dietary intake data and hematological measures (hemoglobin and serum ferritin (SF) concentrations) were available. There were 495 men aged 19–64 y (mean age 42.7±12.1 y) and 378 pre-menopausal women (mean age 35.7±8.2 y). Individual dietary iron requirements were estimated using the Institute of Medicine calculations. A full probability approach was then applied to estimate the prevalence of dietary intakes that were insufficient to meet the needs of the men and women separately, based on their estimated daily iron intake and a series of absorption values ranging from 1–40%. The prevalence of SF concentrations below selected cut-off values (indicating that absorption was not high enough to maintain iron stores) was derived from individual SF concentrations. An estimate of dietary iron absorption required to maintain specified SF values was then calculated by matching the observed prevalence of insufficiency with the prevalence predicted for the series of absorption estimates. Mean daily dietary iron intakes were 13.5 mg for men and 9.8 mg for women. Mean calculated dietary absorption was 8% in men (50th percentile for SF 85 µg/L) and 17% in women (50th percentile for SF 38 µg/L). At a ferritin level of 45 µg/L estimated absorption was similar in men (14%) and women (13%). This new method can be used to calculate dietary iron absorption at a population level using data describing total iron intake and SF concentration
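The prevalence-matching step described in this abstract lends itself to a short numerical sketch. The snippet below uses simulated intakes, absorbed-iron requirements and serum ferritin values (not NDNS data; the array names, sample sizes and distributions are assumptions) to show how an absorption estimate can be read off by matching the predicted prevalence of insufficiency to the observed prevalence of low ferritin.

```python
# Hypothetical sketch of the prevalence-matching approach described above.
import numpy as np

rng = np.random.default_rng(0)
n = 873
intake_mg = rng.normal(11.5, 3.0, n).clip(3, 25)         # total dietary iron intake
requirement_mg = rng.normal(1.5, 0.5, n).clip(0.5, 4.0)  # absorbed-iron requirement (IOM-style)
ferritin_ugL = rng.lognormal(np.log(50), 0.8, n)          # serum ferritin

sf_cutoff = 45.0
observed_prevalence = np.mean(ferritin_ugL < sf_cutoff)   # proportion failing to maintain stores

absorptions = np.arange(0.01, 0.41, 0.01)                 # candidate absorption values, 1-40%
predicted_prevalence = np.array(
    [np.mean(intake_mg * a < requirement_mg) for a in absorptions]
)

# Pick the absorption value whose predicted prevalence of insufficiency
# best matches the observed prevalence of low ferritin.
best = absorptions[np.argmin(np.abs(predicted_prevalence - observed_prevalence))]
print(f"estimated dietary iron absorption ≈ {best:.0%}")
```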
Options for basing Dietary Reference Intakes (DRIs) on chronic disease endpoints: report from a joint US-/Canadian-sponsored working group.
Dietary Reference Intakes (DRIs) are used in Canada and the United States in planning and assessing diets of apparently healthy individuals and population groups. The approaches used to establish DRIs on the basis of classical nutrient deficiencies and/or toxicities have worked well. However, it has proved to be more challenging to base DRI values on chronic disease endpoints; deviations from the traditional framework were often required, and in some cases, DRI values were not established for intakes that affected chronic disease outcomes despite evidence that supported a relation. The increasing proportions of elderly citizens, the growing prevalence of chronic diseases, and the persistently high prevalence of overweight and obesity, which predispose to chronic disease, highlight the importance of understanding the impact of nutrition on chronic disease prevention and control. A multidisciplinary working group sponsored by the Canadian and US government DRI steering committees met from November 2014 to April 2016 to identify options for addressing key scientific challenges encountered in the use of chronic disease endpoints to establish reference values. The working group focused on 3 key questions: 1) What are the important evidentiary challenges for selecting and using chronic disease endpoints in future DRI reviews, 2) what intake-response models can future DRI committees consider when using chronic disease endpoints, and 3) what are the arguments for and against continuing to include chronic disease endpoints in future DRI reviews? This report outlines the range of options identified by the working group for answering these key questions, as well as the strengths and weaknesses of each option
Insufficient voluntary intake of nutrients and energy in hospitalized patients
Aim: The aim of our study was to evaluate the inadequacy of voluntary energy and nutrient intake on the first day of hospital admission. Patients and methods: A cross-sectional study was carried out in two tertiary care hospitals, with a probabilistic sample of 50% of in-patients. Dietary intake was evaluated by a 24-hour dietary recall, and undernutrition was screened through the Nutritional Risk Screening 2002 tool. The overall frequency of inadequate energy and nutrient intake was estimated using Dietary Reference Intakes. Results: Energy and nutrient intakes from 258 patients showed very low values for both men and women. No significant differences were found for energy and nutrient intakes across age groups (<65 vs. ≥65 years). When the proportion of study subjects with inadequate nutrient intakes was analysed, a high degree of inadequacy was found. The degree of inadequacy was higher for fibre, niacin, folate, vitamin B-12, magnesium and zinc. No significant differences were found for energy and the nutrients studied, or for intakes below one-third of dietary recommendations, between nutritionally-at-risk (n = 89) and well-nourished (n = 169) patients. Conclusion: Voluntary nutrient and energy intakes in the first 24 hours of hospital admission are highly inadequate. No differences were found between undernourished and well-nourished patients or between patients aged <65 and ≥65 years
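One common way to operationalise "inadequacy estimated using Dietary Reference Intakes" is an EAR cut-point style comparison of each patient's recalled intake against a reference value; the abstract does not state the exact procedure, so the sketch below is an assumption. The nutrient list, EAR values and intakes are illustrative, not study data.

```python
# Minimal EAR cut-point style calculation: proportion of patients whose
# 24-hour recall intake falls below a DRI reference value. Illustrative only.
import numpy as np

ear = {"folate_ug": 320, "zinc_mg": 9.4, "magnesium_mg": 350}  # example adult EARs
intakes = {
    "folate_ug": np.array([150, 280, 400, 90, 310]),
    "zinc_mg": np.array([5.1, 11.0, 7.8, 4.2, 9.9]),
    "magnesium_mg": np.array([210, 380, 150, 300, 260]),
}

for nutrient, values in intakes.items():
    inadequate = np.mean(values < ear[nutrient])
    print(f"{nutrient}: {inadequate:.0%} of patients below the reference intake")
```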
Heat or Eat? Cold Weather Shocks and Nutrition in Poor American Families
We examine the effects of cold weather periods on family budgets and on nutritional outcomes in poor American families. Expenditures on food and home fuels are tracked by linking the Consumer Expenditure Survey to temperature data. Using the Third National Health and Nutrition Examination Survey, we track calorie consumption, dietary quality, vitamin deficiencies, and anemia in summer and winter months. We find that both rich and poor families increase fuel expenditures in response to unusually cold weather (a 10 degree F drop below normal). At the same time, poor families reduce food expenditures by roughly the same amount as the increase in fuel expenditures, while rich families increase food expenditures. Poor adults and children reduce caloric intake by roughly 200 calories during winter months, unlike richer adults and children. In sensitivity analyses, we find that decreases in food expenditure are most pronounced outside the South. We conclude that poor parents and their children outside the South spend and eat less food during cold weather temperature shocks. We surmise that existing social programs fail to buffer against these shocks.
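A minimal regression sketch in the spirit of this design: regress food spending on a cold-weather shock interacted with an indicator for poor households, using simulated data. All variable names, magnitudes and data-generating assumptions below are hypothetical and only mimic the paper's qualitative finding.

```python
# Illustrative interaction regression: does a cold shock reduce food spending
# more in poor households than in richer ones? Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "poor": rng.integers(0, 2, n),          # 1 = poor household
    "cold_shock": rng.normal(0, 1, n),      # standardised drop below seasonal normal
})
# Simulate the qualitative pattern: poor households cut food spending when it
# is unusually cold, richer households do not.
df["food_exp"] = (
    300
    - 25 * df["cold_shock"] * df["poor"]
    + 5 * df["cold_shock"] * (1 - df["poor"])
    + rng.normal(0, 30, n)
)

model = smf.ols("food_exp ~ cold_shock * poor", data=df).fit()
print(model.summary().tables[1])
```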
Lactation and neonatal nutrition: defining and refining the critical questions.
This paper resulted from a conference entitled "Lactation and Milk: Defining and refining the critical questions" held at the University of Colorado School of Medicine from January 18-20, 2012. The mission of the conference was to identify unresolved questions and set future goals for research into human milk composition, mammary development and lactation. We first outline the unanswered questions regarding the composition of human milk (Section I) and the mechanisms by which milk components affect neonatal development, growth and health and recommend models for future research. Emerging questions about how milk components affect cognitive development and behavioral phenotype of the offspring are presented in Section II. In Section III we outline the important unanswered questions about regulation of mammary gland development, the heritability of defects, the effects of maternal nutrition, disease, metabolic status, and therapeutic drugs upon the subsequent lactation. Questions surrounding breastfeeding practice are also highlighted. In Section IV we describe the specific nutritional challenges faced by three different populations, namely preterm infants, infants born to obese mothers who may or may not have gestational diabetes, and infants born to undernourished mothers. The recognition that multidisciplinary training is critical to advancing the field led us to formulate specific training recommendations in Section V. Our recommendations for research emphasis are summarized in Section VI. In sum, we present a roadmap for multidisciplinary research into all aspects of human lactation, milk and its role in infant nutrition for the next decade and beyond
Soil-type influences human selenium status and underlies widespread selenium deficiency risks in Malawi
Selenium (Se) is an essential human micronutrient with critical roles in immune functioning and antioxidant defence. Estimates of dietary Se intakes and status are scarce for Africa although crop surveys indicate deficiency is probably widespread in Malawi. Here we show that Se deficiency is likely endemic in Malawi based on the Se status of adults consuming food from contrasting soil types. These data are consistent with food balance sheets and composition tables revealing that >80% of the Malawi population is at risk of dietary Se inadequacy. Risk of dietary Se inadequacy is >60% in seven other countries in Southern Africa, and 22% across Africa as a whole. Given that most Malawi soils cannot supply sufficient Se to crops for adequate human nutrition, the cost and benefits of interventions to alleviate Se deficiency should be determined; for example, Se-enriched nitrogen fertilisers could be adopted as in Finland
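The population-level risk figures quoted above are typically derived by comparing a national intake (or supply) distribution with the selenium EAR. The sketch below shows one such EAR cut-point calculation under a normal-distribution assumption; the mean supply and coefficient of variation are hypothetical, and only the adult Se EAR of 45 µg/day is a standard reference value.

```python
# Rough EAR cut-point risk estimate from a per-capita supply figure.
# Mean supply, CV and the normality assumption are illustrative.
from scipy.stats import norm

ear_se_ug = 45.0        # adult selenium EAR (µg/day)
mean_supply_ug = 35.0   # hypothetical national per-capita Se supply
cv = 0.25               # assumed coefficient of variation of intake

sd = cv * mean_supply_ug
risk_of_inadequacy = norm.cdf(ear_se_ug, loc=mean_supply_ug, scale=sd)
print(f"estimated risk of dietary Se inadequacy ≈ {risk_of_inadequacy:.0%}")
```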
Haemoglobin status of adult non-pregnant Kazakh women living in Kzyl-Orda region, Kazakhstan.
OBJECTIVE: To estimate the prevalence of anaemia among adult non-pregnant women in the Kzyl-Orda region of Kazakhstan, and to determine the association between haemoglobin concentration and anthropometric, socioeconomic, reproductive and dietary factors. DESIGN: A cross-sectional study using a randomly selected sample. Subjects were interviewed, and finger-prick blood samples and anthropometric measurements were collected. Associations between haemoglobin concentration and anthropometric and questionnaire data were evaluated by sequential linear regression analysis. SETTING: Health centres in Kazalinsk, Djalagash and Zhanakorgan districts of Kzyl-Orda region, Kazakhstan. SUBJECTS: Three-thousand six-hundred and twenty-five non-pregnant women aged 18-45 y randomly selected from health centre records. RESULTS: Iron deficiency anaemia, as reflected by low haemoglobin levels (Hb<12 g/dl), was detected in 40.2% of the total sample. There was a significant curvilinear relationship between haemoglobin concentration and age, with the nadir of the curve in the 30-40 y age-group. Haemoglobin concentration was found to be positively associated with body mass index (BMI) and socioeconomic factors. Significant negative associations were found between haemoglobin concentration and duration of menses, use of the intra-uterine contraceptive device and the consumption of tea. CONCLUSIONS: This study demonstrates that iron deficiency anaemia is present at considerable levels among adult women living in Kzyl-Orda region, Kazakhstan, and provides important baseline information for future research and public health interventions. SPONSORSHIP: Funding was provided by the United States Agency for International Development, Office of Nutrition, the United Kingdom Department for International Development, and the Polden-Puckham Trust
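Sequential linear regression, as used in this study, adds predictor blocks in stages and inspects the change in explained variance. The sketch below runs such a staged fit on simulated data; the variable names, effect sizes and distributions are assumptions, not the study's.

```python
# Sequential (hierarchical) linear regression: fit an anthropometric block,
# then add socioeconomic/dietary predictors and compare R². Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "age": rng.uniform(18, 45, n),
    "bmi": rng.normal(25, 4, n),
    "ses_score": rng.normal(0, 1, n),
    "tea_cups_per_day": rng.poisson(3, n),
})
df["haemoglobin"] = (
    12.5 + 0.05 * df["bmi"] + 0.2 * df["ses_score"]
    - 0.05 * df["tea_cups_per_day"] + rng.normal(0, 1, n)
)

block1 = smf.ols("haemoglobin ~ age + bmi", data=df).fit()
block2 = smf.ols("haemoglobin ~ age + bmi + ses_score + tea_cups_per_day", data=df).fit()
print(f"R² block 1: {block1.rsquared:.3f}  →  blocks 1+2: {block2.rsquared:.3f}")
```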
