40 research outputs found
Associations of the 2018 World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) cancer prevention recommendations with stages of colorectal carcinogenesis
Background: While adherence to cancer prevention recommendations is linked to lower risk of colorectal cancer (CRC), few have studied associations across the entire spectrum of colorectal carcinogenesis. Here, we studied the relationship between the standardized 2018 World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) Score for adherence to cancer prevention recommendations and the detection of colorectal lesions in a screening setting. As a secondary objective, we examined to what extent the recommendations were being followed in an external cohort of CRC patients.
Methods: Adherence to the seven-point 2018 WCRF/AICR Score was measured in screening participants receiving a positive fecal immunochemical test and in CRC patients participating in an intervention study. Dietary intake, body fatness and physical activity were assessed using self-administered questionnaires. Multinomial logistic regression was used to estimate odds ratios (ORs) and 95% confidence intervals (CIs) for screen-detected lesions.
Results: Of 1486 screening participants, 548 were free from adenomas, 524 had non-advanced adenomas, 349 had advanced lesions and 65 had CRC. Adherence to the 2018 WCRF/AICR Score was inversely associated with advanced lesions; OR 0.82 (95% CI 0.71, 0.94) per score point, but not with CRC. Of the seven individual components included in the score, alcohol and BMI seemed to be the most influential. Of the 430 CRC patients included in the external cohort, the greatest potential for lifestyle improvement was seen for the recommendations concerning alcohol and red and processed meat, to which 10% and 2% fully adhered, respectively.
Conclusions: Adherence to the 2018 WCRF/AICR Score was associated with lower probability of screen-detected advanced precancerous lesions, but not CRC. Although some components of the score seemed to be more influential than others (i.e., alcohol and BMI), taking a holistic approach to cancer prevention is likely the best way to prevent the occurrence of precancerous colorectal lesions.
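The per-point odds ratio reported above scales multiplicatively across the seven-point score. A minimal sketch of that arithmetic, assuming a log-linear dose-response (a hypothetical extrapolation for illustration, not a result from the study):

```python
# Illustrative arithmetic only: the abstract reports OR 0.82 per score
# point for advanced lesions; compounding across the full range assumes
# a log-linear dose-response (hypothetical, not a reanalysis of the data).

OR_PER_POINT = 0.82  # odds ratio for advanced lesions per score point

def compound_or(or_per_point: float, points: float) -> float:
    """Odds ratio implied by gaining `points` on the adherence score."""
    return or_per_point ** points

# Odds ratio implied by moving from 0 to 7 on the seven-point score
print(round(compound_or(OR_PER_POINT, 7), 2))  # prints 0.25
```

Under this assumption, full versus zero adherence would correspond to roughly a four-fold difference in the odds of advanced lesions, which illustrates why a one-point OR of 0.82 is not a trivial effect size.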
Agreement between the GLIM criteria and PG-SGA in a mixed patient population at a nutrition outpatient clinic
Background & aims
The Global Leadership Initiative on Malnutrition (GLIM) criteria describe a step-wise process in which a screening tool of choice is first used to assess malnutrition risk, followed by assessment for diagnosis and grading of malnutrition severity. The agreement between GLIM and the established malnutrition assessment method Patient-Generated Subjective Global Assessment (PG-SGA) is uncertain. In addition, several aspects of GLIM remain to be clearly defined. In this study, we compared the diagnosis of malnutrition with the GLIM criteria to that with the PG-SGA, and explored the differences between the methods.
Methods
This cross-sectional study was conducted at the Nutrition Outpatient Clinic at Oslo University Hospital, Norway. Patients were included from September–December 2019. Nutritional Risk Screening 2002 (NRS-2002) was used as the screening tool in the GLIM process before diagnosing and grading the severity of malnutrition. Results are presented with and without the initial risk screening. The diagnostic results from the GLIM process were compared to the malnutrition diagnosis using the PG-SGA.
Results
In total, 144 patients, median age 58 years, participated in the study. The full GLIM process identified 36% of the patients as malnourished, while the PG-SGA identified 69%. Comparison of GLIM and PG-SGA showed fair agreement; however, agreement was better when the NRS-2002 screening was excluded. Considering the PG-SGA the gold standard, GLIM had a sensitivity of 51% and a specificity of 98%. The introduction of new cut-off values for fat-free mass did not considerably alter the diagnosis of malnutrition within GLIM.
Conclusions
The GLIM criteria showed only fair agreement with the PG-SGA; however, agreement was better when the initial NRS-2002 screening was excluded. A joint consensus on how to perform the GLIM process is needed to allow comparison across future studies, and before routine use in clinical practice.
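The sensitivity and specificity reported above follow from a standard 2×2 confusion matrix with PG-SGA as the reference. A minimal sketch; the cell counts below are approximate reconstructions chosen to be consistent with the reported percentages (144 patients, 69% malnourished by PG-SGA, 36% by GLIM), not the study's actual table:

```python
# Hypothetical 2x2 counts, consistent with the abstract's percentages
# (PG-SGA as gold standard); NOT the study's actual data.
tp = 50  # malnourished by both GLIM and PG-SGA
fn = 48  # missed by GLIM, malnourished by PG-SGA
fp = 1   # flagged by GLIM, well-nourished by PG-SGA
tn = 45  # well-nourished by both

sensitivity = tp / (tp + fn)  # share of PG-SGA cases that GLIM detects
specificity = tn / (tn + fp)  # share of PG-SGA negatives that GLIM clears

print(f"sensitivity = {sensitivity:.0%}")  # prints sensitivity = 51%
print(f"specificity = {specificity:.0%}")  # prints specificity = 98%
```

The pattern of high specificity with low sensitivity matches the abstract's finding that GLIM labels far fewer patients malnourished (36%) than the PG-SGA (69%): almost everyone GLIM flags is also a PG-SGA case, but many PG-SGA cases are missed.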
Dietitians' experiences of nutrition assessment via TeleNutrition: "Video-calls are better than phone-calls, but it's probably difficult for patients to show their ankles on the screen"
Background & Aims: Nutrition assessment is integral to dietetic practice. TeleNutrition enabled dietitians to continue nutrition care provision during the COVID-19 pandemic but created challenges with undertaking nutrition assessment. The aim of the present study was to describe how dietitians in three Nordic countries perceived their nutrition assessment practice when physically distant from patients. Methods: The present study is a sub-analysis from one research project undertaken by the Global TeleNutrition Consortium, GTNC. Data were generated from a digital survey of a convenience sample of dietitians in Denmark, Norway, and Sweden who had a minimum of one adult patient interaction per week, distributed through the dietetic professional and/or regulatory bodies of each country, as well as closed social networks. Data from free-text questions were assessed using thematic analysis, where the construction of final themes was guided by the Technology Acceptance Model (TAM). Results: In total, 146 dietitians participated in the study (Denmark 16%, Norway 34%, and Sweden 50%). The qualitative analysis of answers from 24 free-text questions resulted in four themes (key constructs) related to dietitians' experience of performing nutrition assessment using TeleNutrition: Perceived usefulness, Perceived ease of use, Perceived barriers, and Perceived facilitators. Each theme was divided into two to three sub-themes (explanatory dimensions). Conclusions: To best support dietitians in the new era of healthcare digitalisation, internationally accepted standards or protocols for performing nutrition assessment using TeleNutrition ought to be established. This is especially critical for nutrition assessment measures that require physical examination.
Barriers and Facilitators for Implementing a Decision Support System to Prevent and Treat Disease-Related Malnutrition in a Hospital Setting: Qualitative Study
Background: Disease-related malnutrition is a challenge among hospitalized patients. Despite guidelines and recommendations for prevention and treatment, the condition continues to be prevalent. The MyFood system is a recently developed decision support system to prevent and treat disease-related malnutrition. Objective: To investigate the possible implementation of the MyFood system in clinical practice, the aims of the study were (1) to identify current practice, routines, barriers, and facilitators of nutritional care; (2) to identify potential barriers and facilitators for the use of MyFood; and (3) to identify the key aspects of an implementation plan. Methods: A qualitative study was performed among nurses, physicians, registered dietitians, and middle managers in 2 departments in a university hospital in Norway. Focus group discussions and semistructured interviews were used to collect data. The Consolidated Framework for Implementation Research (CFIR) was used to create the interview guide and analyze the results. The transcripts were analyzed using a thematic analysis. Results: A total of 27 health care professionals participated in the interviews and focus groups, including nurses (n=20), physicians (n=2), registered dietitians (n=2), and middle managers (n=3). The data were analyzed within 22 of the 39 CFIR constructs. Using the 5 CFIR domains as themes, we obtained the following results: (1) Intervention characteristics: MyFood was perceived to have a relative advantage of being more trustworthy, systematic, and motivational and providing increased awareness of nutritional treatment compared with the current practice. 
Its lack of communication with the existing digital systems was perceived as a potential barrier; (2) Outer settings: patients from different cultural backgrounds with language barriers and of older age were potential barriers to the use of the MyFood system; (3) Inner settings: no culture for specific routines or systems related to nutritional care existed in the departments. However, tension for change regarding screening for malnutrition risk, monitoring, and nutritional treatment was highlighted in all categories of interviewees; (4) Characteristics of the individuals: positive attitudes toward MyFood were present among the majority of the interviewees, and they expressed self-efficacy toward the perceived use of MyFood; (5) Process: providing sufficient information to everyone in the department was highlighted as key to the success of the implementation. The involvement of opinion leaders, implementation leaders, and champions was also suggested for the implementation plan. Conclusions: This study identified several challenges in the nutritional care of hospitalized patients at risk of malnutrition, and deviations from recommendations and guidelines. The MyFood system was perceived as being more precise, trustworthy, and motivational than the current practice. However, several potential barriers were identified. The assessment of the current situation and the identification of perceived barriers and facilitators will be used in planning an implementation and effect study, including the creation of an implementation plan.
Effects of using the MyFood decision support system on hospitalized patients' nutritional status and treatment: A randomized controlled trial
Background & aims
Compliance with guidelines for disease-related malnutrition is documented as poor. The practice of using paper-based dietary recording forms with manual calculation of the patient's nutritional intake is considered cumbersome, time-consuming and unfeasible by nurses, and often does not lead to appropriate nutritional treatment. We developed the digital decision support system MyFood to address these challenges. MyFood comprises an app for patients and a website for nurses, and includes functions for dietary recording, evaluation of intake compared to requirements, and a report to nurses including tailored recommendations for nutritional treatment and a nutritional care plan for documentation. The study aimed to investigate the effects of using the MyFood decision support system during hospital stay on adult patients' nutritional status, treatment and hospital length of stay. The main outcome measure was weight change.
Methods
The study was a parallel-arm randomized controlled trial. Patients who were allocated to the intervention group used the MyFood app during their hospital stay and the nurses were encouraged to use the MyFood system. Patients who were allocated to the control group received routine care.
Results
We randomly assigned 100 patients (51.9 ± 14 y) to the intervention group (n = 49) and the control group (n = 51) between August 2018 and February 2019. Losses to follow-up were n = 5 in the intervention group and n = 1 in the control group. No difference was found between the two groups with regard to weight change. Malnutrition risk at discharge was present in 77% of the patients in the intervention group and 94% in the control group (p = 0.019). Nutritional treatment was documented for 81% of the patients in the intervention group and 57% in the control group (p = 0.011). A nutritional care plan was created for 70% of the intervention patients compared to 16% of the control patients (p < 0.001).
Conclusions
The intervention had no effect on weight change during hospital stay. A higher proportion of patients in the control group were malnourished or at risk of malnutrition at hospital discharge compared with patients in the intervention group. Documentation of nutritional intake, treatment and nutritional care plans was more frequent for patients using the MyFood system than for the control group. This trial was registered at clinicaltrials.gov (NCT03412695).
Polyphenol-rich juices reduce blood pressure measures in a randomised controlled trial in high normal and hypertensive volunteers
Intake of fruits and berries may lower blood pressure (BP), most probably due to the high content of polyphenols. In the present study, we tested whether consumption of two polyphenol-rich juices could lower BP. In a randomised, double-blinded, placebo-controlled trial of 12 weeks, 134 healthy individuals, aged 50–70 years, with high normal range BP (130/85–139/89 mmHg, seventy-two subjects) or stage 1–2 hypertension (140/90–179/109 mmHg, sixty-two subjects), were included. They consumed 500 ml/d of one of either (1) a commercially available polyphenol-rich juice based on red grapes, cherries, chokeberries and bilberries; (2) a juice similar to (1) but enriched with polyphenol-rich extracts from blackcurrant press-residue; or (3) a placebo juice (polyphenol contents 245·5, 305·2 and 76 mg/100 g, respectively). Resting BP was measured three times, with a 1 min interval, at baseline and after 6 and 12 weeks of intervention. Systolic BP was significantly reduced over time (6 and 12 weeks, respectively) in the pooled juice group compared with the placebo group in the first of the three measurements, both for the whole study group (6·9 and 3·4 mmHg; P=0·01) and even more pronounced in the hypertensive subjects when analysed separately (7·3 and 6·8 mmHg; P=0·04). The variation in the BP measurements was significantly reduced in the pooled juice group compared with the placebo group (1·4 and 1·7 mmHg; P=0·03). In conclusion, the present findings suggest that polyphenol-rich berry juice may contribute to a BP- and BP-variability-lowering effect, being more pronounced in hypertensive than in normotensive subjects.
Dietary Adjustments to Altitude Training in Elite Endurance Athletes; Impact of a Randomized Clinical Trial With Antioxidant-Rich Foods
Background: Altitude training stresses several physiological and metabolic processes and alters the dietary needs of the athletes. International Olympic Committee (IOC)'s Nutrition Expert Group suggests that athletes should increase intake of energy, carbohydrate, iron, fluid, and antioxidant-rich foods while training at altitude.
Objective: We investigated whether athletes adjust their dietary intake according to the IOC's altitude-specific dietary recommendations, and whether an in-between meal intervention with antioxidant-rich foods altered the athletes' dietary composition and nutrition-related blood parameters (mineral, vitamin, carotenoid, and hormone concentrations).
Design: The dietary adjustments to altitude training (3 weeks at 2,320 m) were determined for 31 elite endurance athletes (23 ± 5 years, 23 males, 8 females) by six interviewer-administered 24-h dietary recalls on non-consecutive days: three before and three during the altitude camp. The additional effect of an in-between meal intervention with eucaloric antioxidant-rich or control snacks (1,000 kcal/day) was tested in a randomized controlled trial with parallel design.
Results: At altitude the athletes increased their energy intake by 35% (1,430 ± 630 kcal/day, p < 0.001), with the provided snacks accounting for 70% of this increase. Carbohydrate intake increased from 6.5 ± 1.8 g/kg body weight (BW) (50 E%) to 9.3 ± 2.1 g/kg BW (53 E%) (p < 0.001), with no difference between the antioxidant and control group. Dietary iron, fluid, and antioxidant-rich food intake increased by 37, 38, and 104%, respectively, in the whole cohort. The intervention group had larger increases in polyunsaturated fatty acids (PUFA), ω3 PUFA (n-3 fatty acids), ω6 PUFA (n-6 fatty acids), fiber, vitamin C, folic acid, and copper intake, while protein intake increased more among the controls, reflecting the nutritional content of the snacks. Changes in the measured blood minerals, vitamins, and hormones were not differentially affected by the intervention except for the carotenoid zeaxanthin, which increased more in the intervention group (p < 0.001).
Conclusions: Experienced elite endurance athletes increased their daily energy, carbohydrate, iron, fluid, and antioxidant-rich food intake during a 3-week training camp at moderate altitude, meeting most of the altitude-specific dietary recommendations. The intervention with antioxidant-rich snacks improved the composition of the athletes' diets but had minimal impact on the measured nutrition-related blood parameters.
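The abstract expresses carbohydrate intake both in g/kg body weight and as a share of total energy (E%); the conversion uses the ~4 kcal/g energy density of carbohydrate. A minimal sketch for a hypothetical 70 kg athlete (body weight and total energy values are illustrative assumptions, not the study's data):

```python
CARB_KCAL_PER_G = 4  # approximate energy density of carbohydrate

def carb_e_percent(g_per_kg: float, body_weight_kg: float,
                   total_kcal: float) -> float:
    """Share of total daily energy supplied by carbohydrate, in E%."""
    carb_kcal = g_per_kg * body_weight_kg * CARB_KCAL_PER_G
    return 100 * carb_kcal / total_kcal

# Hypothetical 70 kg athlete at assumed energy intakes:
# 6.5 g/kg at ~3640 kcal/day pre-altitude, 9.3 g/kg at ~4913 kcal/day at altitude
print(round(carb_e_percent(6.5, 70, 3640)))  # prints 50
print(round(carb_e_percent(9.3, 70, 4913)))  # prints 53
```

Note that although absolute carbohydrate intake rose by ~43% (6.5 to 9.3 g/kg), the E% barely moved (50 to 53 E%), because total energy intake rose almost proportionally.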
Validity of bioelectrical impedance analysis in estimation of fat-free mass in colorectal cancer patients.
Background & aims: Bioelectrical impedance analysis (BIA) is an accessible and cheap method to measure fat-free mass (FFM). However, BIA estimates are subject to uncertainty in patient populations with altered body composition and hydration. The aim of the current study was to validate a whole-body and a segmental BIA device against dual-energy X-ray absorptiometry (DXA) in colorectal cancer (CRC) patients, and to investigate the ability of different empiric equations for BIA to predict DXA FFM (FFMDXA).
Methods: Forty-three non-metastatic CRC patients (aged 50–80 years) were enrolled in this study. Whole-body and segmental BIA FFM estimates (FFMwhole-bodyBIA, FFMsegmentalBIA) were calculated using 14 empiric equations, including the equations from the manufacturers, before comparison to FFMDXA estimates.
Results: Strong linear relationships were observed between FFMBIA and FFMDXA estimates for all equations (R2 = 0.94–0.98 for both devices). However, there were large discrepancies in FFM estimates depending on the equation used, with mean differences in the ranges −6.5 to 6.8 kg and −11.0 to 3.4 kg for whole-body and segmental BIA, respectively. For whole-body BIA, 77% of BIA-derived FFM estimates were significantly different from FFMDXA, whereas for segmental BIA, 85% were significantly different. For whole-body BIA, the Schols* equation gave the highest agreement with FFMDXA, with a mean difference ± SD of −0.16 ± 1.94 kg (p = 0.582). The manufacturer's equation gave a small overestimation of FFM of 1.46 ± 2.16 kg (p < 0.001), with a tendency towards proportional bias (r = 0.28, p = 0.066). For segmental BIA, the Heitmann* equation gave the highest agreement with FFMDXA (0.17 ± 1.83 kg; p = 0.546). Using the manufacturer's equation, no difference in FFM estimates was observed (−0.34 ± 2.06 kg; p = 0.292); however, a clear proportional bias was detected (r = 0.69, p < 0.001). Both devices demonstrated acceptable ability to detect low FFM compared to DXA when the optimal equation was used.
Conclusion: In a population of non-metastatic CRC patients, mostly consisting of Caucasian adults with a wide range of body composition measures, both the whole-body and the segmental BIA device provide FFM estimates comparable to FFMDXA at the group level when the appropriate equations are applied. At the individual level (i.e., in clinical practice), BIA may be a valuable tool to identify patients with low FFM as part of a malnutrition diagnosis.
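Agreement statistics of the "mean difference ± SD" form reported above are typically computed Bland-Altman style from paired BIA and DXA estimates. A minimal sketch with synthetic paired data (the numbers below are invented for illustration, not the study's measurements):

```python
from statistics import mean, stdev

# Synthetic paired fat-free mass estimates (kg), illustration only
ffm_bia = [52.1, 48.3, 60.7, 45.9, 55.2]
ffm_dxa = [51.8, 49.0, 60.1, 46.4, 54.7]

# Bland-Altman style agreement: per-subject difference, then mean +/- SD
diffs = [b - d for b, d in zip(ffm_bia, ffm_dxa)]
bias = mean(diffs)     # systematic over/underestimation by BIA
spread = stdev(diffs)  # variability of the disagreement
# 95% limits of agreement: range within which ~95% of individual
# BIA-DXA differences are expected to fall
loa = (bias - 1.96 * spread, bias + 1.96 * spread)

print(f"mean difference = {bias:.2f} +/- {spread:.2f} kg")
print(f"95% limits of agreement: {loa[0]:.2f} to {loa[1]:.2f} kg")
```

This distinction is exactly why the conclusion separates group-level and individual-level use: a near-zero mean difference (small bias) can coexist with wide limits of agreement, so an equation that looks unbiased on average may still misestimate individual patients.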