Cash or Food? Which Works Better to Improve Nutrition Status and Treatment Adherence for HIV Patients Starting Antiretroviral Therapy
The overall objective of this DFID-funded study was to understand whether cash or food transfers
were more effective for HIV-positive individuals starting antiretroviral therapy (ART) in improving nutrition, health status and adherence to ART. HIV-positive individuals initiating ART at the St Francis Mission Hospital in Katete District, Eastern Province, were randomly allocated to two treatment groups (cash and food), and given a food basket or its cash equivalent monthly, for eight months. Both treatment groups saw significant increases (p < 0.001) in Body Mass Index (BMI), Household Dietary Diversity Score, good adherence to ART, and mean CD4 count, but there were no significant differences between the two treatment groups in these measures. The study concluded that the provision of cash or food for eight months when clients start ART confers similar and significantly positive effects in improving clients' nutrition and health. Providing cash is likely to be more cost-effective.
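The headline comparisons in this abstract reduce to within-arm changes over the transfer period and a between-arm difference in those changes. The sketch below shows that kind of comparison for one endpoint; the dataset and column names ("arm", "bmi_baseline", "bmi_endline") are hypothetical, and this is not the study's analysis code.

```python
# Illustrative sketch only (not the study's analysis code): within-arm change and
# between-arm comparison for one endpoint. File and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("art_transfer_trial.csv")  # hypothetical dataset
df["bmi_change"] = df["bmi_endline"] - df["bmi_baseline"]

for arm in ("cash", "food"):
    sub = df[df["arm"] == arm]
    # Paired t-test of endline vs baseline BMI within the arm
    t, p = stats.ttest_rel(sub["bmi_endline"], sub["bmi_baseline"])
    print(f"{arm}: mean BMI change {sub['bmi_change'].mean():.2f}, p = {p:.4f}")

# Between-arm comparison of the changes (Welch's t-test)
cash = df.loc[df["arm"] == "cash", "bmi_change"]
food = df.loc[df["arm"] == "food", "bmi_change"]
t, p = stats.ttest_ind(cash, food, equal_var=False)
print(f"cash vs food difference in change: p = {p:.4f}")
```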
Children's and adolescents' rising animal-source food intakes in 1990–2018 were impacted by age, region, parental education and urbanicity
Animal-source foods (ASF) provide nutrition for children and adolescents' physical and cognitive development. Here, we use data from the Global Dietary Database and Bayesian hierarchical models to quantify global, regional and national ASF intakes between 1990 and 2018 by age group across 185 countries, representing 93% of the world's child population. Mean ASF intake was 1.9 servings per day, with 16% of children consuming at least three daily servings. Intake was similar between boys and girls, but higher among urban children with educated parents. Consumption varied by age, from 0.6 servings per day at <1 year to 2.5 servings per day at 15–19 years. Between 1990 and 2018, mean ASF intake increased by 0.5 servings per week, with increases in all regions except sub-Saharan Africa. In 2018, total ASF consumption was highest in Russia, Brazil, Mexico and Turkey, and lowest in Uganda, India, Kenya and Bangladesh. These findings can inform policy to address malnutrition through targeted ASF consumption programmes.
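The Global Dietary Database estimation machinery is far more elaborate than anything that fits here, but the partial-pooling idea behind a Bayesian hierarchical intake model can be sketched with a toy example. The sketch below uses synthetic data and assumes PyMC as the modelling library; none of the numbers or names come from the paper.

```python
# Toy partial-pooling sketch of a hierarchical intake model -- not the GDD model.
# Synthetic data; PyMC is assumed as the modelling library.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_regions, n_per_region = 6, 30
region_idx = np.repeat(np.arange(n_regions), n_per_region)
true_region_means = rng.normal(1.9, 0.4, n_regions)        # servings/day
intake = rng.normal(true_region_means[region_idx], 0.6)     # observed child-level intakes

with pm.Model() as model:
    mu_global = pm.Normal("mu_global", mu=2.0, sigma=1.0)           # global mean intake
    sigma_region = pm.HalfNormal("sigma_region", sigma=1.0)         # between-region spread
    mu_region = pm.Normal("mu_region", mu=mu_global,
                          sigma=sigma_region, shape=n_regions)      # partially pooled region means
    sigma_obs = pm.HalfNormal("sigma_obs", sigma=1.0)
    pm.Normal("obs", mu=mu_region[region_idx], sigma=sigma_obs, observed=intake)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# Posterior means of the region-level intakes
print(idata.posterior["mu_region"].mean(dim=("chain", "draw")).values)
```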
Incident type 2 diabetes attributable to suboptimal diet in 184 countries
The global burden of diet-attributable type 2 diabetes (T2D) is not well established. This risk assessment model estimated T2D incidence among adults attributable to direct and body weight-mediated effects of 11 dietary factors in 184 countries in 1990 and 2018. In 2018, suboptimal intake of these dietary factors was estimated to account for 14.1 million (95% uncertainty interval (UI), 13.8–14.4 million) incident T2D cases, representing 70.3% (68.8–71.8%) of new cases globally. The largest T2D burdens were attributable to insufficient whole-grain intake (26.1% (25.0–27.1%)), excess refined rice and wheat intake (24.6% (22.3–27.2%)) and excess processed meat intake (20.3% (18.3–23.5%)). Across regions, the highest proportional burdens were in central and eastern Europe and central Asia (85.6% (83.4–87.7%)) and Latin America and the Caribbean (81.8% (80.1–83.4%)), and the lowest were in South Asia (55.4% (52.1–60.7%)). Proportions of diet-attributable T2D were generally larger in men than in women and were inversely correlated with age. Diet-attributable T2D was generally larger among urban versus rural residents and higher versus lower educated individuals, except in high-income countries, central and eastern Europe and central Asia, where burdens were larger in rural residents and in lower educated individuals. Compared with 1990, global diet-attributable T2D increased by 2.6 absolute percentage points (8.6 million more cases) in 2018, with variation in these trends by world region and dietary factor. These findings inform nutritional priorities and clinical and public health planning to improve dietary quality and reduce T2D globally.
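The attributable-burden logic in this kind of comparative risk assessment rests on the standard population attributable fraction (PAF). A minimal sketch follows, using purely illustrative prevalence and relative-risk values, not figures from the paper.

```python
# Standard PAF calculation for a binary dietary exposure; all numbers below are
# illustrative placeholders, not estimates from the paper.
def paf(p_exposed: float, rr: float) -> float:
    """PAF for a binary exposure: P(RR - 1) / (1 + P(RR - 1))."""
    excess = p_exposed * (rr - 1.0)
    return excess / (1.0 + excess)

p_low_wholegrain = 0.70        # hypothetical proportion with insufficient whole-grain intake
rr_t2d = 1.35                  # hypothetical relative risk of T2D for that exposure
incident_cases = 20_000_000    # hypothetical total incident T2D cases

fraction = paf(p_low_wholegrain, rr_t2d)
print(f"PAF = {fraction:.1%}; attributable cases = {fraction * incident_cases:,.0f}")
```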
Impact of biofortified maize consumption on serum carotenoid concentrations in Zambian children
Biofortified maize, designed as an intervention strategy to prevent vitamin A deficiency, can provide upwards of 15 μg β-carotene per g dry weight. Some varieties also have elevated concentrations of other carotenoids. We conducted a cluster-randomized, controlled feeding trial in rural Zambia to test the impact of daily consumption of biofortified maize over a 6-month period on vitamin A status. Serum concentrations of retinol and carotenoids were assessed by high-performance liquid chromatography. Data on circulating carotenoids by intervention group in 679 children are reported here. As previously shown, consumption of this β-carotene-rich maize significantly improved serum β-carotene concentrations (0.273 vs. 0.147 μmol/L, p < 0.001, in this subset of children). Here we show significant increases in α-carotene, β-cryptoxanthin, and zeaxanthin (p < 0.001). There was no impact on lutein or lycopene concentrations. Consumption of biofortified maize can have broader implications beyond the control of vitamin A deficiency (trial registration: NCT01695148).
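For a cluster-randomized design such as this, one simple, design-consistent analysis is to average the outcome within each cluster and then compare cluster means between arms. The sketch below assumes a hypothetical dataset and column names ("cluster", "arm", "beta_carotene_umol_l"); it is not the trial's analysis code.

```python
# Cluster-level comparison sketch for a cluster-randomized trial (hypothetical data).
import pandas as pd
from scipy import stats

df = pd.read_csv("maize_trial_serum.csv")  # hypothetical dataset

# Average the outcome within each cluster to respect the unit of randomization
cluster_means = (df.groupby(["cluster", "arm"], as_index=False)["beta_carotene_umol_l"]
                   .mean())

orange = cluster_means.loc[cluster_means["arm"] == "biofortified", "beta_carotene_umol_l"]
white = cluster_means.loc[cluster_means["arm"] == "control", "beta_carotene_umol_l"]

t, p = stats.ttest_ind(orange, white, equal_var=False)  # Welch's t-test on cluster means
print(f"biofortified {orange.mean():.3f} vs control {white.mean():.3f} umol/L, p = {p:.4f}")
```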
Age-specific differences in the magnitude of malaria-related anemia during low and high malaria seasons in rural Zambian children
Background: Malaria causes anemia by destruction of red blood cells and inhibition of erythropoiesis.
Objective: We assessed whether the magnitude of the malaria-specific effect on anemia differs by age, during low and high malaria seasons.
Methods: In rural Zambian children participating in a pro-vitamin A efficacy trial, we estimated differences in the prevalence of anemia (defined as hemoglobin < 110 g/L for children < 60 months and < 115 g/L in older children) by malaria status and assessed malaria-age interactions. Regression models (with anemia as the outcome) were used to model the malaria-age interaction in both the low and high malaria seasons, controlling for potential confounders.
Results: Average age was 68 months at baseline (n = 820 children). In the low malaria season, anemia prevalence was 29% in malaria-negative children and 54% in malaria-positive children (p < 0.001), with no malaria-age interaction (p = 0.44). In the high malaria season, anemia prevalence was 41% in malaria-negative children and 54% in malaria-positive children (p < 0.001), with a significant malaria-age interaction (p = 0.02 for anemia). Age-stratified prevalence of anemia in malaria-positive versus malaria-negative children was 67.0% versus 37.1% (< 60 months); 57.0% versus 37.2% (60–69 months); 46.8% versus 37.2% (70–79 months); 37.0% versus 37.3% (80–89 months); and 28.0% versus 37.4% (90+ months).
Conclusions: Malarial anemia is most severe in younger children, especially when transmission is intense. Anemia control programs must prioritize this vulnerable group.
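The interaction analysis described above can be expressed as a logistic regression of anemia on malaria, age and their product. The sketch below is only in that spirit; the dataset, column names and confounders ("anemia", "malaria", "age_months", "sex", "stunted") are hypothetical, not the authors' code.

```python
# Sketch of an anemia model with a malaria-by-age interaction (hypothetical data).
# Assumes anemia and malaria are coded 0/1 and age_months is numeric.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("zambia_anemia_low_season.csv")  # hypothetical dataset

model = smf.logit("anemia ~ malaria * age_months + sex + stunted", data=df).fit()
print(model.summary())

# The interaction term tests whether the malaria-anemia association changes with age
print(model.pvalues["malaria:age_months"])
```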
Malaria exacerbates inflammation-associated elevation in ferritin and soluble transferrin receptor with only modest effects on iron deficiency and iron deficiency anaemia among rural Zambian children
Objective: In 4- to 8-year-old Zambian children (n = 744), we evaluated the effects of adjusting for inflammation (α1-acid glycoprotein > 1 g/l), with or without additional adjustment for malaria, on prevalence estimates of iron deficiency (ID) and iron deficiency anaemia (IDA) during low malaria (LowM) and high malaria (HighM) transmission seasons.
Methods: To estimate adjustment factors, children were classified as: (i) reference (malaria negative without inflammation), (ii) inflammation without malaria (I), (iii) malaria without inflammation (M) and (iv) inflammation with malaria (IM). We estimated the unadjusted ID or IDA prevalence, and then adjusted for inflammation alone (ID_I or IDA_I) or for inflammation and malaria (ID_IM or IDA_IM).
Results: Mean ferritin was 38 (reference), 45 (I), 43 (M) and 54 μg/l (IM) in LowM, increasing to 44, 56, 96 and 167 μg/l, respectively, in HighM. Corresponding mean sTfR was 6.4, 6.9, 7.9 and 8.4 mg/l in LowM, increasing to 8.2, 9.2, 8.7 and 9.7 mg/l in HighM. Ferritin-based ID, ID_I and ID_IM were 7.8%, 8.7% and 9.1%, respectively, in LowM and 4.6%, 10.0% and 11.7%, respectively, in HighM. Corresponding soluble transferrin receptor (sTfR)-based estimates were 27.0%, 24.1% and 19.1%, respectively, in LowM, increasing to 53.6%, 46.5% and 45.3%, respectively, in HighM. Additional adjustment for malaria resulted in a ~1- to 2-percentage-point change in IDA, depending on biomarker and season.
Conclusions: In this population, malaria substantially increased ferritin and sTfR concentrations, with modest effects on ID and IDA prevalence estimates.
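The adjustment strategy described above amounts to deriving an internal correction factor for each inflammation/malaria group and rescaling the biomarker before recomputing prevalence. A minimal sketch of that kind of correction-factor approach follows; the dataset, column names and ferritin cutoff (15 μg/l) are assumptions for illustration, not the authors' code.

```python
# Sketch of an internal correction-factor adjustment for ferritin (hypothetical data).
import numpy as np
import pandas as pd

# Hypothetical dataset with columns: ferritin (ug/L), inflammation (bool, AGP > 1 g/L),
# malaria (bool)
df = pd.read_csv("zambia_iron_biomarkers.csv")

def group(row):
    if not row["inflammation"] and not row["malaria"]:
        return "reference"
    if row["inflammation"] and not row["malaria"]:
        return "I"
    if row["malaria"] and not row["inflammation"]:
        return "M"
    return "IM"

df["group"] = df.apply(group, axis=1)

# Correction factor per group: ratio of the reference group's geometric mean ferritin
# to that group's geometric mean ferritin (factor for the reference group is 1)
geo_means = df.groupby("group")["ferritin"].apply(lambda x: np.exp(np.log(x).mean()))
factors = geo_means["reference"] / geo_means

df["ferritin_adj"] = df["ferritin"] * df["group"].map(factors)

CUTOFF = 15.0  # assumed ferritin cutoff (ug/L) for iron deficiency in this age group
print("Unadjusted ID prevalence:", (df["ferritin"] < CUTOFF).mean())
print("Adjusted ID prevalence:  ", (df["ferritin_adj"] < CUTOFF).mean())
```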