Dietary vitamin D consumption, sunlight exposure, sunscreen use and parental knowledge of vitamin D sources in a cohort of children aged 1–6 years in North West England
This article has been accepted for publication and will appear in a revised form, subsequent to peer review and/or editorial input by Cambridge University Press, in Proceedings of the Nutrition Society, published by Cambridge University Press. Copyright The Authors 2015.
Hospital admissions for children with rickets in England have increased dramatically, from <1 child per 100 000 in the early 1990s to 4·78 (4·58–4·99) per 100 000 between 2007 and 2011(1). The re-emergence of rickets thus suggests poor vitamin D status(2). Additionally, a plethora of publications has associated low vitamin D status with many adverse health outcomes beyond the classical role of vitamin D in the development, maintenance and function of a healthy skeleton(3). Vitamin D is a lipophilic steroid prohormone obtained from few foods in the diet. However, the majority (90–95%) of vitamin D is synthesised on exposure of bare skin to sunlight(4), and casual sunlight exposure has been considered adequate for the majority of the population. Consequently, there is no reference nutrient intake (RNI) for ages 4–65 yrs(5). With modern indoor lifestyles, cautious sunscreen usage and changes in food habits, sunlight exposure may no longer be sufficient to maintain adequate vitamin D status. To avoid vitamin D deficiency, supplementation and fortification may need to play a more prominent role in everyday life(6).
The aim of the present study was to investigate children's dietary vitamin D intake, parents' knowledge of vitamin D sources, children's outdoor habits and sunscreen application practices. A retrospective, cross-sectional study design was used. Parents of children (n = 42) aged between 1 and 6 yrs completed a semi-validated food frequency questionnaire, a sources-of-vitamin-D knowledge questionnaire, and a sunlight exposure and sunscreen use questionnaire, in Adlington, NW England (latitude 55°N) during May 2013.
Children's mean (±SD) dietary vitamin D intake was 4·4 ± 2·5 μg/d, significantly lower than 7 μg/d, the RNI for ages 3 months–4 yrs used for comparison (P < 0·001). As expected, children taking supplements had a significantly higher mean (±SD) vitamin D intake (8·49 ± 1·78 μg/d) than those who did not (3·34 ± 1·23 μg/d, P < 0·001). The greatest contribution to dietary vitamin D intake from food was from butter and spreads (0·028 μg/d), followed by cakes, biscuits & scones (0·023 μg/d). Parents' knowledge of food sources was poor, with a mean (±SD) incorrect response rate of 76 ± 11·2%. In contrast, 93% correctly identified sunlight exposure as a potential source of vitamin D. Eighty-nine percent of participants played outdoors daily for 1 hour or more, 81% used sunscreen with an SPF ≥30 and only 2% rarely applied sunscreen.
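The abstract does not state which significance test was used; assuming a one-sample t-test of the reported mean against the RNI (an assumption, using only the summary statistics given above), the test statistic can be reproduced:

```python
import math

# Reported summary statistics from the abstract
n = 42       # children in the cohort
mean = 4.4   # mean dietary vitamin D intake, ug/d
sd = 2.5     # standard deviation, ug/d
rni = 7.0    # RNI for ages 3 months-4 yrs, ug/d

# One-sample t statistic: (sample mean - reference value) / standard error
se = sd / math.sqrt(n)
t = (mean - rni) / se
print(round(t, 2))  # -> -6.74
```

With 41 degrees of freedom, |t| ≈ 6.7 far exceeds the two-tailed critical value of about 2.02 at P = 0·05, consistent with the reported P < 0·001.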
This study revealed that children's diets in NW England lack sufficient vitamin D, in line with larger surveys(7, 8). Parents' knowledge of dietary vitamin D sources was poor, but 93% of parents knew that sunlight is a non-dietary source of vitamin D. Time spent in outdoor play suggested sufficient exposure for endogenous vitamin D synthesis, but sunscreen usage may have diminished epidermal UVB exposure.
Further research is needed using biomarkers to confirm vitamin D insufficiency, and public health strategies should be implemented to promote existing recommendations regarding supplementation and consumption of vitamin D rich foods. Additionally, guidelines for safe sun exposure and sunscreen use are required.
The use of diaries in psychological recovery from intensive care
Intensive care patients frequently experience memory loss, nightmares, and delusional memories, and some may develop symptoms of anxiety, depression, and post-traumatic stress. The use of diaries is emerging as a putative tool to 'fill the memory gaps' and promote psychological recovery. In this review, we critically analyze the available literature on the use of diaries for intensive care patients, specifically examining their impact on patients' recovery. Practice varies widely in the structure, content, and process elements of diaries for intensive care patients, underscoring the lack of an underpinning psychological conceptualization. The use of diaries as an intervention to aid psychological recovery in intensive care patients has been examined in 11 studies, including two randomized controlled trials. Inconsistencies in sample characteristics, study outcomes, study methods, and the diary intervention itself limit the comparisons possible between studies. Measurement of the impact of the diary intervention on patient outcomes has been limited in both scope and time frame. Furthermore, an underpinning conceptualization or rationale for diaries as an intervention has not been articulated or tested. Given these significant limitations, although findings tend to be positive, implementation as routine clinical practice should not occur until a body of evidence is developed to inform methodological considerations and confirm proposed benefits.
Shorting the Climate: Fossil Fuel Finance Report Card 2016
This seventh annual report card on energy financing evaluates top global private sector banks on their financing of the fossil fuel industry. For 2016, the report has been expanded to cover high-risk subsectors of the oil and gas industry. It also analyzes patterns of private bank financing for coal, oil, and gas projects that have been financially disastrous and have inflicted severe damage on communities, ecosystems, and the climate. The report identifies pervasive risk management failures across the North American and European banking sector on fossil fuel financing and calls for a fundamental realignment of bank energy financing to end support for fossil fuel projects and companies that are incompatible with climate stabilization.
In the past three years, the North American and European commercial and investment banking sector has engaged in fossil fuel financing practices that are deeply at odds with the global climate agreement reached at COP 21 last December. The Paris Climate Agreement's target of limiting warming to 1.5°C (or, at most, 2°C) above pre-industrial levels will require a rapid decarbonization of the global energy system.
Distressingly, levels of fossil fuel financing by major North American and European banks between 2013 and 2015 are incompatible with these climate stabilization targets:
Coal mining - As leaders of climate-vulnerable states called for a global moratorium on new coal mines, top banks financed $6.73 billion.
Coal power - In spite of a recent study concluding that the current pipeline of planned coal power plants would put the 2°C climate target out of reach by the end of 2017, these banks financed $24.06 billion.
Extreme oil (Arctic, tar sands, and ultra-deep offshore) - Future development of most of these high-cost, high-risk oil reserves is incompatible with even the 2°C target, but banks financed $37.77 billion.
Liquefied Natural Gas (LNG) export - Banks financed $30.58 billion for companies involved with LNG export terminals in North America, which have enormous carbon footprints and are stranded assets in the making under a 2°C climate scenario.
Under pressure from global civil society, several U.S. and European banks have announced restrictions on financing for coal since last year. However, most of these policies fall well short of the necessary full phase-out of financing for coal mining and coal power production; as the report's grades for extreme oil and LNG export finance indicate, banks continue to finance these sectors on a nearly unrestricted basis. Banks also continue to fall distressingly short of their human rights obligations under the United Nations Guiding Principles on Business and Human Rights, leaving them complicit in human rights abuses by several of their corporate clients in the fossil fuel industry.
Hepatitis C reinfection following treatment-induced viral clearance among people who have injected drugs
Background:
Although people who inject drugs (PWID) are an important group for receiving hepatitis C virus (HCV) antiviral therapy, treatment initiation remains low. Concerns over reinfection may make clinicians reluctant to treat this group. We examined the risk of HCV reinfection among a cohort of PWID (encompassing all those reporting a history of injecting drug use) from Scotland who achieved a sustained virological response (SVR).
Methods:
Clinical and laboratory data were used to monitor RNA testing among PWID who attained SVR following therapy between 2000 and 2009. Data were linked to morbidity and mortality records. Follow-up began one year after completion of therapy, ending on 31 December 2012. The frequency of RNA testing during follow-up was calculated and the incidence of HCV reinfection estimated. Cox proportional hazards regression was used to examine factors associated with HCV reinfection.
Results:
Among 448 PWID with an SVR, 277 (61.8%) were tested during follow-up (median 4.5 years); 191 (69%) received one RNA test and 86 (31%) received at least two. There were seven reinfections over 410 person-years, giving a reinfection rate of 1.7/100 py (95% CI 0.7–3.5). For PWID hospitalised for an opiate- or injection-related cause post-SVR (11%), the risk of HCV reinfection was greater [AHR = 12.9, 95% CI 2.2–76.0, p = 0.002] and the reinfection rate was 5.7/100 py (95% CI 1.8–13.3).
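The headline reinfection rate follows directly from the reported counts (incidence rate = events / person-years at risk, scaled per 100 person-years); a minimal sketch using only the figures above:

```python
# Reproduce the reported reinfection rate from the abstract's counts.
events = 7            # reinfections observed during follow-up
person_years = 410    # total person-years at risk

rate_per_100py = events / person_years * 100
print(round(rate_per_100py, 1))  # -> 1.7
```

The confidence interval (0.7–3.5) would additionally require an exact Poisson interval for 7 events, which is not computable from stdlib alone and is therefore not sketched here.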
Conclusion:
PWID in Scotland who have been tested for HCV following SVR appear to be at low risk of reinfection. Follow-up and monitoring of this population are warranted as treatment is offered more widely.
A comment on the PCAST report: skip the “match”/“non-match” stage
This letter comments on the report “Forensic science in criminal courts: Ensuring scientific validity of feature-comparison methods” recently released by the President's Council of Advisors on Science and Technology (PCAST). The report advocates a two-stage procedure for the evaluation of forensic evidence, in which the first stage is a “match”/“non-match” decision and the second is empirical assessment of sensitivity (correct acceptance) and false alarm (false acceptance) rates. Quantitative data from feature-comparison methods are almost always continuously valued and exhibit within-source variability. We explain why a two-stage procedure is not appropriate for this type of data, and recommend the use of statistical procedures that are appropriate.
Effect of a 2-tier rapid response system on patient outcome and staff satisfaction
Background: Rapid response systems (RRS) have been recommended as a strategy to prevent and treat deterioration in acute care patients. Questions regarding the most effective characteristics of RRS and strategies for implementing these systems remain.
Aims: The aims of this study were to (i) describe the structures and processes used to implement a 2-tier RRS, (ii) determine the comparative prevalence of deteriorating patients and incidence of unplanned intensive care unit (ICU) admission and cardiac arrest prior to and after implementation of the RRS, and (iii) determine clinician satisfaction with the RRS.
Method: A quasi-experimental pre-test, post-test design was used to assess patient-related outcomes and clinician satisfaction prior to and after implementation of a 2-tier RRS in a tertiary metropolitan hospital. Primary components of the RRS included an ICU Outreach Nurse and a Rapid Response Team. Prevalence of deteriorating patients was assessed through a point prevalence assessment and chart audit. Incidence of unplanned admission to ICU and cardiac arrest was obtained from routine hospital databases. Clinician satisfaction was measured through surveys.
Results: The prevalence of patients who met medical emergency call criteria without current treatment fell from 3% prior to RRS implementation to 1% after implementation; a similar reduction from 9% to 3% was identified on chart review. Unplanned admissions to ICU increased slightly, from 17.4/month prior to RRS implementation to 18.1/month after implementation (p = 0.45), while cardiac arrests decreased slightly, from 7.5/month to 5.6/month (p = 0.22); neither change was statistically significant. Staff satisfaction with the RRS was generally high.
Conclusion: The 2-tier RRS was used by staff to assist with the care of deteriorating patients in a large, tertiary hospital. Clinical staff reported high levels of satisfaction.
Dietary assessment in minority ethnic groups: A systematic review of portion size estimation instruments relevant for the UK
This is a pre-copyedited, author-produced PDF of an article accepted for publication in Nutrition Reviews following peer review. The version of record, Almiron-Roig, E., Galloway, C., Aitken, A. & Ellahi, B. (2016). Dietary assessment in minority ethnic groups: A systematic review of portion size estimation instruments relevant for the UK. Nutrition Reviews, 75(3), 188-213. DOI: 10.1093/nutrit/nuw058, is available online at: https://academic.oup.com/nutritionreviews/article-lookup/doi/10.1093/nutrit/nuw058
Context: Dietary assessment in minority ethnic groups is critical for surveillance programmes and for implementing effective interventions. A major challenge is the accurate estimation of portion sizes for traditional foods/dishes. Objective: To systematically review published records up to 2014 describing a portion size estimation element (PSEE) applicable to dietary assessment of UK-residing ethnic minorities. Data sources, selection, extraction: Electronic databases, internet sites, and theses repositories were searched, generating 5683 titles, from which 57 eligible full-text records were reviewed. Data analysis: Forty-two publications aimed at minority ethnic groups (n=20) or autochthonous populations (n=22) were included. The most common PSEE (47%) were combination tools (e.g. food models and portion size lists), followed by portion size lists in questionnaires/guides (19%) and image-based and volumetric tools (17% each). Only 17% of PSEE had been validated against weighed data. Conclusions: When developing ethnic-specific dietary assessment tools, it is important to consider customary portion sizes by sex and age, traditional household utensil usage, and population literacy levels. Combining multiple PSEE may increase accuracy, but such tools need validation.
Virological failure and development of new resistance mutations according to CD4 count at combination antiretroviral therapy initiation
Objectives: No randomized controlled trials have yet reported an individual patient benefit of initiating combination antiretroviral therapy (cART) at CD4 counts > 350 cells/μL. It is hypothesized that earlier initiation of cART in asymptomatic and otherwise healthy individuals may lead to poorer adherence and consequently higher rates of resistance development. Methods: In a large cohort of HIV-positive individuals, we investigated the emergence of new resistance mutations upon virological treatment failure according to the CD4 count at initiation of cART. Results: Of 7918 included individuals, 6514 (82.3%), 996 (12.6%) and 408 (5.2%) started cART with a CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Virological rebound occurred while on cART in 488 (7.5%), 46 (4.6%) and 30 (7.4%) of those with a baseline CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Only four (13.0%) individuals with a baseline CD4 count > 350 cells/μL who received a resistance test at viral load rebound were found to have developed new resistance mutations, compared with 107 (41.2%) of those with virological failure who had initiated cART with a CD4 count ≤ 350 cells/μL. Conclusions: We found no evidence of increased rates of resistance development when cART was initiated at CD4 counts above 350 cells/μL. HIV Medicine
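As a quick arithmetic check, the stratum percentages reported above are consistent with the stated denominator of 7918, and the three strata account for the whole cohort:

```python
# Verify the CD4 stratum counts and percentages reported in the abstract.
total = 7918
strata = {"<=350": 6514, "351-499": 996, ">=500": 408}

assert sum(strata.values()) == total  # strata partition the cohort

for label, n in strata.items():
    print(label, round(n / total * 100, 1))
# -> <=350 82.3
# -> 351-499 12.6
# -> >=500 5.2
```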