
    Dietary vitamin D consumption, sunlight exposure, sunscreen use and parental knowledge of vitamin D sources in a cohort of children aged 1–6 years in North West England

    This article has been accepted for publication and will appear in a revised form, subsequent to peer review and/or editorial input by Cambridge University Press, in Proceedings of the Nutrition Society, published by Cambridge University Press. Copyright The Authors 2015.
    Hospital admissions for children with rickets in England have increased dramatically, from <1 child per 100 000 in the early 1990s to 4·78 (4·58–4·99) per 100 000 between 2007 and 2011( 1 ). The re-emergence of rickets suggests poor vitamin D status( 2 ). Additionally, a plethora of publications have associated low vitamin D status with many adverse health outcomes beyond the classical role of vitamin D in the development, maintenance and function of a healthy skeleton( 3 ). Vitamin D is a lipophilic steroid prohormone obtained from few foods in the diet. However, the majority (90–95%) of vitamin D is synthesised on exposure of bare skin to sunlight( 4 ), and casual sunlight exposure has been considered adequate for the majority of the population. Consequently, there is no reference nutrient intake (RNI) for ages 4–65 years( 5 ). With modern indoor lifestyles, cautious sunscreen usage and changes in food habits, sunlight exposure may no longer be sufficient to maintain adequate vitamin D status. To avoid vitamin D deficiency, supplementation and fortification may need to play a more prominent role in everyday life( 6 ). The aim of the present study was to investigate children's dietary vitamin D intake, parents' knowledge of vitamin D sources, children's outdoor habits and sunscreen application practices. A retrospective, cross-sectional study design was used. Parents of children (n = 42) aged between 1 and 6 years completed a semi-validated food frequency questionnaire, a sources-of-vitamin-D knowledge questionnaire, and a sunlight exposure and sunscreen use questionnaire, in Adlington, North West England (latitude 55°N) during May 2013.
Children's mean (±SD) dietary vitamin D intake was 4·4 ± 2·5 μg/d, significantly lower than 7 μg/d (P < 0·001; for comparison, the RNI of 7 μg/d for ages 3 months–4 years was used). As expected, children taking supplements had a significantly higher mean (±SD) vitamin D intake (8·49 ± 1·78 μg/d) than those who did not (3·34 ± 1·23 μg/d, P < 0·001). The greatest contribution to dietary vitamin D intake from food came from butter and spreads (0·028 μg/d), followed by cakes, biscuits & scones (0·023 μg/d). Parents' knowledge of food sources was poor, with a mean (±SD) incorrect response rate of 76 ± 11·2%. In contrast, 93% correctly identified sunlight exposure as a potential source of vitamin D. Eighty-nine percent of children played outdoors daily for 1 hour or more, 81% used sunscreen with an SPF ≥30 and only 2% rarely applied sunscreen. This study revealed that children's diets in North West England lack sufficient vitamin D, in line with larger surveys( 7 , 8 ). Parents' knowledge of dietary vitamin D sources was poor, but 93% knew that sunlight was the non-dietary source of vitamin D. Reported outdoor play indicated sufficient exposure time to produce endogenous vitamin D, but sunscreen usage may have diminished epidermal UVB exposure. Further research is needed using biomarkers to confirm vitamin D insufficiency, and public health strategies should be implemented to promote existing recommendations regarding supplementation and consumption of vitamin D-rich foods. Additionally, guidelines for safe sun exposure and sunscreen use are required.
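The one-sample comparison against the RNI can be reconstructed from the summary statistics quoted in the abstract (a sketch, not the authors' own analysis; only the mean, SD and sample size above are used):

```python
import math

# Summary statistics reported in the abstract
mean_intake = 4.4   # mean dietary vitamin D intake, ug/d
sd_intake = 2.5     # standard deviation, ug/d
n = 42              # number of children
rni = 7.0           # RNI for ages 3 months-4 years, ug/d

# One-sample t statistic: (sample mean - reference value) / standard error
standard_error = sd_intake / math.sqrt(n)
t_statistic = (mean_intake - rni) / standard_error
# |t| is roughly 6.7 with 41 degrees of freedom, which is consistent
# with the reported P < 0.001
```

A t statistic this far into the tail makes the reported significance unsurprising despite the modest sample size.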

    Shorting the Climate: Fossil Fuel Finance Report Card 2016

    This seventh annual report card on energy financing evaluates top global private sector banks based on their financing for the fossil fuel industry. For 2016, the report has been expanded to cover high-risk subsectors of the oil and gas industry. It also analyzes patterns of private bank financing for coal, oil, and gas projects that have been financially disastrous and inflicted severe damage on communities, ecosystems, and the climate. The report identifies pervasive risk management failures in fossil fuel financing across the North American and European banking sector and calls for a fundamental realignment of bank energy financing to end support for fossil fuel projects and companies that are incompatible with climate stabilization.
    In the past three years, the North American and European commercial and investment banking sector has engaged in fossil fuel financing practices that are deeply at odds with the global climate agreement reached at COP 21 last December. The Paris Climate Agreement's target of limiting warming to 1.5°C (or, at most, 2°C) above pre-industrial levels will require a rapid decarbonization of the global energy system.
Distressingly, levels of fossil fuel financing by major North American and European banks between 2013 and 2015 are incompatible with these climate stabilization targets:
Coal mining - As leaders of climate-vulnerable states called for a global moratorium on new coal mines, top banks financed $42.39 billion for companies active in coal mining, led by Deutsche Bank with $6.73 billion.
Coal power - In spite of a recent study concluding that the current pipeline of planned coal power plants would put the 2°C climate target out of reach by the end of 2017, these banks financed $154 billion for top operators of coal power plants, led by Citigroup with $24.06 billion.
Extreme oil (Arctic, tar sands, and ultra-deep offshore) - Future development of most of these high-cost, high-risk oil reserves is incompatible with even the 2°C target, but banks financed $307 billion for the top owners of the world's untapped "extreme oil" reserves, led by JPMorgan Chase with $37.77 billion.
Liquefied Natural Gas (LNG) export - Banks financed $283 billion, led by JPMorgan Chase with $30.58 billion, for companies involved with LNG export terminals in North America, which have enormous carbon footprints and are stranded assets in the making under a 2°C climate scenario.
Under pressure from global civil society, several U.S. and European banks have announced restrictions on financing for coal since last year. However, most of these policies fall well short of the necessary full phase-out of financing for coal mining and coal power production; as the report's grades for extreme oil and LNG export finance indicate, banks continue to finance these sectors on a nearly unrestricted basis.
Banks also continue to fall distressingly short of their human rights obligations under the United Nations Guiding Principles on Business and Human Rights, leaving them complicit in human rights abuses by several of their corporate clients in the fossil fuel industry.

    Hepatitis C reinfection following treatment induced viral clearance among people who have injected drugs

    Background: Although people who inject drugs (PWID) are an important group to receive Hepatitis C Virus (HCV) antiviral therapy, initiation onto treatment remains low. Concerns over reinfection may make clinicians reluctant to treat this group. We examined the risk of HCV reinfection among a cohort of PWID (encompassing all those reporting a history of injecting drug use) from Scotland who achieved a sustained virological response (SVR). Methods: Clinical and laboratory data were used to monitor RNA testing among PWID who attained an SVR following therapy between 2000 and 2009. Data were linked to morbidity and mortality records. Follow-up began one year after completion of therapy, ending on 31st December, 2012. The frequency of RNA testing during follow-up was calculated and the incidence of HCV reinfection estimated. Cox proportional hazards regression was used to examine factors associated with HCV reinfection. Results: Among 448 PWID with an SVR, 277 (61.8%) were tested during follow-up (median 4.5 years); 191 (69%) received one RNA test and 86 (31%) received at least two RNA tests. There were seven reinfections over 410 person-years, generating a reinfection rate of 1.7/100 py (95% CI 0.7–3.5). For the PWID who had been hospitalised for an opiate- or injection-related cause post SVR (11%), the risk of HCV reinfection was greater [AHR = 12.9, 95% CI 2.2–76.0, p = 0.002] and the reinfection rate was 5.7/100 py (95% CI 1.8–13.3). Conclusion: PWID who have been tested for HCV following SVR in Scotland appear to be at low risk of reinfection. Follow-up and monitoring of this population are warranted as treatment is offered more widely.
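The reported reinfection rate and its 95% interval can be reproduced from the abstract's counts (a sketch assuming the authors used an exact Poisson interval, which the quoted limits are consistent with; scipy is used here only for chi-squared quantiles):

```python
from scipy.stats import chi2

def poisson_rate_ci(events, person_years, per=100, alpha=0.05):
    """Incidence rate with an exact (Garwood) Poisson confidence
    interval, expressed per `per` person-years."""
    rate = events / person_years * per
    # Exact limits via the chi-squared/Poisson relationship
    lower = chi2.ppf(alpha / 2, 2 * events) / (2 * person_years) * per
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * person_years) * per
    return rate, lower, upper

# 7 reinfections over 410 person-years, as reported in the abstract
rate, lo, hi = poisson_rate_ci(7, 410)
# rounds to 1.7/100 py (95% CI 0.7-3.5), matching the reported figures
```

With only seven events, the exact interval is noticeably asymmetric around the point estimate, which is why the upper limit sits twice as far from 1.7 as the lower limit does.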

    A comment on the PCAST report: skip the “match”/“non-match” stage

    This letter comments on the report “Forensic science in criminal courts: Ensuring scientific validity of feature-comparison methods” recently released by the President's Council of Advisors on Science and Technology (PCAST). The report advocates a two-stage procedure for the evaluation of forensic evidence: the first stage is a “match”/“non-match” decision, and the second stage is empirical assessment of sensitivity (correct acceptance) and false alarm (false acceptance) rates. Almost always, quantitative data from feature-comparison methods are continuously valued and have within-source variability. We explain why a two-stage procedure is not appropriate for this type of data, and recommend the use of statistical procedures that are appropriate.
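The continuous style of evaluation the letter favours can be illustrated with a minimal one-dimensional Gaussian likelihood-ratio sketch (all distributions and scores here are hypothetical, not from the letter): instead of declaring “match”/“non-match”, the examiner reports how much more probable the observed comparison score is under the same-source model than under the different-source model.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(score, mu_same, sd_same, mu_diff, sd_diff):
    """LR > 1 supports the same-source proposition; LR < 1 supports
    the different-source proposition. Within-source variability enters
    through sd_same rather than being hidden behind a threshold."""
    return normal_pdf(score, mu_same, sd_same) / normal_pdf(score, mu_diff, sd_diff)

# Hypothetical comparison-score distributions (illustrative values only)
lr = likelihood_ratio(score=0.8, mu_same=0.9, sd_same=0.1,
                      mu_diff=0.3, sd_diff=0.2)
```

A score that a fixed threshold would discard as a borderline “non-match” still carries quantifiable evidential weight under this approach, which is the letter's central point about continuously valued data.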

    Dietary assessment in minority ethnic groups: A systematic review of portion size estimation instruments relevant for the UK

    This is a pre-copyedited, author-produced PDF of an article accepted for publication in Nutrition Reviews following peer review. The version of record, Almiron-Roig, E., Galloway, C., Aitken, A. & Ellahi, B. (2016). Dietary assessment in minority ethnic groups: A systematic review of portion size estimation instruments relevant for the UK. Nutrition Reviews, 75(3), 188-213. DOI: 10.1093/nutrit/nuw058, is available online at: https://academic.oup.com/nutritionreviews/article-lookup/doi/10.1093/nutrit/nuw058
    Context: Dietary assessment in minority ethnic groups is critical for surveillance programmes and for implementing effective interventions. A major challenge is the accurate estimation of portion sizes for traditional foods/dishes. Objective: To systematically review published records up to 2014 describing a portion size estimation element (PSEE) applicable to dietary assessment of UK-residing ethnic minorities. Data sources, selection, extraction: Electronic databases, internet sites, and theses repositories were searched, generating 5683 titles, from which 57 eligible full-text records were reviewed. Data analysis: Forty-two publications aimed at minority ethnic groups (n=20) or autochthonous populations (n=22) were included. The most common PSEE (47%) were combination tools (e.g. food models and portion size lists), followed by portion size lists in questionnaires/guides (19%) and image-based and volumetric tools (17% each). Only 17% of PSEE had been validated against weighed data. Conclusions: When developing ethnic-specific dietary assessment tools it is important to consider customary portion sizes by sex and age, traditional household utensil usage, and population literacy levels. Combining multiple PSEE may increase accuracy, but such tools need validating.

    Virological failure and development of new resistance mutations according to CD4 count at combination antiretroviral therapy initiation

    Objectives: No randomized controlled trials have yet reported an individual patient benefit of initiating combination antiretroviral therapy (cART) at CD4 counts > 350 cells/μL. It is hypothesized that earlier initiation of cART in asymptomatic and otherwise healthy individuals may lead to poorer adherence and subsequently higher rates of resistance development. Methods: In a large cohort of HIV-positive individuals, we investigated the emergence of new resistance mutations upon virological treatment failure according to the CD4 count at the initiation of cART. Results: Of 7918 included individuals, 6514 (82.3%), 996 (12.6%) and 408 (5.2%) started cART with a CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Virological rebound occurred while on cART in 488 (7.5%), 46 (4.6%) and 30 (7.4%) with a baseline CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Only four (13.0%) individuals with a baseline CD4 count > 350 cells/μL who received a resistance test at viral load rebound were found to have developed new resistance mutations. This compared with 107 (41.2%) of those with virological failure who had initiated cART with a CD4 count < 350 cells/μL. Conclusions: We found no evidence of increased rates of resistance development when cART was initiated at CD4 counts above 350 cells/μL. HIV Medicine.