
    Increasing vegetable intakes: rationale and systematic review of published interventions

    Purpose While the health benefits of high fruit and vegetable consumption are well known and considerable work has attempted to improve intakes, increasing evidence recognises a distinction between fruit and vegetables, both in their impacts on health and in consumption patterns. Growing evidence suggests health benefits specifically from high vegetable consumption, yet intakes remain low, and prevalent barriers to increasing intakes make intervention difficult. A systematic review was undertaken to identify from the published literature all studies reporting an intervention to increase intake of vegetables as a distinct food group. Methods The databases PubMed, PsycINFO and Medline were searched over all years of record until April 2015 using pre-specified terms. Results Our searches identified 77 studies, detailing 140 interventions, of which 133 (81 %) were conducted in children. Interventions aimed to use or change hedonic factors, such as taste, liking and familiarity (n = 72), use or change environmental factors (n = 39), use or change cognitive factors (n = 19), or a combination of strategies (n = 10). Increased vegetable acceptance, selection and/or consumption was reported to some degree in 116 (83 %) interventions, but the majority of effects appear small and inconsistent. Conclusions Greater percentage success is currently found with environmental, educational and multi-component interventions, but publication bias is likely, and long-term effects and cost-effectiveness are rarely considered. A focus on long-term benefits and sustained behaviour change is required. Certain population groups are also noticeably absent from the current list of tried interventions.

    Long-term effects of cranial irradiation and intrathecal chemotherapy in treatment of childhood leukemia: a MEG study of power spectrum and correlated cognitive dysfunction

    Background Prophylaxis to prevent relapses in the central nervous system after childhood acute lymphoblastic leukemia (ALL) used to consist of both intrathecal chemotherapy (CT) and cranial irradiation (CRT). CRT was largely abandoned in the 1980s because of its neurotoxicity and replaced with more intensive intrathecal CT. In this study, a group of survivors treated with CRT before 1983 and another group treated without CRT thereafter are investigated 20–25 years later, giving a much stronger perspective on long-term quality of life than previous studies. The outcomes will help to better understand these groups' current needs and will aid in anticipating late effects of prophylactic CRT that is currently applied for other diseases. This study evaluates oscillatory neuronal activity in these long-term survivors. Power spectrum deviations are hypothesized to correlate with cognitive dysfunction. Methods Resting-state eyes-closed magnetoencephalography (MEG) recordings were obtained from 14 ALL survivors treated with CT + CRT, 18 treated with CT alone and 35 controls. Relative spectral power was calculated in the δ, θ, α1, α2, β and γ frequency bands. The Amsterdam Neuropsychological Tasks (ANT) program was used to assess cognition in the executive functions domain. MEG data and ANT scores were correlated. Results In the CT + CRT group, relative θ power was slightly increased (p = 0.069) and α2 power was significantly decreased (p = 0.006). The CT + CRT group performed worse on various cognitive tests. A deficiency in visuomotor accuracy, especially of the right hand, could be clearly associated with the deviating regional θ and α2 powers (0.471 < r < 0.697). A significant association between decreased regional α2 power and fewer attentional fluctuations was found for CT + CRT patients as well as controls (0.078 < r < 0.666). Patients treated with CT alone displayed a power spectrum similar to controls, except for a significantly increased level of left frontal α2 power (p = 0.030). Conclusions The tendency towards global slowing of brain oscillatory activity, together with the fact that dementia has been reported as a late effect of CRT and the neuropsychological deficiencies currently present, suggests that the irradiated brain might be aging faster and could be at risk for early-onset dementia. The CT group showed no signs of early aging.
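    The relative spectral power measure used in this study can be illustrated with a short sketch. This is a generic illustration, not the authors' analysis code: the band edges below are common EEG/MEG conventions assumed for the example (the study's exact definitions may differ), and `freqs`/`psd` are hypothetical inputs (frequency bins in Hz and their spectral power).

```python
# Conventional frequency-band edges in Hz (assumed for illustration only;
# the study's exact band definitions may differ).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha1": (8, 10),
         "alpha2": (10, 13), "beta": (13, 30), "gamma": (30, 48)}

def relative_band_power(freqs, psd):
    """Relative power per band: power summed within the band, divided by
    total power across the whole analysed range (0.5-48 Hz here)."""
    total = sum(p for f, p in zip(freqs, psd) if 0.5 <= f < 48)
    return {band: sum(p for f, p in zip(freqs, psd) if lo <= f < hi) / total
            for band, (lo, hi) in BANDS.items()}

# Toy spectrum: one frequency bin per band
freqs = [1, 5, 9, 11, 20, 40]
psd = [2.0, 1.0, 1.0, 1.0, 3.0, 2.0]
print(relative_band_power(freqs, psd)["theta"])  # -> 0.1
```

    Because the values are normalised by total power, the six relative powers sum to 1 for each recording, which is what makes them comparable across subjects.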

    Population‐based cohort study of outcomes following cholecystectomy for benign gallbladder diseases

    Background The aim was to describe the management of benign gallbladder disease and identify characteristics associated with all‐cause 30‐day readmissions and complications in a prospective population‐based cohort. Methods Data were collected on consecutive patients undergoing cholecystectomy in acute UK and Irish hospitals between 1 March and 1 May 2014. Potential explanatory variables influencing all‐cause 30‐day readmissions and complications were analysed by means of multilevel, multivariable logistic regression modelling using a two‐level hierarchical structure with patients (level 1) nested within hospitals (level 2). Results Data were collected on 8909 patients undergoing cholecystectomy from 167 hospitals. Some 1451 cholecystectomies (16·3 per cent) were performed as an emergency, 4165 (46·8 per cent) as elective operations, and 3293 patients (37·0 per cent) had had at least one previous emergency admission but had surgery on a delayed basis. The readmission and complication rates at 30 days were 7·1 per cent (633 of 8909) and 10·8 per cent (962 of 8909) respectively. Both readmissions and complications were independently associated with increasing ASA fitness grade, duration of surgery, and increasing numbers of emergency admissions with gallbladder disease before cholecystectomy. No identifiable hospital characteristics were linked to readmissions and complications. Conclusion Readmissions and complications following cholecystectomy are common and are associated with patient and disease characteristics.

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
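    The reported model discrimination (c-statistic 0·65) is the probability that a randomly chosen patient who developed AKI received a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of how a c-statistic is computed from predicted risks and observed outcomes (the data below are hypothetical, not the study's):

```python
def c_statistic(preds, outcomes):
    """Concordance statistic: over all pairs of one event (outcome 1) and
    one non-event (outcome 0), the fraction in which the event case
    received the higher predicted risk; ties count as half."""
    events = [p for p, y in zip(preds, outcomes) if y == 1]
    nonevents = [p for p, y in zip(preds, outcomes) if y == 0]
    concordant = 0.0
    for e in events:
        for n in nonevents:
            concordant += 1.0 if e > n else 0.5 if e == n else 0.0
    return concordant / (len(events) * len(nonevents))

# Hypothetical predicted AKI risks and observed outcomes (1 = AKI)
preds = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
outcomes = [1, 1, 0, 1, 0, 0, 1, 0]
print(c_statistic(preds, outcomes))  # -> 0.75
```

    A value of 0.5 means the model discriminates no better than chance and 1.0 means perfect ranking; the bootstrap internal validation mentioned in the abstract re-estimates this statistic on resampled data to correct for optimism.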

    Development and evaluation of a quality score for abstracts

    BACKGROUND: The evaluation of abstracts for scientific meetings has been shown to suffer from poor interobserver reliability. A measure was developed to assess the formal quality of abstract submissions in a standardized way. METHODS: Item selection was based on scoring systems for full reports, taking into account published guidelines for structured abstracts. Interrater agreement was examined using a random sample of submissions to the American Gastroenterological Association, stratified for research type (n = 100, 1992–1995). For construct validity, the association of formal quality with acceptance for presentation was examined. A questionnaire to expert reviewers evaluated sensibility items, such as ease of use and comprehensiveness. RESULTS: The index comprised 19 items. The summary quality scores showed good interrater agreement (intraclass coefficient 0.60–0.81). Good abstract quality was associated with abstract acceptance for presentation at the meeting. The instrument was found to be acceptable by expert reviewers. CONCLUSION: A quality index was developed for the evaluation of scientific meeting abstracts and was shown to be reliable, valid and useful.
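    The interrater agreement reported above is an intraclass correlation. As an illustration only (the authors' exact ICC variant is not stated in the abstract), here is the one-way random-effects form, ICC(1,1), computed from a subjects-by-raters score table:

```python
def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for an n-subjects x k-raters table:
    (MSB - MSW) / (MSB + (k - 1) * MSW), from one-way ANOVA mean squares."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    # Between-subject and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(scores, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical quality scores: 3 abstracts each rated by 2 reviewers
print(round(icc_oneway([[1, 2], [2, 2], [3, 4]]), 3))  # -> 0.733
```

    Like the range quoted in the abstract, values approaching 1 indicate that most score variance lies between abstracts rather than between raters; other ICC variants (two-way, consistency vs. agreement) differ in how rater effects are modelled.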

    The influence of different helminth infection phenotypes on immune responses against HIV in co-infected adults in South Africa

    Background The convergent distribution of the Human Immunodeficiency Virus (HIV) and helminth infections has led to the suggestion that infection with helminths exacerbates the HIV epidemic in developing countries. In South Africa, an estimated 57% of the population lives in poverty and carries the highest burden of both HIV and helminth infections; however, the disease interactions are under-researched. Methods We employed both coproscopy and Ascaris lumbricoides-specific serum IgE to increase diagnostic sensitivity and to distinguish between different helminth infection phenotypes and their effects on immune responses in HIV co-infected individuals. Coproscopy was done by formol-ether and Kato-Katz methods. HIV-positive and HIV-negative adults were stratified according to the presence or absence of A. lumbricoides and/or Trichuris trichiura eggs, with or without elevated Ascaris IgE. Lymphocyte subsets were phenotyped by flow cytometry. Viral loads, serum total IgE and eosinophils were also analysed. Lymphocyte activation markers (CCR5, HLA-DR, CD25, CD38 and CD71) were determined. Non-parametric statistics were used to describe differences in the variables between the subgroups. Results Helminth prevalence ranged between 40% and 60%. Four distinct subgroups were identified: egg positive/high Ascaris-specific IgE (egg+IgEhi), egg positive/low IgE (egg+IgElo), egg negative/high IgE (egg-IgEhi) and egg negative/low IgE (egg-IgElo). The egg+IgEhi subgroup displayed lymphocytopenia, eosinophilia, low CD4+ counts (in the HIV-negative group), high viral load (in the HIV-positive group) and an activated lymphocyte profile. The high Ascaris IgE subgroups (egg+IgEhi and egg-IgEhi) had eosinophilia, the highest viral loads, and lower CD4+ counts in the HIV-negative group. Egg excretion with low IgE (egg+IgElo) was associated with a modified Th2 immune profile and a relatively competent response to HIV. Conclusions People with both helminth egg excretion and high Ascaris IgE levels had dysregulated immune cells and high viral loads with more immune activation. A modified Th2 helminth response in individuals with egg-positive stools and low Ascaris IgE showed a better HIV-related immune profile. Future research on helminth-HIV co-infection should include parasite-specific IgE measurements in addition to coproscopy to delineate the different response phenotypes. Helminth infection affects the immune response to HIV in some individuals with high IgE and egg excretion in stool.

    Prolonged Application of High Fluid Shear to Chondrocytes Recapitulates Gene Expression Profiles Associated with Osteoarthritis

    BACKGROUND: Excessive mechanical loading of articular cartilage, producing hydrostatic stress, tensile strain and fluid flow, leads to irreversible cartilage erosion and osteoarthritic (OA) disease. Since application of high fluid shear to chondrocytes recapitulates some of the earmarks of OA, we aimed to screen the gene expression profiles of shear-activated chondrocytes and assess potential similarities with OA chondrocytes. METHODOLOGY/PRINCIPAL FINDINGS: Using cDNA microarray technology, we screened the differentially regulated genes in human T/C-28a2 chondrocytes subjected to high fluid shear (20 dyn/cm(2)) for 48 h and 72 h relative to static controls. Confirmation of the expression patterns of select genes was obtained by qRT-PCR. Using significance analysis of microarrays with a 5% false discovery rate, 71 and 60 non-redundant transcripts were identified to be ≥2-fold up-regulated and ≤0.6-fold down-regulated, respectively, in sheared chondrocytes. Published data sets indicate that 42 of these genes, which are related to extracellular matrix/degradation, cell proliferation/differentiation, inflammation and cell survival/death, are differentially regulated in OA chondrocytes. In view of the pivotal role of cyclooxygenase-2 (COX-2) in the pathogenesis and/or progression of OA in vivo and in the regulation of shear-induced inflammation and apoptosis in vitro, we identified a collection of genes that are either up- or down-regulated by shear-induced COX-2. COX-2 and L-prostaglandin D synthase (L-PGDS) induce reactive oxygen species production, and negatively regulate genes of the histone and cell cycle families, which may play a critical role in chondrocyte death. CONCLUSIONS/SIGNIFICANCE: Prolonged application of high fluid shear stress to chondrocytes recapitulates gene expression profiles associated with osteoarthritis. Our data suggest a potential link between exposure of chondrocytes/cartilage to abnormal mechanical loading and the pathogenesis/progression of OA.
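    The ≥2-fold / ≤0.6-fold thresholds above amount to a simple fold-change filter applied to sheared-versus-static expression ratios. A minimal sketch of that filtering step (the gene names and intensity values are hypothetical, and the study's additional significance analysis of microarrays at a 5% false discovery rate is not reproduced here):

```python
def classify_fold_change(sheared, static, up=2.0, down=0.6):
    """Label each transcript by its fold change (sheared / static):
    >= 2.0-fold -> 'up', <= 0.6-fold -> 'down', otherwise 'unchanged'."""
    labels = {}
    for gene, value in sheared.items():
        fc = value / static[gene]
        labels[gene] = "up" if fc >= up else "down" if fc <= down else "unchanged"
    return labels

# Hypothetical normalised intensities for three transcripts
sheared = {"geneA": 40.0, "geneB": 3.0, "geneC": 11.0}
static = {"geneA": 10.0, "geneB": 10.0, "geneC": 10.0}
print(classify_fold_change(sheared, static))
# -> {'geneA': 'up', 'geneB': 'down', 'geneC': 'unchanged'}
```

    In practice the fold-change filter is combined with a statistical test such as SAM, since a large ratio alone does not establish that a change is reproducible across replicates.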

    Change & Maintaining Change in School Cafeterias: Economic and Behavioral-Economic Approaches to Increasing Fruit and Vegetable Consumption

    Developing a daily habit of consuming fruits and vegetables (FV) in children is an important public-health goal. Eating habits acquired in childhood are predictive of adolescent and adult dietary patterns. Thus, healthy eating patterns developed early in life can protect the individual against a number of costly health deficits and may reduce the prevalence of obesity. At present, children in the United States (US) under-consume FV despite having access to them through the National School Lunch Program. Because access is an obstacle to developing healthy eating habits, particularly in low-income households, targeting children's FV consumption in schools has the advantage of near-universal FV availability among more than 30 million US children. This chapter reviews economic and behavioral-economic approaches to increasing FV consumption in schools. Inclusion criteria include objective measurement of FV consumption (e.g., plate-waste measures) and minimal demand characteristics. Simple but effective interventions include (a) increasing the variety of vegetables served, (b) serving sliced instead of whole fruits, (c) scheduling lunch after recess, and (d) giving children at least 25 minutes to eat. Improving the taste of FV and offering short-term incentives for consuming gradually increasing amounts can produce large increases in consumption of these foods. A low-cost game-based incentive program may increase the practicality of the latter strategy.