
    Serving Fruits and Vegetables in Kid-Friendly Shapes Increased Fruit and Vegetable Consumption in Preschool Children Aged 2-5 Years

    Background. With childhood obesity rising and taste preferences being shaped at an early age, it is more important than ever to encourage children to eat well. Objective. To determine whether cutting fruits and vegetables (F&Vs) into kid-friendly shapes would increase consumption in preschool-aged children. Design. The four-week observational study took place at Loma Linda Children Academy in two stages during the normal lunch hour. Baseline data were recorded during the first two weeks, and intervention data were collected a month later during the last two weeks. The menus served remained exactly the same, except that during intervention weeks the F&Vs were cut into one of eight kid-friendly designs. Participants. A convenience sample of healthy preschool-aged children (both male and female) between the ages of 2 and 5 was served hot lunch by the school's food service provider, which served approximately 30 students per day, Monday through Thursday. Children with allergies were excluded from the study. Main outcome measures. Daily measurements were pooled by age group (2-3, 3-4, 4-5) and gender, generating approximately 6 data points per day for statistical comparison. This translates into a sample size of 46 data points during baseline weeks and 45 during intervention weeks, for a total of 91. Consumption was calculated as F&V consumed divided by total F&V served and reported as a percentage. Statistical analysis performed. Results were analyzed using an independent t-test and a three-way ANOVA comparing variables. Results. Overall, serving F&Vs in kid-friendly shapes increased the preschoolers' intake by 10.8% compared with unshaped F&Vs, regardless of age and gender (p = .02). Conclusion. The results of this study may be helpful to parents and caregivers dealing with picky eaters and may be applied in school food service programs to increase F&V selection, displace empty calories, and ultimately benefit the young population.
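
    The comparison reported above can be illustrated with a minimal sketch: an independent-samples t-test on daily consumption percentages (F&V consumed / F&V served x 100) pooled by age group and gender. The arrays below are hypothetical placeholders, not the study's data.

        # Minimal sketch of the reported comparison; values are placeholders, not study data.
        import numpy as np
        from scipy import stats

        baseline = np.array([42.0, 55.3, 61.1, 48.7, 50.2, 57.6])      # hypothetical baseline consumption (%)
        intervention = np.array([58.4, 62.0, 70.5, 59.9, 63.1, 61.8])  # hypothetical intervention consumption (%)

        # Independent-samples t-test on the pooled daily data points
        t_stat, p_value = stats.ttest_ind(intervention, baseline)
        mean_diff = intervention.mean() - baseline.mean()
        print(f"mean increase = {mean_diff:.1f} percentage points, p = {p_value:.3f}")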

    Semi-automated surface water detection with synthetic aperture radar data: A wetland case study

    In this study, a new method is proposed for semi-automated surface water detection using synthetic aperture radar data via a combination of radiometric thresholding and image segmentation based on the simple linear iterative clustering superpixel algorithm. Consistent intensity thresholds are selected by assessing the statistical distribution of backscatter values applied to the mean of each superpixel. Higher-order texture measures, such as variance, are used to improve accuracy by removing false positives via an additional thresholding process that identifies the boundaries of water bodies. Results applied to quad-polarized RADARSAT-2 data show that the threshold value for the variance texture measure can be approximated using a constant value for different scenes, and thus it can be used in a fully automated cleanup procedure. Compared to similar approaches, errors of omission and commission are improved with the proposed method. For example, we observed that a threshold-only approach consistently underestimates the extent of water bodies compared to combined thresholding and segmentation, mainly due to the poor performance of the former at the edges of water bodies. The proposed method can be used for monitoring changes in surface water extent within wetlands or other areas, and while presented for use with radar data, it can also be used to detect surface water in optical images.
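
    The combination of radiometric thresholding and superpixel segmentation described above can be sketched roughly as follows, using scikit-image's SLIC implementation. The threshold values, the single-band backscatter input, and the per-superpixel variance check are assumptions for illustration, not the authors' implementation.

        # Rough sketch: label superpixels as water when their mean backscatter is low
        # (dark, smooth surfaces) and their internal variance is small; the variance
        # check removes false positives at rough water-body edges.
        import numpy as np
        from skimage.segmentation import slic

        def detect_water(backscatter_db, n_segments=5000,
                         intensity_thresh=-18.0, variance_thresh=3.0):
            # SLIC over the single-band image (channel_axis=None treats it as grayscale)
            segments = slic(backscatter_db.astype(float), n_segments=n_segments,
                            compactness=0.1, channel_axis=None)
            water_mask = np.zeros(backscatter_db.shape, dtype=bool)
            for label in np.unique(segments):
                pixels = backscatter_db[segments == label]
                if pixels.mean() < intensity_thresh and pixels.var() < variance_thresh:
                    water_mask[segments == label] = True
            return water_mask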

    Moving to the RADARSAT Constellation Mission: Comparing Synthesized Compact Polarimetry and Dual Polarimetry Data with Fully Polarimetric RADARSAT-2 Data for Image Classification of Peatlands

    For this research, the Random Forest (RF) classifier was used to evaluate the potential of simulated RADARSAT Constellation Mission (RCM) data for mapping landcover within peatlands. Alfred Bog, a large peatland complex in Southern Ontario, was used as a test case. The goal of this research was to prepare for the launch of the upcoming RCM by evaluating three simulated RCM polarizations for mapping landcover within peatlands. We examined (1) whether a lower RCM noise equivalent sigma zero (NESZ) affects classification accuracy, (2) which variables are most important for classification, and (3) whether classification accuracy is affected by the use of simulated RCM data in place of the fully polarimetric RADARSAT-2. Results showed that the two RCM NESZs (−25 dB and −19 dB) and three polarizations (compact polarimetry, HH+HV, and VV+VH) that were evaluated were all able to achieve acceptable classification accuracies when combined with optical data and a digital elevation model (DEM). Optical variables were consistently ranked as the most important for mapping landcover within peatlands.
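
    As context for the classification workflow described above, the sketch below shows a generic Random Forest land-cover classification on a stacked table of SAR, optical, and DEM predictors. The feature names and data are hypothetical placeholders, not the study's variables or pipeline.

        # Generic RF classification sketch with placeholder predictors and labels
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        feature_names = ["HH", "HV", "red", "nir", "ndvi", "dem"]   # assumed predictor stack
        rng = np.random.default_rng(0)
        X_train = rng.random((500, len(feature_names)))             # placeholder predictor table
        y_train = rng.integers(0, 4, 500)                           # placeholder land-cover classes

        rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
        rf.fit(X_train, y_train)

        # Out-of-bag accuracy and per-variable importances are the usual RF diagnostics
        print("OOB accuracy:", rf.oob_score_)
        for name, imp in sorted(zip(feature_names, rf.feature_importances_),
                                key=lambda p: p[1], reverse=True):
            print(f"{name}: {imp:.3f}")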

    A Systematic Approach for Variable Selection With Random Forests: Achieving Stable Variable Importance Values

    Random Forests variable importance measures are often used to rank variables by their relevance to a classification problem and subsequently reduce the number of model inputs in high-dimensional data sets, thus increasing computational efficiency. However, as a result of the way that training data and predictor variables are randomly selected for use in constructing each tree and splitting each node, it is also well known that if too few trees are generated, variable importance rankings tend to differ between model runs. In this letter, we characterize the effect of the number of trees (ntree) and class separability on the stability of variable importance rankings and develop a systematic approach to define the number of model runs and/or trees required to achieve stability in variable importance measures. Results demonstrate that both a large ntree for a single model run and values averaged across multiple model runs with fewer trees are sufficient for achieving stable mean importance values. While the latter is far more computationally efficient, both methods tend to lead to the same ranking of variables. Moreover, the optimal number of model runs differs depending on the separability of classes. Recommendations are made to users regarding how to determine the number of model runs and/or trees that are required to achieve stable variable importance rankings.
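
    A minimal sketch of the kind of stability check discussed above: average importance values over repeated model runs and compare the resulting rankings between independent batches using a rank correlation. The data, the number of runs, and the use of Spearman correlation as the stability measure are illustrative assumptions, not the letter's exact procedure.

        # Sketch: compare variable-importance rankings between two batches of RF runs
        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.random((300, 10))          # placeholder predictors
        y = rng.integers(0, 2, 300)        # placeholder classes

        def mean_importance(n_runs, ntree, seed_offset=0):
            # Average Gini importances over n_runs forests of ntree trees each
            imps = [RandomForestClassifier(n_estimators=ntree,
                                           random_state=seed_offset + r)
                    .fit(X, y).feature_importances_ for r in range(n_runs)]
            return np.mean(imps, axis=0)

        # A high rank correlation between independent batches suggests the
        # importance ranking has stabilised for this n_runs/ntree combination
        rho, _ = spearmanr(mean_importance(10, 100, seed_offset=0),
                           mean_importance(10, 100, seed_offset=100))
        print(f"Spearman rank correlation between batches: {rho:.3f}")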

    Patterns of abundance across geographical ranges as a predictor for responses to climate change: Evidence from UK rocky shores

    Aim: Understanding patterns in the abundance of species across thermal ranges can give useful insights into the potential impacts of climate change. The abundant-centre hypothesis suggests that species will reach peak abundance at the centre of their thermal range, where conditions are optimal, but evidence in support of this hypothesis is mixed and limited in geographical and taxonomic scope. We tested the applicability of the abundant-centre hypothesis across a range of intertidal organisms using a large, citizen science-generated data set. Location: UK. Methods: Species' abundance records were matched with their location within their thermal range. Patterns in abundance distribution for individual species, and across aggregated species abundances, were analysed using Kruskal–Wallis tests and quantile generalized additive models. Results: Individually, invertebrate species showed increasing abundances in the cooler half of the thermal range and decreasing abundances in the warmer half of the thermal range. The overall shape for aggregated invertebrate species abundances reflected a broad peak, with a cool-skewed maximum abundance. Algal species showed little evidence for an abundant-centre distribution individually, but overall the aggregated species abundances suggested a hump-backed abundance distribution. Main Conclusions: Our study follows others in showing mixed support for the abundant-centre hypothesis at an individual species level, but demonstrates increased predictability in species responses when an aggregated overall response is considered.
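
    A minimal sketch of the kind of test reported above: a Kruskal–Wallis comparison of abundance scores across position bins within a species' thermal range. The three bins and the abundance values are invented for illustration and are not the study's data.

        # Kruskal-Wallis test of abundance across thermal-range position bins
        import numpy as np
        from scipy.stats import kruskal

        cool_edge = np.array([2, 3, 3, 4, 2, 3])   # hypothetical abundance scores
        centre    = np.array([4, 5, 4, 5, 5, 4])
        warm_edge = np.array([1, 2, 2, 1, 3, 2])

        h_stat, p_value = kruskal(cool_edge, centre, warm_edge)
        print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")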

    Intended and unintended consequences of the implementation of minimum unit pricing of alcohol in Scotland: a natural experiment

    Background: Scotland was the first country to implement minimum unit pricing for alcohol nationally. Minimum unit pricing aims to reduce alcohol-related harms and to narrow health inequalities. It sets a minimum retail price based on alcohol content, targeting products preferentially consumed by high-risk drinkers. Objectives: This study comprised three components: assessing alcohol consumption and alcohol-related attendances in emergency departments, investigating potential unintended effects of minimum unit pricing on alcohol source and drug use, and exploring changes in public attitudes, experiences and norms towards minimum unit pricing and alcohol use. Design: We conducted a natural experiment using repeated cross-sectional surveys in Scotland (intervention) and North England (control) areas, comparing changes in Scotland following the introduction of minimum unit pricing with changes seen in the north of England over the same period. Difference-in-differences analyses compared intervention and control areas. Focus groups with young people and heavy drinkers, and interviews with professional stakeholders before and after minimum unit pricing implementation in Scotland, allowed exploration of attitudes, experiences and behaviours, stakeholder perceptions and potential mechanisms of effect. Setting: Four emergency departments in Scotland and North England (component 1), six sexual health clinics in Scotland and North England (component 2), and focus groups and interviews in Scotland (component 3). Participants: Research nurses interviewed 23,455 adults in emergency departments, and 15,218 participants self-completed questionnaires in sexual health clinics. We interviewed 30 stakeholders, and 105 individuals participated in focus groups. Intervention: Minimum unit pricing sets a minimum retail price based on alcohol content, targeting products preferentially consumed by high-risk drinkers. Results: The odds ratio for an alcohol-related emergency department attendance following minimum unit pricing was 1.14 (95% confidence interval 0.90 to 1.44; p = 0.272). In absolute terms, we estimated that minimum unit pricing was associated with 258 more alcohol-related emergency department visits (95% confidence interval –191 to 707) across Scotland than would have been the case had minimum unit pricing not been implemented. The odds ratio for illicit drug consumption following minimum unit pricing was 1.04 (95% confidence interval 0.88 to 1.24; p = 0.612). Concerns about harms, including crime and the use of other sources of alcohol, were generally not realised. Stakeholders and the public generally did not perceive price increases or changed consumption. A lack of understanding of the policy may have caused concerns about harms to dependent drinkers among participants from more deprived areas. Limitations: The short interval between policy announcement and implementation left limited time for pre-intervention data collection. Conclusions: Within the emergency departments, there was no evidence of a beneficial impact of minimum unit pricing. Implementation appeared to have been successful, and there was no evidence of substitution from alcohol consumption to other drugs. Drinkers and stakeholders largely reported not noticing any change in price or consumption. The lack of effect observed in these settings in the short term, together with the problem-free implementation, suggests that the price per unit set (£0.50) was acceptable but may be too low. Our evaluation, which itself contains multiple components, is part of a wider programme co-ordinated by Public Health Scotland, and the results should be understood in this wider context. Future work: Repeated evaluation of similar policies in different contexts with varying prices would enable a fuller picture of the relationship between price and impacts. Additional co-authors: Oarabile Molaodi, Michele Open, Chris Patterson, Samantha Perry, Thomas Phillips, Gabriel Schembri, Janet Wilson, Chris Yap, Lyndal Bond, and Alastair H Leylan
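
    The difference-in-differences comparison described above can be illustrated with a rough sketch using a logistic model for a binary outcome (for example, an alcohol-related attendance), where the intervention-by-period interaction carries the difference-in-differences effect. The variable names, simulated data, and model specification are assumptions for illustration, not the study's dataset or analysis code.

        # Difference-in-differences sketch on simulated data (not the study's data)
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 2000
        df = pd.DataFrame({
            "scotland": rng.integers(0, 2, n),   # 1 = intervention area, 0 = control
            "post_mup": rng.integers(0, 2, n),   # 1 = after implementation
        })
        # Simulated binary outcome with a small built-in interaction effect
        logit = (-1.5 + 0.10 * df["scotland"] + 0.05 * df["post_mup"]
                 + 0.10 * df["scotland"] * df["post_mup"])
        df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # The interaction term is the difference-in-differences estimate (a log odds ratio)
        model = smf.logit("outcome ~ scotland * post_mup", data=df).fit(disp=0)
        print("DiD odds ratio:", np.exp(model.params["scotland:post_mup"]))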

    Embodied viewing and Degas’s Little Dancer Aged Fourteen: a multi-disciplinary experiment in eye-tracking and motion capture

    This paper presents a cross-disciplinary project based on an experiment in eye-tracking and motion capture (Sainsbury’s Centre for Visual Arts), which aimed to study viewers’ movements around an iconic sculpture: Edgar Degas’s Little Dancer Aged Fourteen. The experiment studies how viewers respond to this three-dimensional artwork not only by looking at it but also through their own bodily reactions to it, such as by unconsciously mimicking a represented attitude or gesture. We compared two groups of viewers: classically trained dancers and non-dancers. Our hypothesis was that the skills and embodied experiences of the dancers would alter the ways in which they engage bodily with the work compared to the non-dancers. Our underlying research question was: how are vision and the body interlinked in esthetic and kinesthetic experience? This paper does not report results, which are forthcoming; it focuses instead on methodology and provides a commentary on the design and development of the interdisciplinary collaboration behind the project. It explores a collaboration that bridges the humanities and experimental sciences and asks how being confronted with unfamiliar methodologies forces researchers in a given field to critically self-examine the limits and presuppositions of their practices.