Surely you don’t eat parsnip skins? Categorising the edibility of food waste
Food that is either wasted or lost, rather than being eaten, accounts for around a third of global food production and is linked to several environmental, economic and social issues. The reliable quantification of this wasted food is essential to monitor progress towards the United Nations’ Sustainable Development Goal 12.3, which covers food loss and waste. Currently, quantification of food waste is made difficult by many differing definitions, some of which require categorisation of food items into those parts considered edible and those considered inedible. Edibility is difficult to define as it is affected by cultural and social influences. This study presents a novel, easily replicable, questionnaire-based methodology to categorise ‘borderline’ food items thrown away from households, e.g. parsnip skin, apple cores. The methodology captures self-reported information on what people eat (self-reported consumption) and their perceptions of edibility. Our results for the United Kingdom indicate that, for a given food ‘part’, there is divergence between individuals’ responses to the survey questions: e.g. many people would ‘never’ eat carrot skins, whilst many others would ‘always’ eat them. Furthermore, there is a systematic difference between people’s self-reported consumption and their perceptions of edibility. We suggest that both need to be considered to create a balanced categorisation of edible and inedible parts; we propose a method for incorporating both elements. Within this method, a threshold needs to be applied, and the resultant classification, especially of those items close to this threshold, will inevitably be contentious. Despite this, the categorisation of what is considered edible using this methodology reflects the views of the majority of the population, facilitating the quantification of food waste. In addition, we envisage this methodology can be used to compare geographical differences and track changes over time with regard to edibility.
Review: Consumption-stage food waste reduction interventions - What works and how to design better interventions
Food waste prevention has become an issue of international concern, with Sustainable Development Goal 12.3 aiming to halve per capita global food waste at the retail and consumer levels by 2030. However, no review has considered the effectiveness of interventions aimed at preventing food waste in the consumption stages of the food system. Filling this significant gap could help support those working to reduce food waste in the developed world, providing knowledge of which interventions are effective at preventing food waste.
This paper fills this gap, identifying and summarizing food-waste prevention interventions at the consumption/consumer stage of the supply chain via a rapid review of global academic literature from 2006 to 2017.
We identify 17 applied interventions that claim to have achieved food waste reductions; of these, 13 quantified the reductions. Interventions that changed the size or type of plates were shown to be effective (up to 57% food waste reduction) in hospitality environments. Changing nutritional guidelines in schools was reported to reduce vegetable waste by up to 28%, indicating that healthy diets can be part of food waste reduction strategies. Information campaigns were also shown to be effective, with up to 28% food waste reduction in one small-sample intervention.
Cooking classes, fridge cameras, food sharing apps, advertising and information sharing were all reported as being effective but with little or no robust evidence provided. This is worrying as all these methods are now being proposed as approaches to reduce food waste and, except for a few studies, there is no reproducible quantified evidence to assure credibility or success. To strengthen current results, a greater number of longitudinal and larger sample size intervention studies are required. To inform future intervention studies, this paper proposes a standardised guideline, which consists of: (1) intervention design; (2) monitoring and measurement; (3) moderation and mediation; (4) reporting; (5) systemic effects.
Given the importance of food-waste reduction, the findings of this review highlight a significant evidence gap, meaning that it is difficult to make evidence-based decisions to prevent or reduce consumption-stage food waste in a cost-effective manner.
A Comparative Study of Three Different Types of Stem Cells for Treatment of Rat Spinal Cord Injury
Three different sources of human stem cells-bone marrow-derived mesenchymal stem cells (BM-MSCs), neural progenitors (NPs) derived from an immortalized spinal fetal cell line (SPC-01), and induced pluripotent stem cells (iPSCs)-were compared in the treatment of a balloon-induced spinal cord compression lesion in rats. One week after lesioning, the rats received either BM-MSCs (intrathecally) or NPs (SPC-01 cells or iPSC-NPs, both intraspinally), or saline. The rats were assessed for their locomotor skills (BBB, flat beam test, and rotarod). Morphometric analyses of spared white and gray matter, axonal sprouting, and glial scar formation were conducted, and qPCR and Luminex assays were used to detect endogenous gene expression and measure inflammatory cytokine levels, evaluating the host tissue response to stem cell therapy. The highest locomotor recovery was observed in iPSC-NP-grafted animals, which also displayed the highest amount of preserved white and gray matter. Grafted iPSC-NPs and SPC-01 cells significantly increased the number of growth-associated protein 43 (GAP43+) axons, reduced astrogliosis, downregulated Casp3 expression, and increased IL-6 and IL-12 levels. hMSCs transiently decreased levels of inflammatory IL-2 and TNF-alpha. These findings correlate with the short survival of hMSCs, whereas NPs survived for 2 months and matured slowly into glia- and tissue-specific neuronal precursors. SPC-01 cells differentiated more into astroglial phenotypes with a dense implant structure, whereas iPSC-NPs displayed a more neuronal phenotype with a loose graft structure. We concluded that the BBB scores of iPSC-NP- and hMSC-injected rats were superior to those of the SPC-01-treated group. The iPSC-NP treatment of spinal cord injury (SCI) provided the highest recovery of locomotor function due to robust graft survival and its effect on tissue sparing, reduction of glial scarring, and increased axonal sprouting.
A Standardised Procedure for Evaluating Creative Systems: Computational Creativity Evaluation Based on What it is to be Creative
Computational creativity is a flourishing research area, with a variety of creative systems being produced and developed. Creativity evaluation has not kept pace with system development, with an evident lack of systematic evaluation of the creativity of these systems in the literature. This is partially due to difficulties in defining what it means for a computer to be creative; indeed, there is no consensus on this for human creativity, let alone its computational equivalent. This paper proposes a Standardised Procedure for Evaluating Creative Systems (SPECS). SPECS is a three-step process: stating what it means for a particular computational system to be creative, deriving tests based on these statements, and performing those tests. To assist this process, the paper offers a collection of key components of creativity, identified empirically from discussions of human and computational creativity. The SPECS methodology is demonstrated through a comparative case study evaluating computational creativity systems that improvise music.
Survey of the general public's attitudes toward advance directives in Japan: How to respect patients' preferences
BACKGROUND: Japanese people have become increasingly interested in the expression and enhancement of their individual autonomy in decisions made regarding medical treatment at and toward the end of life. However, while many Western countries have implemented legislation that deals with patient autonomy in the case of terminal illness, no such legislation exists in Japan. The rationale for this research is the need to investigate patients' preferences regarding treatment at the end of life in order to re-evaluate advance directives policy and practice. METHODS: We conducted a cross-sectional survey of 418 middle-aged and older adults (aged between 40 and 65) from the general public in Tokyo, Japan. Respondents were asked about their attitudes toward advance directives and their preferences regarding treatment options. RESULTS: Over 60% of respondents agreed that it is better to express their wishes regarding advance directives (treatment preferences in writing, appointment of a proxy for care decision making, appointment of a legal administrator of property, and stating preferences regarding disposal of one's property and funeral arrangements), but less than 10% of them had already done so. About 60% of respondents in this study preferred to indicate treatment preferences in broad rather than concrete terms. Over 80% would like to decide treatment preferences in consultation with others (22.2% with their proxy, 11.0% with the doctor, and 47.8% with both their proxy and the doctor). CONCLUSION: This study revealed that many Japanese people are interested in undertaking advance directives. It also found a range of preferences regarding how advance directives are undertaken; thus it is important that any processes put into place allow flexibility in order to best respect patients' wishes and autonomy.
Characteristics of Nondisabled Older Patients Developing New Disability Associated with Medical Illnesses and Hospitalization
OBJECTIVE: To identify demographic, clinical, and biological characteristics of older nondisabled patients who develop new disability in basic activities of daily living (BADL) during medical illnesses requiring hospitalization. DESIGN: Longitudinal observational study. SETTING: Geriatric and Internal Medicine acute care units. PARTICIPANTS: Data are from 1,686 patients aged 65 and older who were independent in BADL 2 weeks before hospital admission, enrolled in the 1998 survey of the Italian Group of Pharmacoepidemiology in the Elderly Study. MEASUREMENTS: The study outcome was new BADL disability at the time of hospital discharge. Sociodemographic, functional status, and clinical characteristics were collected at hospital admission; acute and chronic conditions were classified according to the International Classification of Diseases, ninth revision; fasting blood samples were obtained and processed with standard methods. RESULTS: At the time of hospital discharge, 113 patients (6.7%) presented new BADL disability. Functional decline was strongly related to patients’ age and preadmission instrumental activities of daily living status. In a multivariate analysis, older age, nursing home residency, low body mass index, elevated erythrocyte sedimentation rate, acute stroke, high level of comorbidity expressed as Cumulative Illness Rating Scale score, polypharmacotherapy, cognitive decline, and history of falls in the previous year were independent and significant predictors of BADL disability. CONCLUSION: Several factors might contribute to loss of physical independence in hospitalized older persons. Preexisting conditions associated with the frailty syndrome, including physical and cognitive function, comorbidity, body composition, and inflammatory markers, characterize patients at high risk of functional decline.