
    "Lose 30lbs in 30 days" : assigning responsibility for deceptive advertising of weight-loss products

    Purpose – The aim of this paper is to outline key social marketing issues apparent in deceptive weight-loss advertising, from the perspective of government policy-makers, manufacturers, the media, and consumers. The purpose is to examine the complexity of one aspect of the obesity battle and provide a framework for coordinated and integrated social marketing initiatives from a multiple-stakeholder perspective. Design/methodology/approach – The results of deceptive weight-loss advertising are framed using the harm chain model, and the paper offers recommended solutions based on a framework of marketing, education and policy changes across the network of stakeholders. Findings – This paper concludes that a resolution to the harm created by deceptive weight-loss advertising can be achieved by the creation of a more holistic, system-wide solution to this important health and policy issue. This networked approach must involve all aspects of harm in a multi-stakeholder solution, including both upstream and downstream integration. Specific recommendations are made for policy-makers, manufacturers, the media, and consumers to achieve this goal. Social implications – From a marketing perspective, analyzing the issue of deceptive weight-loss advertising using the harm chain allows for the creation of a more holistic, system-wide solution involving stakeholders in all aspects of harm for this important health and policy issue. Originality/value – This research examines the problem of obesity and weight-loss advertising from the unique perspective of the harm chain framework. The authors make unified recommendations for various stakeholders, including industry, media, government and consumers, in order to direct integrated social marketing and consumer-oriented strategies within this industry.

    Dissection of Complex Genetic Correlations into Interaction Effects

    Living systems are overwhelmingly complex and consist of many interacting parts. The quantitative characterization of even a single human cell type at the genetic level already requires measuring at least 20,000 gene expressions. Discovering patterns in these signals that represent specific interactions in such systems remains a major challenge for theoretical approaches. One key problem is that available standard procedures summarize gene expressions in a hard-to-interpret way. For example, principal components represent axes of maximal variance in the gene vector space and thus often correspond to a superposition of multiple different gene regulation effects (e.g. I.1.4). Here, a novel approach to analyze and interpret such complex data is developed (Chapter II). It is based on an extremum principle that identifies an axis in the gene vector space to which as many samples as possible are correlated as highly as possible (II.3). This axis is maximally specific and thus most probably corresponds to exactly one gene regulation effect, making it considerably easier to interpret than principal components. To stabilize and optimize effect discovery, axes in the sample vector space are identified simultaneously; genes and samples are always handled symmetrically by the algorithm. While sufficient for effect discovery, effect axes can only linearly approximate regulation laws. To represent a broader class of nonlinear regulations, including saturation effects or activity thresholds (e.g. II.1.1.2), a bimonotonic effect model is defined (II.2.1.2). A corresponding regression is realized that is monotonic over projections of samples (or genes) onto discovered gene (or sample) axes. The resulting effect curves can approximate regulation laws precisely (II.4.1). This enables dissecting exclusively the discovered effect from the signal (II.4.2), while signal parts from other, potentially overlapping effects remain untouched. This continues iteratively.
In this way, the high-dimensional initial signal (II.2.1.1) can be dissected into highly specific effects. Method validation demonstrates that superposed effects of various sizes, shapes and signal strengths can be dissected reliably (II.6.2). Simulated laws of regulation are reconstructed with high correlation. Detection limits, e.g. for signal strength or for missing values, lie above practical requirements (II.6.4). The novel approach is systematically compared with standard procedures such as principal component analysis. Signal dissection is shown to have clear advantages, especially for many overlapping effects of comparable size (II.6.3). Cancer cells are an ideal test field for such approaches, as they may be driven by multiple overlapping and largely unknown gene regulation networks. Additionally, quantifying and classifying cancer cells by their particular set of driving gene regulations is a prerequisite for precision medicine. To validate the novel method against real biological data, it is applied to gene expressions of over 1,000 tumor samples from Diffuse Large B-Cell Lymphoma (DLBCL) patients (Chapter III). Two known subtypes of this disease (cf. I.1.2.1), which show significantly different survival under the same chemotherapy, were themselves originally discovered as a gene expression effect; on the molecular level, these subtypes can only be precisely determined by this effect. These previous results offer an opportunity for method validation, and indeed this effect was rediscovered in an unsupervised manner (III.3.2.2). Several additional biologically relevant effects were discovered and validated across four patient cohorts. Multivariate analyses (III.2) identify combinations of validated effects that can predict significant differences in patient survival. One novel effect possesses an even higher predictive value (cf. III.2.5.1) than the rediscovered subtype effect and is genetically more specific (cf. III.3.3.1).
A trained and validated Cox survival model (III.2.5) can predict significant survival differences within known DLBCL subtypes (III.2.5.6), demonstrating that these subtypes are genetically heterogeneous as well. Detailed biostatistical evaluations of all survival effects (III.3.3) may help to clarify the molecular pathogenesis of DLBCL. Furthermore, the applicability of signal dissection is not limited to biological data. For instance, dissecting spectral energy distributions of stars observed in astrophysics might help discover laws of light emission.
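As a rough illustration of one dissection step described above, the sketch below extracts a single effect from a samples × genes matrix: it picks an effect axis in gene space, fits a monotonic (isotonic) regression of each gene over the sample projections onto that axis, and subtracts the fitted effect so that other overlapping effects remain in the residual. This is only a simplified stand-in for the thesis method: the leading singular vector is used here as a surrogate for the correlation-maximizing axis of the extremum principle, and `dissect_one_effect` is a hypothetical helper name.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def dissect_one_effect(X):
    """Dissect one monotonic effect from a (samples x genes) matrix X.

    Illustrative sketch only: the effect axis is the leading
    right-singular vector (a PCA-like surrogate for the thesis's
    correlation-maximizing axis), and the regulation law is
    approximated by an isotonic regression of each gene over the
    sample projections onto that axis.
    """
    Xc = X - X.mean(axis=0)                    # center each gene
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    axis = Vt[0]                               # gene-space effect axis
    proj = Xc @ axis                           # sample projections
    order = np.argsort(proj)
    effect = np.zeros_like(Xc)
    iso = IsotonicRegression(increasing="auto", out_of_bounds="clip")
    for g in range(Xc.shape[1]):               # monotonic curve per gene
        effect[order, g] = iso.fit_transform(proj[order], Xc[order, g])
    residual = Xc - effect                     # other effects stay untouched
    return axis, effect, residual
```

Iterating this on the residual would mimic the iterative dissection the abstract describes, though the real method's extremum principle and bimonotonic model are considerably more involved.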

    Social value measurement and nonprofit organizations: preliminary views of nonprofit and foundation managers

    This paper examines how managers of nonprofit organizations and foundations view the measurement of the social value of these organizations. In exploratory interviews, we found that the managers generally agree that objective measures are desired where and when possible, but recognise the difficulties in developing an assessment that enables comparisons across the nonprofit sector. These difficulties, as well as the implications for developing assessments of social value for nonprofit organizations, are discussed.

    Towards a Formalism-Based Toolkit for Automotive Applications

    The success of a number of projects has been shown to be significantly improved by the use of a formalism. However, an open issue remains: to what extent can a development process based on a single formal notation and method succeed? The majority of approaches demonstrate a low level of flexibility by attempting to use a single notation to express all of the different aspects encountered in software development, and they often leave a number of scalability issues open. We prefer a more eclectic approach. In our experience, the use of a formalism-based toolkit with adequate notations for each development phase is a viable solution. Following this principle, any specific notation is used only where and when it is really suitable, and not necessarily over the entire software lifecycle. The approach explored in this article is perhaps slowly emerging in practice; we hope to accelerate its adoption. However, the major challenge is still finding the best way to instantiate it for each specific application scenario. In this work, we describe a development process and method for automotive applications which consists of five phases. The process recognizes the need for adequate (and tailored) notations (Problem Frames, Requirements State Machine Language, and Event-B) for each development phase, as well as direct traceability between the documents produced during each phase. This allows for a stepwise verification/validation of the system under development. The ideas for the formal development method have evolved over two significant case studies carried out in the DEPLOY project.

    Système d'aide à l'accès lexical : trouver le mot qu'on a sur le bout de la langue (A lexical-access aid: finding the word on the tip of one's tongue)

    The study of the Tip of the Tongue (TOT) phenomenon provides valuable clues and insights concerning the organisation of the mental lexicon (meaning, number of syllables, relation with other words, etc.). This paper describes a tool based on psycholinguistic observations concerning the TOT phenomenon. We built it to enable a speaker/writer to find the word he is looking for, a word he may know but is unable to access in time. We try to simulate the TOT phenomenon by creating a situation where the system knows the target word yet is unable to access it. In order to find the target word, we make use of the paradigmatic and syntagmatic associations stored in the linguistic databases. Our experiment supports the following conclusion: a tool like SVETLAN, capable of automatically structuring a dictionary by domains, can be used successfully to help the speaker/writer find the word he is looking for, if it is combined with a database rich in paradigmatic links such as EuroWordNet.

    Exploring US consumers' understanding of carbon offsets

    This study found that general environmental knowledge and carbon-offset knowledge are inversely related, and that no significant differences in general environmental or carbon-offset behavior exist between levels of knowledge. The findings lend support to the view that consumers may misunderstand 'carbon offset' claims, and thus public policy intervention is required.

    Trait coordination in boreal mosses reveals a bryophyte economics spectrum

    The study of plant trait spectra and their association with trade-offs in resource use strategy has greatly advanced our understanding of vascular plant function, yet trait spectra remain poorly studied in bryophytes, particularly outside of the Sphagnum genus. Here, we measured 25 traits related to carbon, nutrient and water conservation in 60 moss canopies (each dominated by one of 15 moss species) across diverse boreal forest habitats and used bivariate correlations and multivariate analyses to assess trait coordination and trait spectra. We found substantial trait coordination along a main principal components axis driven by trade-offs in carbon, nutrient and water conservation strategies. Along this trait spectrum, traits varied from resource-acquisitive at one end (e.g. high maximum photosynthetic capacity, high tissue nitrogen content, low water-holding capacity) to resource-conservative at the other end, in line with resource economics theory. Traits related to carbon turnover (photosynthesis and respiration rates, litter decomposability) were positively related to nitrogen content and to desiccation rates, in line with global trait spectra in vascular plants. However, architectural traits of the moss shoots and of the moss canopy were generally unrelated to the main axis of trait variation and formed a secondary axis of trait variation, contrary to what is observed for vascular plants. Resource-conservative trait spectra dominated in moss canopies from open and wet habitats (i.e. mires), indicating that high irradiance and possibly high moisture fluctuation induce a resource-conservative trait strategy in mosses. Synthesis. Our work suggests that trait relationships that are well established for vascular plants can be extended to bryophytes as well. Bryophyte trait spectra can be powerful tools to improve our understanding of ecosystem processes in moss-dominated ecosystems, such as boreal or arctic environments, where bryophyte communities exert strong control on nutrient and carbon cycling.
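The multivariate step described above (coordinated economics traits loading on a main principal-components axis, with architecture traits forming a secondary axis) can be sketched as follows. The data are simulated stand-ins, not the study's measurements, and the trait names are illustrative only:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Simulate an acquisitive <-> conservative gradient driving three
# correlated "economics" traits (with a water trade-off), plus two
# architecture traits unrelated to the gradient. All values are
# hypothetical stand-ins for the 60-canopy trait matrix.
rng = np.random.default_rng(1)
gradient = rng.normal(size=60)
photosynthesis = gradient + 0.3 * rng.normal(size=60)
nitrogen       = gradient + 0.3 * rng.normal(size=60)
water_holding  = -gradient + 0.3 * rng.normal(size=60)  # trade-off
shoot_length   = rng.normal(size=60)                    # architecture
canopy_density = rng.normal(size=60)                    # architecture

traits = np.column_stack([photosynthesis, nitrogen, water_holding,
                          shoot_length, canopy_density])
scaled = StandardScaler().fit_transform(traits)  # z-score each trait
pca = PCA().fit(scaled)

# The first axis is driven by the coordinated economics traits, while
# the architecture traits load on later axes, mirroring the abstract.
print(pca.explained_variance_ratio_.round(2))
print(pca.components_[0].round(2))
```

In the simulation the economics traits dominate the first component's loadings, which is the pattern the abstract reports for the real trait matrix.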

    Eine Analyse von Food-Wertschöpfungsketten auf Basis internationaler Vergleichsdaten und Fallstudien (An analysis of food value chains based on international comparative data and case studies)

    In 2015, Swiss consumers paid on average 45 percent more for an identically weighted basket of food products than consumers in the four neighbouring countries. The largest price difference is found for meat products (+85%). For the consumer-goods basket as a whole, the price premium is 29 percent. Food and beverage products in Switzerland are thus disproportionately expensive compared with other consumer goods. The opposite holds in comparison with services, which are 72% more expensive in Switzerland than in the neighbouring countries. Against the background of these high food prices, agricultural policy faces the question of whether the price competitiveness of Swiss agricultural markets can be improved. The present study pursues this question through an analysis of the domestic food value chain. The research design was developed with the aim of evaluating the functioning and trading practices of Swiss agricultural markets with respect to market concentration and asymmetries, and of empirically examining possible relationships between market structure and market outcomes (consumer prices).

    Detection and Analysis of Gasoline Residues on Household Samples by Using Gas Chromatography

    Arson is an easy crime to commit. The required supplies are cheaply and readily available to the general public, and no special skills, such as hacking or marksmanship, are required. It is also effective, as fires can quickly cause huge monetary damage and loss of life. Arson can be problematic for forensic investigators, as flames can destroy evidence such as fingerprints and hair at the scene of the crime. In nearly all cases of arson a liquid accelerant is used. Liquid accelerants, such as gasoline, speed up the fire and increase the damage done in a given time. Our research focuses on the detection of gasoline residues on different household samples over various intervals of time. To this end, regular unleaded gasoline (87 octane) and four common household materials were chosen: carpet, plywood, newspaper, and cotton fabric. These household samples were cut into pieces of the same size (1.5 × 1.5 cm). 50 µL of gasoline was splashed on each, and the samples were dried at room temperature for various time intervals (10 min, 30 min, 1 h, 2 h, 4 h, 6 h, 12 h, 24 h, and 48 h) prior to chemical analysis. Detection of gasoline residues from these samples was conducted via gas chromatography with headspace sampling and a flame ionization detector. Our preliminary data show that a trace amount of gasoline was identified from a cotton fabric sample even after 48 hours at room temperature. Key words: arson, gas chromatography, chemistry, gasoline, forensic science.
    • 

    corecore