
    Neurophysiological Profile of Antismoking Campaigns

    Over the past few decades, antismoking public service announcements (PSAs) have been used by governments to promote healthy behaviours in citizens, for instance against drink-driving and against smoking. The effectiveness of such PSAs has been suggested especially for young people. To date, PSA efficacy is still mainly assessed through traditional methods (questionnaires and metrics) and can be evaluated only after the PSAs have been broadcast, leading to a waste of economic resources and time in the case of Ineffective PSAs. One possible countermeasure to such ineffective use of PSAs is the evaluation of the cerebral reaction to the PSA in particular segments of the population (e.g., older adults, young people, and heavy smokers). In addition, it is crucial to gather such cerebral activity in response to PSAs that have been assessed to be effective against smoking (Effective PSAs), comparing the results to the cerebral reactions to PSAs that have been certified to be not effective (Ineffective PSAs). The eventual differences between the cerebral responses to the two PSA groups will provide crucial information about the possible outcome of new PSAs before their broadcasting. This study focused on the adult population, investigating the cerebral reaction to the vision of different PSA images that have already been shown to be Effective and Ineffective for the promotion of antismoking behaviour. Results showed how variables such as gender and smoking habits can influence the perception of PSA images, and how different communication styles in antismoking campaigns could facilitate the comprehension of the PSA's message and thus enhance the related impact.

    Allocation in Practice

    How do we allocate scarce resources? How do we fairly allocate costs? These are two pressing challenges facing society today. I discuss two recent projects at NICTA concerning resource and cost allocation. In the first, we have been working with FoodBank Local, a social startup working in collaboration with food bank charities around the world to optimise the logistics of collecting and distributing donated food. Before we can distribute this food, we must decide how to allocate it to different charities and food kitchens. This gives rise to a fair division problem with several new dimensions, rarely considered in the literature. In the second, we have been looking at cost allocation within the distribution network of a large multinational company. This also has several new dimensions rarely considered in the literature. Comment: To appear in Proc. of the 37th edition of the German Conference on Artificial Intelligence (KI 2014), Springer LNCS.
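    As an illustrative aside (not the allocation mechanism used in the NICTA/FoodBank Local project), a minimal greedy fair-division heuristic for the donated-food setting might look like the following Python sketch; all item values and charity needs are invented.

    # Hypothetical sketch: allocate donated food items to charities by always
    # giving the next (most valuable) item to the charity that is currently
    # worst off relative to its stated need. Example data is made up.
    def allocate(items, needs):
        """items: name -> nutritional value; needs: charity -> relative need."""
        bundles = {c: [] for c in needs}
        received = {c: 0.0 for c in needs}
        for name, value in sorted(items.items(), key=lambda kv: -kv[1]):
            # Charity with the smallest value received per unit of need goes next.
            target = min(needs, key=lambda c: received[c] / needs[c])
            bundles[target].append(name)
            received[target] += value
        return bundles

    if __name__ == "__main__":
        items = {"rice_20kg": 40, "canned_soup": 12, "bread": 5, "milk": 8, "pasta_10kg": 22}
        needs = {"shelter_A": 3, "kitchen_B": 2, "pantry_C": 1}
        print(allocate(items, needs))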

    Evaluation of second-level inference in fMRI analysis

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of three phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subject) variability and models that do not take this variability into account. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate three commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size gives the most stable results, followed by familywise error rate correction. FDR correction gives the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference.
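    As a concrete illustration of the third phase (a generic Benjamini-Hochberg procedure, not the paper's exact pipeline), FDR correction over a set of voxelwise p-values can be sketched as follows; the p-values in the example are fabricated.

    # Benjamini-Hochberg FDR control: reject the k smallest p-values, where k is
    # the largest rank whose p-value falls below its rank-dependent threshold.
    import numpy as np

    def fdr_bh(p_values, q=0.05):
        """Boolean mask of hypotheses declared significant at FDR level q."""
        p = np.asarray(p_values, dtype=float)
        order = np.argsort(p)
        m = p.size
        thresholds = q * np.arange(1, m + 1) / m
        below = p[order] <= thresholds
        significant = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.where(below)[0])       # largest rank meeting its threshold
            significant[order[: k + 1]] = True   # reject everything up to that rank
        return significant

    print(fdr_bh([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))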

    Automating Fault Tolerance in High-Performance Computational Biological Jobs Using Multi-Agent Approaches

    Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This places not only a cost on the maintenance of the job, but also a cost on the time taken to reinstate the job, and a risk of losing the data and execution accomplished by the job before it failed. Approaches which can proactively detect computing core failures and take action to relocate the computing core's job onto reliable cores can make a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core level. Experiments are pursued in the context of genome searching, a popular computational biology application. Results: The key conclusion is that the approaches proposed are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. On the other hand, in the same experiment the multi-agent approaches add only 10% to the overall execution time. Comment: Computers in Biology and Medicine.
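    To make the checkpointing overhead concrete (a generic sketch, not the paper's multi-agent framework), a reduction job that periodically saves its partial result can be written as follows; on failure it resumes from the last checkpoint, and the repeated checkpoint I/O is exactly the kind of extra cost reported above. The file name and data are placeholders.

    import os
    import pickle

    CHECKPOINT = "reduction.ckpt"   # hypothetical checkpoint file

    def reduce_with_checkpoints(chunks, every=10):
        start, total = 0, 0
        if os.path.exists(CHECKPOINT):             # resume after a core failure
            with open(CHECKPOINT, "rb") as f:
                start, total = pickle.load(f)
        for i in range(start, len(chunks)):
            total += sum(chunks[i])                # the actual reduction work
            if (i + 1) % every == 0:               # periodic checkpoint = overhead
                with open(CHECKPOINT, "wb") as f:
                    pickle.dump((i + 1, total), f)
        if os.path.exists(CHECKPOINT):
            os.remove(CHECKPOINT)
        return total

    if __name__ == "__main__":
        data = [list(range(100)) for _ in range(50)]
        print(reduce_with_checkpoints(data))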

    Parameterized Algorithmics for Computational Social Choice: Nine Research Challenges

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in this field include the vulnerability of voting procedures to attacks, or preference aggregation in multi-agent systems. Parameterized Algorithmics is a subfield of Theoretical Computer Science that seeks to exploit meaningful problem-specific parameters in order to identify tractable special cases of problems that are computationally hard in general. In this paper, we propose nine of our favorite research challenges concerning the parameterized complexity of problems appearing in this context.
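    As a toy example of the parameterized viewpoint (not taken from the paper), deciding whether one extra voter can make a favourite candidate win a Borda election can be brute-forced over all m! possible ballots, so the running time is exponential only in the number of candidates m (the parameter), not in the number of voters. The election data below is invented.

    from itertools import permutations

    def borda_scores(profile, candidates):
        # Positional scoring: first place gets m-1 points, last place gets 0.
        scores = {c: 0 for c in candidates}
        for ballot in profile:
            for pos, c in enumerate(ballot):
                scores[c] += len(candidates) - 1 - pos
        return scores

    def can_manipulate(profile, candidates, favourite):
        # Try every possible ballot of the single manipulator: m! cases.
        for ballot in permutations(candidates):
            scores = borda_scores(profile + [list(ballot)], candidates)
            if all(scores[favourite] >= s for c, s in scores.items() if c != favourite):
                return True, list(ballot)          # ties count as a win here
        return False, None

    if __name__ == "__main__":
        candidates = ["a", "b", "c"]
        profile = [["a", "b", "c"], ["b", "c", "a"], ["b", "a", "c"]]
        print(can_manipulate(profile, candidates, "a"))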

    Gender and Age Related Effects While Watching TV Advertisements: An EEG Study

    The aim of the present paper is to show how the variation of EEG frontal cortical asymmetry is related to the general appreciation perceived during the observation of TV advertisements, in particular considering the influence of gender and age. Specifically, we investigated the influence of gender on the perception of a car advertisement (Experiment 1) and the influence of age on a chewing gum commercial (Experiment 2). Experiment 1 results showed statistically significantly higher approach values for the male group throughout the commercial. Results from Experiment 2 showed significantly lower values for older adults during the parts of the spot containing scenes they did not particularly enjoy. In both studies, there was no statistically significant difference between the experimental populations in the scene showing the product offering, suggesting the absence in our study of a bias towards the specific product in the evaluated populations. This evidence highlights the importance of creativity in advertising in order to attract the target population.
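    The approach index referred to above is commonly computed as the difference of log alpha power between right and left frontal channels. The sketch below is illustrative only; the channel names, sampling rate and signals are placeholders, not the study's data.

    import numpy as np
    from scipy.signal import welch

    def alpha_asymmetry(left_frontal, right_frontal, fs=256, band=(8.0, 12.0)):
        """ln(right alpha power) - ln(left alpha power), e.g. from F4 and F3."""
        def band_power(signal):
            freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return np.trapz(psd[mask], freqs[mask])
        return np.log(band_power(right_frontal)) - np.log(band_power(left_frontal))

    # Synthetic noise standing in for 30 s of F3/F4 recordings.
    rng = np.random.default_rng(0)
    f3, f4 = rng.standard_normal(256 * 30), rng.standard_normal(256 * 30)
    print(alpha_asymmetry(f3, f4))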

    Neurophysiological Responses to Different Product Experiences

    It is well known that the evaluation of a product from the shelf involves the simultaneous cerebral and emotional evaluation of the different qualities of the product, such as its colour, the images shown on it, and the texture of its packaging (hereafter all included in the term "product experience"). However, the measurement of cerebral and emotional reactions during the interaction with food products has not been investigated in depth in the specialized literature. The aim of this paper was to investigate such reactions through EEG and autonomic activity, as elicited by the cross-sensory interaction (sight and touch) with several different products. In addition, we investigated whether (i) the brand (Major Brand or Private Label), (ii) the familiarity (Foreign or Local Brand), and (iii) the hedonic value of products (Comfort Food or Daily Food) influenced the reaction of a group of volunteers during their interaction with the products. Results showed a statistically significantly higher tendency of cerebral approach (as indexed by EEG frontal alpha asymmetry) in response to comfort food during the visual exploration and the visual and tactile exploration phases. Furthermore, for the same index, a higher tendency of approach was found toward foreign food products in comparison with local food products during the visual and tactile exploration phase. Finally, the same comparison performed on a different index (EEG frontal theta) showed higher mental effort during the interaction with foreign products during the visual exploration and the visual and tactile exploration phases. Results from the present study could deepen the knowledge of the neurophysiological response to food products of different nature in terms of hedonic value and familiarity; moreover, they could have implications for food marketers and could lead to further studies on how people make food choices through interaction with the commercial packaging.
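    For the frontal theta effect mentioned above, one simple way to quantify theta-band activity on a frontal midline channel (a generic sketch; the channel, sampling rate and data are placeholders, not the study's recordings) is to band-pass the signal at 4-8 Hz and take the mean Hilbert envelope as a rough mental-effort index.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def frontal_theta_index(fz, fs=256, band=(4.0, 8.0)):
        # 4th-order Butterworth band-pass in the theta range, applied zero-phase.
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        theta = filtfilt(b, a, fz)
        return float(np.mean(np.abs(hilbert(theta))))   # mean amplitude envelope

    rng = np.random.default_rng(1)
    print(frontal_theta_index(rng.standard_normal(256 * 60)))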

    A LightGBM-Based EEG Analysis Method for Driver Mental States Classification

    Fatigue driving can easily lead to road traffic accidents and bring great harm to individuals and families. Recently, electroencephalography- (EEG-) based physiological and brain activities for fatigue detection have been increasingly investigated. However, how to find an effective method or model to detect the mental states of drivers in a timely and efficient manner still remains a challenge. In this paper, we combine the common spatial pattern (CSP) with a proposed lightweight classifier, LightFD, which is based on a gradient boosting framework, for EEG mental state identification. The comparison with traditional classifiers, such as support vector machine (SVM), convolutional neural network (CNN), gated recurrent unit (GRU), and large margin nearest neighbor (LMNN), shows that the proposed model could achieve better classification performance, as well as better decision efficiency. Furthermore, we also test and validate that LightFD has better transfer learning performance in EEG classification of driver mental states. In summary, our proposed LightFD classifier has better performance in real-time EEG mental state prediction, and it is expected to have broad application prospects in practical brain-computer interaction (BCI).
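    The general shape of such a pipeline is sketched below under stated assumptions: the paper's LightFD implementation is not given here, so lightgbm's stock LGBMClassifier stands in for it, and synthetic epochs replace real driver EEG. CSP log-variance features are extracted and fed to a gradient-boosted classifier.

    import numpy as np
    from mne.decoding import CSP            # common spatial pattern filters
    from lightgbm import LGBMClassifier     # gradient boosting classifier

    rng = np.random.default_rng(42)
    X = rng.standard_normal((80, 8, 256))   # 80 epochs, 8 channels, 256 samples (fake)
    y = np.tile([0, 1], 40)                 # alert vs. fatigued labels (fake)

    csp = CSP(n_components=4, log=True)     # log-variance of 4 spatial filters
    train_feats = csp.fit_transform(X[:60], y[:60])   # fit filters on training epochs only
    test_feats = csp.transform(X[60:])

    clf = LGBMClassifier(n_estimators=100)
    clf.fit(train_feats, y[:60])
    print("held-out accuracy:", clf.score(test_feats, y[60:]))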