
    Allocation in Practice

How do we allocate scarce resources? How do we fairly allocate costs? These are two pressing challenges facing society today. I discuss two recent projects at NICTA concerning resource and cost allocation. In the first, we have been working with FoodBank Local, a social startup working in collaboration with food bank charities around the world to optimise the logistics of collecting and distributing donated food. Before we can distribute this food, we must decide how to allocate it to different charities and food kitchens. This gives rise to a fair division problem with several new dimensions, rarely considered in the literature. In the second, we have been looking at cost allocation within the distribution network of a large multinational company. This also has several new dimensions rarely considered in the literature. Comment: To appear in Proc. of the 37th German Conference on Artificial Intelligence (KI 2014), Springer LNC
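
As a toy illustration of the kind of fair division problem that arises here, the sketch below allocates donated food items to charities with a simple round-robin picking rule. The charity names, item values, and the choice of round-robin are assumptions made for the example, not the allocation mechanism used in the project.

```python
# Minimal sketch of round-robin fair division of donated food items
# among charities. Names and values are hypothetical; the real problem
# also involves perishability, logistics, and items arriving over time.

def round_robin_allocation(charities, items, value):
    """Charities pick their favourite remaining item in turns; for
    additive valuations this yields an allocation that is envy-free
    up to one item (EF1)."""
    remaining = set(items)
    allocation = {c: [] for c in charities}
    turn = 0
    while remaining:
        charity = charities[turn % len(charities)]
        # The current charity takes its highest-valued remaining item.
        best = max(remaining, key=lambda item: value[charity][item])
        allocation[charity].append(best)
        remaining.remove(best)
        turn += 1
    return allocation

# Hypothetical example: two food kitchens, three food items.
value = {
    "kitchen_A": {"bread": 3, "milk": 5, "rice": 1},
    "kitchen_B": {"bread": 2, "milk": 4, "rice": 6},
}
print(round_robin_allocation(["kitchen_A", "kitchen_B"],
                             ["bread", "milk", "rice"], value))
```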

    Evaluation of second-level inference in fMRI analysis

We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and (2) data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass-univariate approach typically consists of three phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take first-level (within-subject) variability into account and models that do not. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate three commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by the familywise error rate correction. FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference.
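
To make the phases concrete, here is a minimal sketch of a mass-univariate second-level analysis on simulated contrast images: a per-voxel one-sample t-test (a GLM that ignores first-level variability) followed by Benjamini-Hochberg FDR correction. The array shapes, effect size, and q level are illustrative assumptions, not the study's settings.

```python
# Minimal sketch: per-voxel one-sample t-test over subjects, then
# Benjamini-Hochberg FDR correction for the multiple testing problem.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_voxels = 20, 1000
contrasts = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))
contrasts[:, :50] += 0.8  # simulated true activation in 50 voxels

# Phases 1-2: the second-level GLM reduces here to a one-sample t-test
# per voxel, under parametric assumptions.
t_vals, p_vals = stats.ttest_1samp(contrasts, popmean=0.0, axis=0)

# Phase 3: Benjamini-Hochberg FDR at q = 0.05.
def fdr_bh(p, q=0.05):
    """Step-up BH: largest k with p_(k) <= (k/m) * q is significant."""
    order = np.argsort(p)
    ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
    passed = ranked <= q
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    significant = np.zeros(len(p), dtype=bool)
    significant[order[:k]] = True
    return significant

sig = fdr_bh(p_vals)
print(f"{sig.sum()} voxels survive FDR correction")
```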

    Parameterized Algorithmics for Computational Social Choice: Nine Research Challenges

Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other. Typical computational problems studied in this field include the vulnerability of voting procedures to attacks and preference aggregation in multi-agent systems. Parameterized Algorithmics is a subfield of Theoretical Computer Science that seeks to exploit meaningful problem-specific parameters in order to identify tractable special cases of problems that are computationally hard in general. In this paper, we propose nine of our favorite research challenges concerning the parameterized complexity of problems appearing in this context.
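
As a standalone illustration of the parameterized-algorithmics idea (using the textbook Vertex Cover problem, not one of the paper's social choice problems), the classic bounded-search-tree algorithm below runs in O(2^k · m) time, so the problem becomes tractable whenever the parameter k, the size of the sought cover, is small even if the graph is large.

```python
# Illustrative sketch of fixed-parameter tractability: O(2^k) branching
# for Vertex Cover, parameterized by the cover size k. The example
# graph is hypothetical.

def vertex_cover(edges, k):
    """Return True iff the graph has a vertex cover of size <= k.
    Branch on an uncovered edge: one of its endpoints must be chosen."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]
    for chosen in (u, v):
        rest = [(a, b) for (a, b) in edges if chosen not in (a, b)]
        if vertex_cover(rest, k - 1):
            return True
    return False

print(vertex_cover([(1, 2), (2, 3), (3, 4)], k=2))  # True: {2, 3}
```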

    Gender and Age Related Effects While Watching TV Advertisements: An EEG Study

The aim of the present paper is to show how variation in EEG frontal cortical asymmetry relates to the general appreciation perceived during the observation of TV advertisements, in particular considering the influence of gender and age. We investigated the influence of gender on the perception of a car advertisement (Experiment 1) and the influence of age on a chewing gum commercial (Experiment 2). Experiment 1 showed statistically significantly higher approach values for the men throughout the commercial. Experiment 2 showed significantly lower values for older adults during the spot, which contained scenes they did not particularly enjoy. In both studies, there was no statistically significant difference between the experimental populations in the scene presenting the product offer, suggesting the absence in our study of a bias towards the specific product in the evaluated populations. These findings underline the importance of creativity in advertising for attracting the target population.
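
A minimal sketch of the frontal-asymmetry index such studies typically compute: the difference in log alpha-band power between a right and a left frontal electrode (F4 and F3 here), with higher values conventionally read as stronger approach motivation. The signals, sampling rate, and band limits below are simulated stand-ins, not the paper's exact pipeline.

```python
# Minimal sketch of a frontal alpha-asymmetry "approach" index:
# ln(alpha power at F4) - ln(alpha power at F3).
import numpy as np
from scipy.signal import welch

fs = 256  # Hz, assumed sampling rate
rng = np.random.default_rng(1)
f3 = rng.normal(size=fs * 10)  # 10 s of simulated left-frontal EEG
f4 = rng.normal(size=fs * 10)  # 10 s of simulated right-frontal EEG

def alpha_power(signal, fs, band=(8.0, 13.0)):
    """Integrate the Welch PSD over the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

# Positive values are conventionally interpreted as stronger approach
# motivation (relatively less alpha, i.e. more activity, on the left).
asymmetry = np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))
print(f"frontal alpha asymmetry index: {asymmetry:.3f}")
```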

    Automating Fault Tolerance in High-Performance Computational Biological Jobs Using Multi-Agent Approaches

Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This places not only a cost on the maintenance of the job, but also a cost on the time taken to reinstate the job, and risks losing the data and execution accomplished by the job before it failed. Approaches that can proactively detect computing core failures and take action to relocate a core's job onto reliable cores can be a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single-core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core levels. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the proposed approaches are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. In the same experiment, the multi-agent approaches add only 10% to the overall execution time. Comment: Computers in Biology and Medicine
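
A minimal sketch of the job-level idea under discussion: periodically checkpoint the partial result of a reduction, so that a (simulated) core failure rolls the job back to the last checkpoint instead of restarting it from scratch. The failure probability and checkpoint cadence are invented for the example; the paper's agents additionally migrate work away from unreliable cores proactively.

```python
# Minimal sketch of checkpoint/restart for a reduction job with
# simulated core failures.
import random

def reduce_with_checkpoints(data, chunk_size=4, fail_prob=0.1):
    checkpoint = {"index": 0, "partial_sum": 0}
    while checkpoint["index"] < len(data):
        i, partial = checkpoint["index"], checkpoint["partial_sum"]
        try:
            for j in range(i, min(i + chunk_size, len(data))):
                if random.random() < fail_prob:
                    raise RuntimeError("simulated core failure")
                partial += data[j]
            # Chunk finished cleanly: persist progress as a checkpoint.
            checkpoint = {"index": min(i + chunk_size, len(data)),
                          "partial_sum": partial}
        except RuntimeError:
            # Roll back to the last checkpoint and retry the chunk,
            # standing in for relocating the job to a reliable core.
            continue
    return checkpoint["partial_sum"]

random.seed(42)
print(reduce_with_checkpoints(list(range(100))))  # 4950 despite failures
```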

    Neurophysiological Profile of Antismoking Campaigns

Over the past few decades, antismoking public service announcements (PSAs) have been used by governments to promote healthy behaviours in citizens, for instance against drink driving and against smoking. The effectiveness of such PSAs has been suggested especially for young people. To date, PSA efficacy is still mainly assessed through traditional methods (questionnaires and metrics) and can be evaluated only after the PSAs have been broadcast, leading to a waste of economic resources and time in the case of Ineffective PSAs. One possible countermeasure to such ineffective use of PSAs is to evaluate the cerebral reaction to a PSA in particular segments of the population (e.g., older people, young people, and heavy smokers). In addition, it is crucial to gather such cerebral activity in response to PSAs that have been assessed to be effective against smoking (Effective PSAs), comparing the results to the cerebral reactions to PSAs that have been shown to be ineffective (Ineffective PSAs). Any differences between the cerebral responses to the two PSA groups will provide crucial information about the possible outcome of a new PSA before its broadcasting. This study focused on an adult population, investigating the cerebral reaction to viewing different PSA images that had already been shown to be Effective or Ineffective for the promotion of antismoking behaviour. Results showed how variables such as gender and smoking habits can influence the perception of PSA images, and how different communication styles in antismoking campaigns could facilitate the comprehension of a PSA's message and thus enhance its impact.
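
A minimal sketch of the kind of contrast such a study runs: comparing a per-subject EEG index between Effective and Ineffective PSA images with a paired t-test. All numbers below are simulated; the actual study further segments the population by gender and smoking habits.

```python
# Minimal sketch: paired comparison of a per-subject EEG index for
# Effective vs. Ineffective PSA images. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subjects = 24
effective = rng.normal(0.4, 1.0, n_subjects)    # index per subject, Effective PSAs
ineffective = rng.normal(0.0, 1.0, n_subjects)  # same subjects, Ineffective PSAs

t, p = stats.ttest_rel(effective, ineffective)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```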

    EEG Resting-State Brain Topological Reorganization as a Function of Age

Resting-state connectivity has been increasingly studied to investigate the effects of aging on the brain. A reduced organization in the communication between brain areas was demonstrated by combining a variety of imaging technologies (fMRI, EEG, and MEG) with graph theory. In this paper, we propose a methodology to gain new insights into resting-state connectivity and its variation with age by combining advanced techniques of effective connectivity estimation, a graph-theoretical approach, and classification by support vector machine (SVM). We analyzed high-density EEG signals recorded at rest from 71 healthy subjects (age: 20–63 years). Weighted and directed connectivity was computed by means of Partial Directed Coherence based on a General Linear Kalman filter approach. To retain the information collected by the estimator, weighted and directed graph indices were extracted from the resulting networks. A relation between brain network properties and the age of the subject was found, indicating a tendency of the network toward a more random organization with increasing age. This result is also confirmed when dividing the whole population into two subgroups according to age (young and middle-aged adults): significant differences exist in terms of network organization measures. Classification of the subjects by means of such indices returns an accuracy greater than 80%.
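
A minimal sketch of the final two steps: extract weighted, directed graph indices (here just node in- and out-strength) from per-subject connectivity matrices and classify young versus middle-aged adults with an SVM. The connectivity matrices below are random surrogates, not Partial Directed Coherence estimates from real EEG.

```python
# Minimal sketch: directed graph indices from connectivity matrices,
# then SVM classification of two age groups. Data are simulated.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_subjects, n_nodes = 40, 16
# Simulated directed connectivity; the second group gets a weak offset.
conn = rng.random((n_subjects, n_nodes, n_nodes))
labels = np.array([0] * 20 + [1] * 20)  # 0 = young, 1 = middle-aged
conn[labels == 1] += 0.05

def graph_features(matrix):
    """In- and out-strength per node of a weighted directed graph."""
    matrix = matrix.copy()
    np.fill_diagonal(matrix, 0.0)
    in_strength = matrix.sum(axis=0)   # total incoming weight per node
    out_strength = matrix.sum(axis=1)  # total outgoing weight per node
    return np.concatenate([in_strength, out_strength])

features = np.array([graph_features(c) for c in conn])
scores = cross_val_score(SVC(kernel="linear"), features, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```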