    Studying Amphiphilic Self-assembly with Soft Coarse-Grained Models

    Trusting to Learn: Trust and Privacy Issues in Serious Games

    Organizations are increasingly investing in technology-enhanced learning systems to improve their employees’ skills. Serious games are one example; the competitive and fun nature of games is supposed to motivate employee participation. But any system that records employee data raises issues of privacy and trust. In this paper, we present a study of the privacy and trust implications of serious games in an organizational context. We report findings from 32 interviews with potential end-users of a serious games platform called TARGET. A qualitative analysis of the interviews reveals that participants anticipate privacy risks for the data generated during game play, and that their decision to trust their fellow employees and managers depends on the presence of specific trust signals. Failure to minimize privacy risks and maximize trust will affect the acceptance of the system and the learning experience – thus undermining the primary purpose for which it was deployed. Game designers are therefore advised to provide mechanisms for selective disclosure of data by players; organizations should not use gaming data for appraisal or selection purposes, and should clearly communicate this to employees.