
    Assessing the Effectiveness of Automated Emotion Recognition in Adults and Children for Clinical Investigation

    Recent success stories in automated object and face recognition, partly fuelled by deep learning artificial neural network (ANN) architectures, have led to the advancement of biometric research platforms and, to some extent, the resurrection of Artificial Intelligence (AI). In line with this general trend, interdisciplinary approaches have been taken to automate the recognition of emotions in adults and children for the benefit of various applications, such as the identification of children's emotions prior to a clinical investigation. Within this context, it turns out that automating emotion recognition is far from straightforward, with several challenges arising for both science (e.g., methodology underpinned by psychology) and technology (e.g., the iMotions biometric research platform). In this paper, we present a methodology, an experiment and interesting findings, which raise the following research questions for the recognition of emotions and attention in humans: a) the adequacy of well-established techniques such as the International Affective Picture System (IAPS), b) the adequacy of state-of-the-art biometric research platforms, c) the extent to which emotional responses may differ between children and adults. Our findings, and first attempts to answer some of these research questions, are based on a mixed sample of adults and children who took part in the experiment, resulting in a statistical analysis of numerous variables. These relate to participants' responses, captured both automatically and interactively, to a sample of IAPS pictures.

    Emotion Detection Using Noninvasive Low Cost Sensors

    Emotion recognition from biometrics is relevant to a wide range of application domains, including healthcare. Existing approaches usually adopt multi-electrode sensors that can be expensive or uncomfortable to use in real-life situations. In this study, we investigate whether we can reliably recognize high vs. low emotional valence and arousal by relying on noninvasive, low-cost EEG, EMG, and GSR sensors. We report the results of an empirical study involving 19 subjects. We achieve state-of-the-art classification performance for both valence and arousal even in a cross-subject classification setting, which eliminates the need for individual training and tuning of classification models. Comment: To appear in Proceedings of ACII 2017, the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, Oct. 23-26, 2017.
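    The cross-subject setting described in this abstract can be sketched as a leave-one-subject-out evaluation. The following is a minimal illustration on synthetic data; the feature count, nearest-centroid classifier, and all numbers are assumptions for illustration, not the study's actual pipeline:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for per-trial sensor features (e.g. GSR level,
    # EMG power, EEG band power): 19 subjects, 20 trials each, 3 features,
    # with a binary label (high vs. low valence).
    n_subjects, n_trials, n_features = 19, 20, 3
    X = rng.normal(size=(n_subjects, n_trials, n_features))
    y = rng.integers(0, 2, size=(n_subjects, n_trials))
    X += y[..., None] * 1.5  # shift class-1 trials so the classes separate

    def nearest_centroid_accuracy(X, y, test_subject):
        """Leave-one-subject-out: train on all other subjects, test on one.

        This mirrors the cross-subject evaluation, which removes the need
        for per-subject training and tuning.
        """
        train = [s for s in range(len(X)) if s != test_subject]
        Xtr = X[train].reshape(-1, X.shape[-1])
        ytr = y[train].reshape(-1)
        centroids = np.stack([Xtr[ytr == c].mean(axis=0) for c in (0, 1)])
        dists = np.linalg.norm(X[test_subject][:, None] - centroids, axis=-1)
        pred = dists.argmin(axis=1)
        return (pred == y[test_subject]).mean()

    accs = [nearest_centroid_accuracy(X, y, s) for s in range(n_subjects)]
    print(f"mean cross-subject accuracy: {np.mean(accs):.2f}")
    ```

    A real pipeline would replace the synthetic features with ones extracted from the EEG, EMG, and GSR recordings, and typically a stronger classifier than nearest-centroid.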

    Plug-in to fear: game biosensors and negative physiological responses to music

    The games industry is beginning to embark on an ambitious journey into the world of biometric gaming in search of more exciting and immersive gaming experiences. Whether or not biometric game technologies hold the key to unlocking the “ultimate gaming experience” hinges not only on technological advancements but also on the game industry’s understanding of physiological responses to stimuli of different kinds, and its ability to interpret physiological data in terms of indicative meaning. With reference to horror-genre games and music in particular, this article reviews some of the scientific literature on specific physiological responses induced by “fearful” or “unpleasant” musical stimuli, and considers some of the challenges facing the games industry in its quest for the ultimate “plugged-in” experience.

    Investigating Biometric Response for Information Retrieval Applications

    Current information retrieval systems make no measurement of the user’s response to the searching process or the information itself. Existing psychological studies show that subjects exhibit measurable physiological responses when carrying out certain tasks, e.g. when viewing images, which generally result in heightened emotional states. We find that users exhibit measurable biometric behaviour, in the form of galvanic skin response, when watching movies and engaging in interactive tasks. We examine how this data might be exploited in the indexing of data for search and within the search process itself.

    Recognizing Developers' Emotions while Programming

    Developers experience a wide range of emotions during programming tasks, which may have an impact on job performance. In this paper, we present an empirical study aimed at (i) investigating the link between emotion and progress, (ii) understanding the triggers for developers' emotions and the strategies to deal with negative ones, and (iii) identifying the minimal set of non-invasive biometric sensors for emotion recognition during programming tasks. Results confirm previous findings about the relation between emotions and perceived productivity. Furthermore, we show that developers' emotions can be reliably recognized using only a wristband capturing electrodermal activity and heart-related metrics. Comment: Accepted for publication at ICSE 2020 Technical Track.

    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in a player’s psychology are reflected in their behaviour and physiology; hence, recognition of such variation is a core element in affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties met by traditional trained classifiers. In addition, inherent game-related challenges in terms of data collection and performance arise while attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that offer limited freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game and, furthermore, control it without the use of input devices. However, the affective game industry is still in its infancy and definitely needs to catch up with the current life-like level of adaptation provided by graphics and animation.

    BED: A new dataset for EEG-based biometrics

    Various recent research works have focused on the use of electroencephalography (EEG) signals in the field of biometrics. However, advances in this area have somehow been limited by the absence of a common testbed that would make it possible to easily compare the performance of different proposals. In this work, we present a dataset that has been specifically designed to allow researchers to attempt new biometric approaches that use EEG signals captured with relatively inexpensive consumer-grade devices. The proposed dataset has been made publicly accessible and can be downloaded from https://doi.org/10.5281/zenodo.4309471. It contains EEG recordings and responses from 21 individuals, captured under 12 different stimuli across three sessions. The selected stimuli included traditional approaches, as well as stimuli that aim to elicit concrete affective states, in order to facilitate future studies on the influence of emotions on EEG signals in the context of biometrics. The captured data were checked for consistency, and a performance study was also carried out to establish a baseline for the tasks of subject verification and identification.

    Beyond Biometrics

    Throughout the last 40 years, the essence of automated identification of users has remained the same. In this article, a new class of biometrics is proposed that is founded on processing biosignals, as opposed to images. After a brief introduction on biometrics, biosignals are discussed, including their advantages, disadvantages, and guidelines for obtaining them. This new class of biometrics increases biometrics’ robustness and enables cross-validation. Next, biosignals’ use is illustrated by two biosignal-based biometrics: voice identification and handwriting recognition. Additionally, the concept of a digital human model is introduced. Finally, we touch upon some issues that will arise when biosignal-based biometrics are brought into practice.

    EEG-based biometrics: Effects of template ageing

    This chapter discusses the effects of template ageing in EEG-based biometrics. The chapter also serves as an introduction to general biometrics and its main tasks: identification and verification. To this end, we investigate different characterisations of EEG signals and examine the difference in subject-identification performance between single-session and cross-session identification experiments. EEG signals are characterised with common state-of-the-art features, i.e. Mel Frequency Cepstral Coefficients (MFCC), autoregression coefficients, and Power Spectral Density-derived features. The samples are then classified using various classifiers, including Support Vector Machines and k-Nearest Neighbours with different parametrisations. Results show that performance tends to be worse for cross-session identification than for single-session identification. This finding suggests that the temporal permanence of EEG signals is limited, and thus more sophisticated methods are needed to characterise EEG signals for the task of subject identification.
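    The single-session vs. cross-session comparison described here can be illustrated with a small sketch: PSD-derived band-power features are extracted from synthetic signals and matched against enrolment templates with a nearest-template (1-NN) rule. The signal model, the drift term standing in for template ageing, and all parameters are assumptions for illustration, not the chapter's actual method (which uses MFCC, autoregression, and PSD features with SVM and k-NN classifiers):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 128  # sampling rate in Hz (assumed)

    def band_powers(signal, fs, bands=((4, 8), (8, 13), (13, 30))):
        """PSD-derived features: mean power in theta, alpha, and beta bands."""
        freqs = np.fft.rfftfreq(len(signal), 1 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2
        return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                         for lo, hi in bands])

    def make_session(alpha_amp, drift=0.0, n_epochs=10):
        """Synthetic 'EEG' epochs: a subject-specific 10 Hz alpha rhythm
        plus noise; a nonzero drift models template ageing between sessions."""
        t = np.arange(2 * fs) / fs
        epochs = [(alpha_amp + drift) * np.sin(2 * np.pi * 10 * t)
                  + rng.normal(scale=1.0, size=t.size)
                  for _ in range(n_epochs)]
        return np.array([band_powers(e, fs) for e in epochs])

    subjects = np.linspace(1.0, 4.0, 5)  # 5 subjects, distinct alpha amplitudes
    sess1 = [make_session(a) for a in subjects]    # enrolment session
    sess1b = [make_session(a) for a in subjects]   # probes, same session
    sess2 = [make_session(a, drift=rng.normal(scale=0.5))
             for a in subjects]                    # probes, later session

    templates = np.array([s.mean(axis=0) for s in sess1])

    def identify(probe_sessions):
        """1-NN identification of each probe epoch against the templates."""
        correct = total = 0
        for true_id, epochs in enumerate(probe_sessions):
            for f in epochs:
                pred = np.linalg.norm(templates - f, axis=1).argmin()
                correct += (pred == true_id)
                total += 1
        return correct / total

    print(f"single-session accuracy: {identify(sess1b):.2f}")
    print(f"cross-session  accuracy: {identify(sess2):.2f}")
    ```

    Under this toy model, the drift between sessions tends to pull probe features away from the enrolment templates, reproducing the qualitative single-session vs. cross-session gap the chapter reports.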