68 research outputs found

    Ubiquitous Emotion Analytics and How We Feel Today

    Emotions are complicated. Humans feel deeply, and it can be hard to bring clarity to those depths, to communicate about feelings, or to understand others’ emotional states. Indeed, this emotional confusion is one of the biggest challenges of deciphering our humanity. However, a kind of hope might be on the horizon, in the form of emotion analytics: computerized tools for recognizing and responding to emotion. This analysis explores how emotion analytics may reflect the current status of humans’ regard for emotion. Emotion need no longer be a human sense of vague, indefinable feelings; instead, emotion is in the process of becoming a legible, standardized commodity that can be sold, managed, and altered to suit the needs of those in power. Emotional autonomy and authority can be surrendered to those technologies in exchange for perceived self-determination. Emotion analytics promises a new orderliness to the messiness of human emotions, suggesting that our current state of emotional uncertainty is inadequate and intolerable.

    Both Facts and Feelings: Emotion and News Literacy

    News literacy education has long focused on the significance of facts, sourcing, and verifiability. While these are critical aspects of news, rapidly developing emotion analytics technologies, intended to respond to and even alter digital news audiences’ emotions, also demand that we pay greater attention to the role of emotion in news consumption. This essay explores the role of emotion in the “fake news” phenomenon and the implementation of emotion analytics tools in news distribution. I examine the function of emotion in news consumption and the status of emotion within existing news literacy training programs. Finally, I offer suggestions for addressing emotional responses to news with students, drawing on both mindfulness techniques and psychological research on thinking processes.

    3D Face Reconstruction and Emotion Analytics with Part-Based Morphable Models

    3D face reconstruction and facial expression analytics using 3D facial data are emerging research topics in computer graphics and computer vision. In this proposal, we first review the background knowledge for emotion analytics using 3D morphable face models, including geometry feature-based methods, statistical model-based methods and more advanced deep learning-based methods. Then, we introduce a novel 3D face modeling and reconstruction solution that robustly and accurately acquires 3D face models from a couple of images captured by a single smartphone camera. Two selfie photos of a subject, taken from the front and side, are used to guide our Non-Negative Matrix Factorization (NMF) induced part-based face model to iteratively reconstruct an initial 3D face of the subject. Then, an iterative detail updating method is applied to the initially generated 3D face to reconstruct facial details by optimizing lighting parameters and local depths. Our iterative 3D face reconstruction method permits fully automatic registration of a part-based face representation to the acquired face data and uses the detailed 2D/3D features to build a high-quality 3D face model. The NMF part-based face representation, learned from a 3D face database, facilitates effective global fitting and adaptive local detail fitting in alternation. Our system is flexible, allowing users to conduct the capture in any uncontrolled environment. We demonstrate the capability of our method by allowing users to capture and reconstruct their 3D faces by themselves. Based on the reconstructed 3D face model, we can analyze the facial expression and the related emotion in 3D space. We present a novel approach to analyzing facial expressions from images and a quantitative information visualization scheme for exploring this type of visual data.
From the result reconstructed with the NMF part-based morphable 3D face model, basis parameters and a displacement map are extracted as features for facial emotion analysis and visualization. Based upon these features, two Support Vector Regressions (SVRs) are trained to determine the fuzzy Valence-Arousal (VA) values that quantify the emotions. The continuously changing emotion status can be intuitively analyzed by visualizing the VA values in VA-space. Our emotion analysis and visualization system, based on the 3D NMF morphable face model, detects expressions robustly across various head poses, face sizes and lighting conditions, and fully automatically computes the VA values from images or video sequences with various facial expressions. To evaluate our novel method, we test our system on publicly available databases and assess the emotion analysis and visualization results. We also apply our method to quantifying emotion changes during motivational interviews. These experiments and applications demonstrate the effectiveness and accuracy of our method. To improve the expression recognition accuracy, we present a facial expression recognition approach with a 3D Mesh Convolutional Neural Network (3DMCNN) and a visual analytics guided 3DMCNN design and optimization scheme. The geometric properties of the surface are computed using the 3D face model of a subject with facial expressions. Instead of using a regular Convolutional Neural Network (CNN) to learn intensities of the facial images, we convolve the geometric properties on the surface of the 3D model using the 3DMCNN. We design a geodesic distance-based convolution method to overcome the difficulties arising from the irregular sampling of the face surface mesh. We further present an interactive visual analytics scheme for designing and modifying the networks, analyzing the learned features and clustering similar nodes in the 3DMCNN.
By removing low-activity nodes in the network, the performance of the network is greatly improved. We compare our method with the regular CNN-based method by interactively visualizing each layer of the networks, and we analyze the effectiveness of our method by studying representative cases. Testing on public datasets, our method achieves a higher recognition accuracy than traditional image-based CNNs and other 3D CNNs. The presented framework, including the 3DMCNN and interactive visual analytics of the CNN, can be extended to other applications.
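As an illustrative sketch only (not the authors' implementation), the VA-regression stage described above could look as follows; random placeholder vectors stand in for the NMF basis parameters and displacement-map features, and the feature dimensions and label rule are assumptions for demonstration:

```python
# Sketch: two SVRs map per-image face features to Valence-Arousal values.
# In the real system the features come from the NMF part-based morphable
# model; here they are random placeholders.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Placeholder feature vectors (64-D, hypothetical dimensionality).
X_train = rng.normal(size=(200, 64))
# Synthetic fuzzy VA labels in [-1, 1] for demonstration only.
y_valence = np.tanh(X_train[:, 0] + 0.1 * rng.normal(size=200))
y_arousal = np.tanh(X_train[:, 1] + 0.1 * rng.normal(size=200))

# One SVR per affective dimension, as the abstract describes.
svr_v = SVR(kernel="rbf").fit(X_train, y_valence)
svr_a = SVR(kernel="rbf").fit(X_train, y_arousal)

# Predict VA coordinates for new frames; plotting the trajectory of
# these points in VA-space visualizes continuously changing emotion.
X_new = rng.normal(size=(5, 64))
va = np.column_stack([svr_v.predict(X_new), svr_a.predict(X_new)])
print(va.shape)  # one (valence, arousal) point per frame
```

Per-frame VA points like these can then be plotted as a trajectory in VA-space, which is the visualization scheme the abstract refers to.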

    FACETEQ interface demo for emotion expression in VR

    © 2017 IEEE. Faceteq prototype v.05 is a wearable technology for measuring facial expressions and biometric responses for experimental studies in Virtual Reality. Developed by the Emteq Ltd laboratory, Faceteq can enable new avenues for virtual reality research through a combination of high-performance patented dry sensor technologies, proprietary algorithms, and real-time data acquisition and streaming. Emteq founded the Faceteq project with the aim of providing an additional human-centered tool for emotion expression, affective human-computer interaction and social virtual environments. The proposed demonstration will exhibit the hardware and its functionality by allowing attendees to experience three of the showcase applications we developed this year.

    An Experimental Comparison of Two Machine Learning Approaches for Emotion Classification

    Correctly identifying an emotion has always been challenging for humans, not to mention machines! In this research, we use machine learning to classify human emotion. Emotional differences between genders are well documented in fields like psychology. We hypothesize that gender will impact the accuracy of classifying emotion with machine learning. Two different machine learning approaches were tested in an experimental study. In one approach, emotions from both genders were used to train a single machine. In the other approach, the genders were separated and two separate machines were used to learn the emotions of the two genders. Our preliminary results show that the approach where the genders were separated produces higher accuracy in classifying emotion.
    Keywords: emotion classification, facial expression, sexes, machine learning
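The two training regimes the study compares can be sketched as follows. This is a hedged illustration, not the authors' pipeline: the features are synthetic, and the gender-dependent label rule is a deliberately constructed assumption that makes the per-gender split advantageous:

```python
# Sketch: one combined classifier vs. separate per-gender classifiers.
# Synthetic features stand in for real facial-expression data, and the
# label rule is constructed (assumption) to differ by gender.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 600
gender = rng.integers(0, 2, size=n)       # 0 or 1
X = rng.normal(size=(n, 10))
# Assumed for illustration: the feature-to-emotion mapping flips sign
# between genders, so a single linear model cannot fit both groups.
sign = np.where(gender == 0, 1.0, -1.0)
y = ((sign * X.sum(axis=1)) > 0).astype(int)   # binary emotion label

Xtr, Xte, ytr, yte, gtr, gte = train_test_split(
    X, y, gender, test_size=0.3, random_state=0)

# Approach 1: a single model trained on both genders.
combined = LogisticRegression().fit(Xtr, ytr)
acc_combined = combined.score(Xte, yte)

# Approach 2: one model per gender, each evaluated on its own group.
accs = []
for g in (0, 1):
    model = LogisticRegression().fit(Xtr[gtr == g], ytr[gtr == g])
    accs.append(model.score(Xte[gte == g], yte[gte == g]))
acc_split = float(np.mean(accs))

print(acc_combined, acc_split)
```

Under this constructed assumption the split models score far higher than the combined one, mirroring the direction of the study's preliminary result; with real facial-expression data the gap would of course depend on how strongly the feature-emotion mapping actually differs by gender.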

    Retail managers’ preparedness to capture customers’ emotions: a new synergistic framework to exploit unstructured data with new analytics

    Although emotions have been investigated within the strategic management literature from an internal perspective, managers’ ability and willingness to understand consumers’ emotions, with emphasis on the retail sector, is still a scarcely explored theme in management research. The aim of this paper is to explore the match between the supply of new analytical tools and retail managers’ attitudes towards new tools to capture customers’ emotions. To this end, Study 1 uses machine learning algorithms to develop a new system to analytically detect emotional responses from customers’ static images (considering the exemplar emotions of happiness and sadness), whilst Study 2 consults management decision-makers to explore the practical utility of such emotion recognition systems, finding a likely demand for a number of applications, albeit tempered by concern for ethical issues. While contributing to the retail management literature with regard to customers’ emotions and big data analytics, the findings also provide a new framework to support retail managers in using new analytics to survive and thrive in difficult times.

    EU law and emotion data

    This article sheds light on the legal implications and challenges surrounding emotion data processing within the EU’s legal framework. Despite the sensitive nature of emotion data, the GDPR does not categorize it as special data, resulting in a lack of comprehensive protection. The article also discusses the nuances of different approaches to affective computing and their relevance to the processing of special data under the GDPR. Moreover, it points to potential tensions with data protection principles, such as fairness and accuracy. Our article also highlights some of the consequences, including harm, that processing of emotion data may have for the individuals concerned. Additionally, we discuss how the AI Act proposal intends to regulate affective computing. Finally, the article outlines the new obligations and transparency requirements introduced by the DSA for online platforms utilizing emotion data. Our article aims to raise awareness among the affective computing community about the applicable legal requirements when developing AC systems intended for the EU market, or when working with study participants located in the EU. We also stress the importance of protecting the fundamental rights of individuals even when the law struggles to keep up with technological developments that capture sensitive emotion data.
    Comment: 8 pages, 2023 11th International Conference on Affective Computing and Intelligent Interaction (ACII)