
    Emotion on the Road—Necessity, Acceptance, and Feasibility of Affective Computing in the Car

    Besides the reduction of energy consumption, which implies alternative actuation and lightweight construction, automobile development in the near future will be dominated by driver assistance and natural driver-car communication. The ability of a car to understand natural speech and to provide a human-like driver assistance system can be expected to be a decisive factor for market success, on a par with automatic driving systems. Emotional factors and affective states are thereby crucial for enhanced safety and comfort. This paper gives an extensive literature overview of work related to the influence of emotions on driving safety and comfort, the automatic recognition and control of emotions, and the improvement of in-car interfaces by affect-sensitive technology. Various use-case scenarios are outlined as possible applications for emotion-oriented technology in the vehicle. The possible acceptance of such future technology by drivers is assessed in a Wizard-of-Oz user study, and the feasibility of automatically recognising various driver states is demonstrated by an example system for monitoring driver attentiveness. An accuracy of 91.3% is reported for classifying in real time whether the driver is attentive or distracted.
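
    The abstract does not describe the classifier behind the 91.3% figure; as a rough illustration of the kind of real-time attentive/distracted classification it reports, the sketch below trains a simple binary model on hypothetical per-window driver features. The feature names, data source, and model choice are assumptions, not the paper's method.

        # Hypothetical sketch: binary attentive/distracted classification on
        # per-time-window driver features. Feature names, data layout and the
        # choice of model are illustrative assumptions, not the paper's method.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Assumed features per 1 s window: steering entropy, gaze-off-road ratio,
        # lane deviation, speech activity. Labels: 1 = attentive, 0 = distracted.
        X = rng.normal(size=(500, 4))
        y = rng.integers(0, 2, size=500)

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.3f}")

        # In a real-time setting the fitted model would be applied to each new
        # feature window as it arrives.
        clf.fit(X, y)
        new_window = rng.normal(size=(1, 4))
        print("attentive" if clf.predict(new_window)[0] == 1 else "distracted")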

    AutoEmotive: bringing empathy to the driving experience to manage stress

    With recent developments in sensing technologies, it is becoming feasible to comfortably measure several aspects of emotions during challenging daily-life situations. This work describes how the stress of drivers can be measured through different types of interactions, and how that information can enable several interactions in the car aimed at helping to manage stress. These new interactions could help not only to bring empathy to the driving experience but also to improve driver safety and increase social awareness. Funding: MIT Media Lab Consortium; National Science Foundation (U.S.) (Grant No. NSF CCF-1029585).

    User Experience (UX) and User Interface (UI) as a New Recipe of Academic Culture in Creative Industry

    This preliminary study aims to introduce User Experience (UX) and User Interface (UI) as a new recipe that could be implemented in academic culture to ensure that the adaptation between humans and technology becomes more effective and improves the creative industry. The study uses a survey questionnaire with purposive sampling to investigate the knowledge and perceptions of 30 design practitioners, and structured interviews with 9 randomly sampled practitioners to capture their perspectives. It concludes that 60% of design practitioners recognise the need for this new approach, while the interviews yield three main themes: acceptance level, vitality awareness, and tendency expectation. Keywords: Design Emotion, User Experience, User Interface, Academic Culture, Creative Industries. eISSN: 2398-4287 © 2020. The Authors. Published for AMER ABRA cE-Bs by e-International Publishing House, Ltd., UK. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer review under responsibility of AMER (Association of Malaysian Environment-Behaviour Researchers), ABRA (Association of Behavioural Researchers on Asians) and cE-Bs (Centre for Environment-Behaviour Studies), Faculty of Architecture, Planning & Surveying, Universiti Teknologi MARA, Malaysia. DOI: https://doi.org/10.21834/ebpj.v5iSI3.255

    Quantification of vascular function changes under different emotion states: A pilot study

    Recent studies have indicated that physiological parameters change with emotion states. This study aimed to quantify the changes of vascular function at different emotion and sub-emotion states. Twenty young subjects were studied, with their finger photoplethysmographic (PPG) pulses recorded in three distinct emotion states: natural (1 minute), happiness and sadness (10 minutes each). Within the happiness and sadness recordings, two sub-emotion states (calmness and outburst) were identified from the synchronously recorded videos. Reflection index (RI) and stiffness index (SI), two widely used indices of vascular function, were derived from the PPG pulses to quantify differences between the three emotion states, as well as between the two sub-emotion states. The results showed that, compared with the natural emotion, RI and SI decreased in both the happiness and sadness emotions. The decreases in RI were significant for both happiness and sadness (both P < 0.01), but the decrease in SI was only significant for sadness (P < 0.01). Moreover, when comparing happiness and sadness, there was a significant difference in RI (P < 0.01), but not in SI (P = 0.9). In addition, significantly larger RI values were observed with the outburst sub-emotion than with the calmness sub-emotion for both happiness and sadness (both P < 0.01), whereas significantly larger SI values were observed with the outburst sub-emotion only for sadness (P < 0.05). Gender hardly influenced the RI and SI results for all three emotion measurements. This pilot study confirmed that vascular function changes with different emotion states can be quantified by simple PPG measurement.
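
    As a rough illustration (not taken from the paper), the sketch below computes RI and SI from a single PPG pulse using their common definitions: RI as the ratio of the diastolic (reflected) peak amplitude to the systolic peak amplitude, and SI as subject height divided by the systolic-to-diastolic peak time delay. The pulse segmentation and peak detection are simplified assumptions.

        # Illustrative sketch of RI and SI from one PPG pulse, using the common
        # definitions RI = diastolic amplitude / systolic amplitude (x100 %) and
        # SI = subject height / peak-to-peak time delay. Peak detection here is a
        # simplified assumption, not the paper's processing pipeline.
        import numpy as np
        from scipy.signal import find_peaks

        def ri_si_from_pulse(pulse: np.ndarray, fs: float, height_m: float):
            """pulse: one baseline-corrected PPG pulse; fs: sampling rate (Hz)."""
            peaks, _ = find_peaks(pulse)
            if len(peaks) < 2:
                raise ValueError("could not locate systolic and diastolic peaks")
            # Assume the first detected peak is systolic, the second diastolic.
            sys_idx, dia_idx = peaks[0], peaks[1]
            ri = 100.0 * pulse[dia_idx] / pulse[sys_idx]   # percent
            delta_t = (dia_idx - sys_idx) / fs             # seconds
            si = height_m / delta_t                        # m/s
            return ri, si

        # Synthetic two-peak pulse just to exercise the function.
        fs = 100.0
        t = np.arange(0, 1.0, 1 / fs)
        pulse = np.exp(-((t - 0.2) ** 2) / 0.002) + 0.5 * np.exp(-((t - 0.45) ** 2) / 0.004)
        print(ri_si_from_pulse(pulse, fs, height_m=1.75))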

    Designing for frustration and disputes in the family car

    This article appears with the express permission of the publisher, IGI Global. Families spend an increasing amount of time in the car carrying out a number of activities, including driving to work, caring for children, and coordinating drop-offs and pick-ups. While families travelling in cars may face stress from difficult road conditions, they are also likely to be frustrated by coordinating multiple activities and resolving disputes within the confined space of the car. A rising number of in-car infotainment and driver-assistance systems aim to reduce stress from outside the vehicle and improve the experience of driving, but may fail to address sources of stress from within the car. From ethnographic studies of family car journeys, we examine the work of parents in managing multiple stresses while driving, along with the challenges of distraction from media use in the car. Keeping these family extracts as a focus for analysis, we draw out design considerations that build on the observations from our empirical work. Funding: Microsoft Research and the Dorothy Hodgkin Award.

    Employing consumer electronic devices in physiological and emotional evaluation of common driving activities

    It is important to equip future vehicles with an on-board system capable of tracking and analysing driver state in real time in order to mitigate the risk of human error in manual or semi-autonomous driving. This study aims to provide supporting evidence for the adoption of consumer-grade electronic devices in driver state monitoring. The study adopted a repeated-measures design and was performed in a high-fidelity driving simulator. A total of 39 participants of mixed age and gender took part in the user trials. A mobile application was developed to demonstrate how a mobile device can act as a host for a driver state monitoring system and support connectivity, synchronisation, and storage of driver-state-related measures from multiple devices. The results showed that multiple physiological measures, sourced from consumer-grade electronic devices, can be used to successfully distinguish task complexities across common driving activities. For instance, galvanic skin response and some heart rate derivatives were found to correlate with overall subjective workload ratings. Furthermore, emotions were captured and shown to be affected by extreme driving situations.
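
    The abstract does not name the exact measures or statistics used; as a rough sketch of the kind of analysis it describes, the code below computes a common heart-rate derivative (RMSSD from R-R intervals) and correlates it, together with a mean GSR level, against per-task subjective workload ratings. The variable names and data layout are assumptions.

        # Illustrative sketch: correlate physiological measures from wearables with
        # subjective workload ratings. The measures (RMSSD, mean GSR) and the data
        # layout are assumptions, not the study's exact analysis pipeline.
        import numpy as np
        from scipy.stats import pearsonr

        def rmssd(rr_intervals_ms: np.ndarray) -> float:
            """Root mean square of successive R-R interval differences (ms)."""
            diffs = np.diff(rr_intervals_ms)
            return float(np.sqrt(np.mean(diffs ** 2)))

        rng = np.random.default_rng(1)

        # One entry per (participant, driving task): hypothetical summaries.
        workload = rng.uniform(10, 90, size=40)                    # e.g. NASA-TLX score
        mean_gsr = 2.0 + 0.02 * workload + rng.normal(0, 0.3, 40)  # microsiemens
        rmssd_vals = np.array([
            rmssd(rng.normal(800 - 2 * w, 30, size=60)) for w in workload
        ])

        for name, values in [("mean GSR", mean_gsr), ("RMSSD", rmssd_vals)]:
            r, p = pearsonr(values, workload)
            print(f"{name} vs workload: r = {r:.2f}, p = {p:.3g}")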

    Environnement virtuel générateur d’émotions (Emotion-Inducing Virtual Environment)

    Emotions play an important role in daily decision-making. Indeed, they greatly influence how individuals interact with their environment. In this study, we first designed a virtual driving environment and then created emotion-inducing scenarios using the Belief-Desire-Intention (BDI) method. We evaluated the effectiveness of these scenarios with a group of 30 people, using an EEG headset to measure their emotions. Over 70% of the scenarios designed with this method induced the anticipated emotion in 52% to 76% of the participants. The second phase of the experiment addresses the reduction of emotions with a corrective agent. We observed that the corrective agent reduced emotions in 36.4% to 70.0% of the participants across the different scenarios.
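
    As a small illustration (with made-up scenario names, labels, and counts), the sketch below computes the kind of per-scenario statistic the study reports: the fraction of participants whose emotion, as labelled from the EEG headset, matched the emotion the scenario was designed to induce.

        # Illustrative sketch: per-scenario emotion-induction success rate, i.e. the
        # share of participants whose EEG-labelled emotion matched the anticipated
        # one. Scenario names, labels and counts are made up for illustration.
        from collections import defaultdict

        # (scenario, anticipated_emotion, measured_emotion) per participant.
        observations = [
            ("sudden_cut_in", "anger", "anger"),
            ("sudden_cut_in", "anger", "fear"),
            ("traffic_jam", "frustration", "frustration"),
            ("traffic_jam", "frustration", "frustration"),
            ("scenic_drive", "calm", "calm"),
            ("scenic_drive", "calm", "joy"),
        ]

        hits = defaultdict(int)
        totals = defaultdict(int)
        for scenario, anticipated, measured in observations:
            totals[scenario] += 1
            hits[scenario] += int(measured == anticipated)

        for scenario in totals:
            rate = 100.0 * hits[scenario] / totals[scenario]
            print(f"{scenario}: anticipated emotion induced in {rate:.0f}% of participants")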

    Towards hybrid driver state monitoring: review, future perspectives and the role of consumer electronics

    The purpose of this paper is to bring together multiple literature sources which present innovative methodologies for the assessment of driver state, driving context, and performance by means of technology within a vehicle and consumer electronic devices. It also provides an overview of ongoing research and trends in the area of driver state monitoring. As part of this review, a model of a hybrid driver state monitoring system is proposed. The model incorporates technology within a vehicle and multiple brought-in devices for enhanced validity and reliability of recorded data. Additionally, the model draws upon the requirement of data fusion in order to generate unified driver state indicator(s) that could be used to modify in-vehicle information and safety systems, hence making them driver-state adaptable. Such modification could help to reach optimal driving performance in a particular driving situation. To conclude, we discuss the advantages of integrating a hybrid driver state monitoring system into a vehicle and suggest future areas of research.
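
    The paper proposes the fusion model conceptually; as an assumed sketch of what fusing several sensor streams into a unified driver-state indicator could look like, the code below normalises each measure and combines them with fixed weights. The measure names, value ranges, and weights are illustrative assumptions, not the paper's proposed model.

        # Illustrative sketch of fusing multiple driver-state measures into one
        # unified indicator in [0, 1]. Measure names, value ranges and weights are
        # assumptions for illustration, not the paper's proposed model.
        from dataclasses import dataclass

        @dataclass
        class Measure:
            value: float   # latest reading from a vehicle or brought-in device
            lo: float      # assumed value corresponding to a fully alert state
            hi: float      # assumed value corresponding to a fully impaired state
            weight: float  # relative contribution to the fused indicator

        def fuse(measures: dict[str, Measure]) -> float:
            """Weighted average of min-max normalised measures (0 = alert, 1 = impaired)."""
            total_w = sum(m.weight for m in measures.values())
            score = 0.0
            for m in measures.values():
                norm = (m.value - m.lo) / (m.hi - m.lo)
                score += m.weight * min(max(norm, 0.0), 1.0)
            return score / total_w

        driver_state = fuse({
            "eyelid_closure_pct": Measure(value=18.0, lo=5.0, hi=80.0, weight=0.4),
            "heart_rate_bpm":     Measure(value=62.0, lo=75.0, hi=45.0, weight=0.3),
            "steering_reversals": Measure(value=3.0,  lo=1.0,  hi=10.0, weight=0.3),
        })
        print(f"unified driver-state indicator: {driver_state:.2f}")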

    EmoSet: A Large-scale Visual Emotion Dataset with Rich Attributes

    Visual Emotion Analysis (VEA) aims at predicting people's emotional responses to visual stimuli. This is a promising, yet challenging, task in affective computing, which has drawn increasing attention in recent years. Most of the existing work in this area focuses on feature design, while little attention has been paid to dataset construction. In this work, we introduce EmoSet, the first large-scale visual emotion dataset annotated with rich attributes, which is superior to existing datasets in four aspects: scale, annotation richness, diversity, and data balance. EmoSet comprises 3.3 million images in total, with 118,102 of these images carefully labeled by human annotators, making it five times larger than the largest existing dataset. EmoSet includes images from social networks as well as artistic images, and it is well balanced between different emotion categories. Motivated by psychological studies, in addition to the emotion category, each image is also annotated with a set of describable emotion attributes: brightness, colorfulness, scene type, object class, facial expression, and human action, which can help understand visual emotions in a precise and interpretable way. The relevance of these emotion attributes is validated by analyzing the correlations between them and visual emotion, as well as by designing an attribute module to help visual emotion recognition. We believe EmoSet will bring key insights and encourage further research in visual emotion analysis and understanding. Project page: https://vcc.tech/EmoSet. Comment: Accepted to ICCV 2023; similar to the final version.
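
    The abstract does not specify the annotation file layout; as a rough sketch of how records carrying both an emotion category and the listed attributes might be represented and filtered, the code below defines a small data structure. The field values and record layout are hypothetical, not EmoSet's actual format.

        # Hypothetical sketch of an annotation record carrying an emotion category
        # plus the attributes named in the abstract. The record layout and field
        # values are assumptions for illustration, not EmoSet's actual file format.
        from dataclasses import dataclass

        @dataclass
        class EmoSetRecord:
            image_path: str
            emotion: str          # one of the dataset's emotion categories
            brightness: float
            colorfulness: float
            scene_type: str
            object_class: str
            facial_expression: str
            human_action: str

        records = [
            EmoSetRecord("img/000001.jpg", "amusement", 0.72, 0.55,
                         "beach", "person", "smile", "running"),
            EmoSetRecord("img/000002.jpg", "sadness", 0.31, 0.20,
                         "street", "umbrella", "none", "walking"),
        ]

        # Example query: all records of a given emotion above a brightness threshold.
        bright_amusement = [r for r in records
                            if r.emotion == "amusement" and r.brightness > 0.5]
        print(len(bright_amusement))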