Emotions in context: examining pervasive affective sensing systems, applications, and analyses
Pervasive sensing has opened up new opportunities for measuring our feelings and understanding our behavior by monitoring our affective states while mobile. This review paper surveys pervasive affect sensing by examining three major elements of affective pervasive systems, namely "sensing", "analysis", and "application". Sensing investigates the different sensing modalities used in existing real-time affective applications; Analysis explores different approaches to emotion recognition and visualization based on different types of collected data; and Application investigates different leading areas of affective applications. For each of the three aspects, the paper includes an extensive survey of the literature and finally outlines some of the challenges and future research opportunities of affective sensing in the context of pervasive computing.
iCanLearn: A Mobile Application for Creating Flashcards and Social Stories™ for Children with Autism
The number of children being diagnosed with Autism Spectrum Disorder (ASD) is on the rise, presenting new challenges for their parents and teachers to overcome. At the same time, mobile computing has been seeping its way into every aspect of our lives in the form of smartphones and tablet computers. It seems only natural to harness the unique medium these devices provide and use it in treatment and intervention for children with autism.
This thesis discusses and evaluates iCanLearn, an iOS flashcard app with enough versatility to construct Social Stories™. iCanLearn provides an engaging, individualized learning experience to children with autism on a single device, but the most powerful way to use iCanLearn is by connecting two or more devices together in a teacher-learner relationship. The evaluation results are presented at the end of the thesis.
Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback
This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but can only convey a limited range of emotions within two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations: vibrotactile + thermal, vibrotactile + visual and visual + thermal. Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli to convey a range of emotions.
Nose Heat: Exploring Stress-induced Nasal Thermal Variability through Mobile Thermal Imaging
Automatically monitoring and quantifying stress-induced thermal dynamic information in real-world settings is an extremely important but challenging problem. In this paper, we explore whether we can use mobile thermal imaging to measure the rich physiological cues of mental stress that can be deduced from a person's nose temperature. To answer this question we build i) a framework for continuously monitoring nasal thermal variability patterns and ii) a novel set of thermal variability metrics to capture the richness of the dynamic information. We evaluated our approach in a series of studies including laboratory-based psychosocial stress-induction tasks and real-world factory settings. We demonstrate that our approach has the potential for assessing stress responses beyond controlled laboratory settings.
What does touch tell us about emotions in touchscreen-based gameplay?
This is the post-print version of the article. The official published version can be accessed from the link below. Copyright © 2012 ACM. It is posted here by permission of ACM for your personal use. Not for redistribution.

Nowadays, more and more people play games on touch-screen mobile phones. This phenomenon raises a very interesting question: does touch behaviour reflect the player's emotional state? If so, this would be a valuable indicator not only for game designers but also for real-time personalization of the game experience. Psychology studies on acted touch behaviour show the existence of discriminative affective profiles. In this paper, finger-stroke features during gameplay on an iPod were extracted and their discriminative power analysed. Based on touch behaviour, machine learning algorithms were used to build systems for automatically discriminating between four emotional states (Excited, Relaxed, Frustrated, Bored), two levels of arousal and two levels of valence. The results were very interesting, reaching between 69% and 77% correct discrimination between the four emotional states. Higher results (~89%) were obtained for discriminating between two levels of arousal and two levels of valence.
- …