
    Human recognition of basic emotions from posed and animated dynamic facial expressions

    Facial expressions are crucial for social communication, especially because they make it possible to express and perceive unspoken emotional and mental states. For example, neurodevelopmental disorders with social communication deficits, such as Asperger syndrome (AS), often involve difficulties in interpreting emotional states from the facial expressions of others. Rather little is known about the role of dynamics in recognizing emotions from faces. Better recognition of dynamic than static facial expressions of the six basic emotions has been reported with animated faces; however, this result has not been confirmed reliably with real human faces. This thesis evaluates the role of dynamics in recognizing basic expressions from animated and human faces. With human faces, the interaction between dynamics and the removal of fine details by low-pass filtering (blurring) is further studied in adult individuals with and without AS. The results confirmed that dynamics facilitates the recognition of emotional facial expressions. This effect, however, was apparent only with facial animation stimuli lacking detailed static facial features and other emotional cues, and with blurred human faces. Some dynamic emotional animations were recognized drastically better than their static counterparts. With basic expressions posed by human actors, the advantage of dynamic over static displays increased as a function of the blur level. Participants with and without AS performed similarly in recognizing basic emotions from the original non-filtered faces and from dynamic vs. static facial expressions, suggesting that AS involves intact recognition of simple emotional states and movement from faces. Participants with AS were affected more by the removal of fine details than participants without AS. This result supports a "weak central coherence" account, which suggests that AS and other autistic spectrum disorders are characterized by general perceptual difficulties in processing global vs. local level features.
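    The blurring manipulation mentioned above is, in essence, spatial low-pass filtering of the face images. A minimal sketch of how such blur levels could be produced, assuming Python with OpenCV; the file name and sigma values are illustrative and not taken from the thesis:

        import cv2

        # Load a face image in grayscale (the path is a placeholder).
        face = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)
        assert face is not None, "replace 'face.png' with a real image path"

        # Gaussian blur acts as a spatial low-pass filter: larger sigmas remove
        # progressively finer facial detail, giving increasing blur levels.
        for sigma in (2, 4, 8):
            blurred = cv2.GaussianBlur(face, (0, 0), sigma)
            cv2.imwrite(f"face_blur_sigma{sigma}.png", blurred)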

    Intragroup Emotions: Physiological Linkage and Social Presence

    We investigated how technologically mediating two different components of emotion, communicative expression and physiological state, to group members affects physiological linkage and self-reported feelings in a small group during video viewing. The conditions varied the availability of a second-screen text chat (communicative expression) and of a visualization of group-level heart rates and their dyadic linkage (physiological state). Within each four-person group, two participants formed a physically co-located dyad and the other two were individually situated in two separate rooms. We found that text chat always increased heart rate synchrony, whereas the heart rate visualization did so only for non-co-located dyads. We also found that physiological linkage was strongly connected to self-reported social presence. The results encourage further exploration of sharing group members' physiological components of emotion by technological means to enhance mediated communication and strengthen social presence.
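    The abstract does not say how physiological linkage was computed; one common choice for heart-rate synchrony is the mean windowed correlation between two members' heart-rate series. A minimal sketch under that assumption (Python with NumPy; the window length and the simulated data are purely illustrative):

        import numpy as np

        def windowed_hr_synchrony(hr_a, hr_b, window=30):
            # Mean Pearson correlation over non-overlapping windows of `window`
            # samples; one common linkage measure, not necessarily the study's.
            hr_a, hr_b = np.asarray(hr_a, float), np.asarray(hr_b, float)
            corrs = []
            for start in range(0, len(hr_a) - window + 1, window):
                a = hr_a[start:start + window]
                b = hr_b[start:start + window]
                if a.std() > 0 and b.std() > 0:
                    corrs.append(np.corrcoef(a, b)[0, 1])
            return float(np.mean(corrs)) if corrs else float("nan")

        # Illustrative data: two noisy heart-rate series sampled once per second.
        rng = np.random.default_rng(0)
        base = 70 + 5 * np.sin(np.linspace(0, 10, 300))
        hr_1 = base + rng.normal(0, 1, 300)
        hr_2 = base + rng.normal(0, 1, 300)
        print(windowed_hr_synchrony(hr_1, hr_2))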

    Virtual character facial expressions influence human brain and facial EMG activity in a decision-making game

    We examined the effects of the emotional facial expressions of a virtual character (VC) on human frontal electroencephalographic (EEG) asymmetry (putatively indexing approach/withdrawal motivation), facial electromyographic (EMG) activity (emotional expressions), and social decision making (cooperation/defection). In a within-subjects design, the participants played the Iterated Prisoner's Dilemma game with VCs with different dynamic facial expressions (predefined or dependent on the participant's electrodermal and facial EMG activity). In general, VC facial expressions elicited congruent facial muscle activity. However, both frontal EEG asymmetry and facial EMG activity elicited by an angry VC facial expression varied as a function of preceding interactional events (human collaboration/defection). Pre-decision inner emotional-motivational processes and emotional facial expressions were dissociated, suggesting that human goals influence pre-decision frontal asymmetry, whereas display rules may affect (pre-decision) emotional expressions in human-VC interaction. An angry VC facial expression, high pre-decision corrugator EMG activity, and relatively greater left frontal activation predicted the participant's decision to defect. Both post-decision frontal asymmetry and facial EMG activity were related to reciprocal cooperation. The results suggest that the justifiability of VC emotional expressions and the perceived fairness of VC actions influence human emotional responses.
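    Frontal EEG asymmetry in this literature is conventionally indexed as the difference of log-transformed alpha power at homologous right and left frontal electrodes (e.g., F4 and F3), with alpha power taken as inversely related to cortical activation; the abstract does not give the exact derivation used here. A minimal sketch of that conventional index, assuming Python with NumPy and SciPy; channel names, sampling rate, and data are illustrative:

        import numpy as np
        from scipy.signal import welch

        def alpha_power(signal, fs, band=(8.0, 13.0)):
            # Mean power spectral density in the alpha band (Welch estimate).
            freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return psd[mask].mean()

        def frontal_asymmetry(left_f3, right_f4, fs):
            # ln(right alpha) - ln(left alpha): higher values are conventionally
            # read as relatively greater left frontal activation (approach).
            return np.log(alpha_power(right_f4, fs)) - np.log(alpha_power(left_f3, fs))

        # Illustrative two-channel recording at 256 Hz.
        fs = 256
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(1)
        f3 = 1.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
        f4 = 0.8 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
        print(frontal_asymmetry(f3, f4, fs))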

    Empowerment and embodiment for collaborative mixed reality systems

    We present several mixed-reality-based remote collaboration settings that use consumer head-mounted displays, and we investigated how two people are able to work together in these settings. We found that the person in the augmented reality (AR) system was regarded as the "leader" (i.e., they provided a greater contribution to the collaboration), whereas no similar "leader" emerged in the AR-to-AR and AR-to-VRBody settings. We also found that these patterns of leadership emerged only for 3D interactions and not for 2D interactions. Results concerning the participants' experience of leadership, collaboration, embodiment, presence, and copresence shed further light on these findings.

    The effect of spatial frequency and face inversion on facial expression processing in children with autism spectrum disorder

    To investigate whether facial expression processing in children with autism spectrum disorder (ASD) is based on local information in the stimuli, we prepared low spatial frequency (LSF) images with blurred facial features and high spatial frequency (HSF) images with rich facial features from broad (normal) spatial frequency (BSF) images. Eighteen children with ASD (mean age 11.9 years) and 19 typically developing (TD) children (mean age 11.4 years), matched on nonverbal IQ, were presented with these stimuli in upright and inverted orientations. The children with ASD had difficulty in processing facial expressions from the BSF and LSF images, but not from the HSF images. In addition, the BSF and HSF images elicited the inversion effect in the TD children, but not in the children with ASD. In contrast, the LSF images elicited the inversion effect in both groups of children. These results suggest that children with ASD are biased towards processing facial expressions based on local information, even though their capacity to process facial expressions configurally is spared.
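    The LSF and HSF versions described above are typically derived by filtering the BSF image in the Fourier domain at a cutoff expressed in cycles per image (or per face width); the cutoffs and filter shape used in this study are not stated in the abstract. A minimal sketch of one such manipulation, assuming Python with NumPy and a Gaussian frequency-domain filter; all parameter values are illustrative:

        import numpy as np

        def spatial_frequency_filter(image, cutoff_cycles, keep="low"):
            # Low- or high-pass a grayscale image (2D float array) with a Gaussian
            # gain in the Fourier domain; `cutoff_cycles` is the filter's standard
            # deviation in cycles per image.
            h, w = image.shape
            fy = np.fft.fftfreq(h) * h  # vertical frequencies, cycles per image
            fx = np.fft.fftfreq(w) * w  # horizontal frequencies, cycles per image
            radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
            lowpass = np.exp(-(radius ** 2) / (2 * cutoff_cycles ** 2))
            gain = lowpass if keep == "low" else 1.0 - lowpass
            return np.real(np.fft.ifft2(np.fft.fft2(image) * gain))

        # Illustrative use: derive LSF and HSF versions from a broad-band image.
        rng = np.random.default_rng(2)
        bsf = rng.random((256, 256))  # stand-in for a BSF face photograph
        lsf = spatial_frequency_filter(bsf, cutoff_cycles=8, keep="low")
        hsf = spatial_frequency_filter(bsf, cutoff_cycles=8, keep="high")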

    A systematic review of how emotional self-awareness is defined and measured when comparing autistic and non-autistic groups

    We would like to sincerely thank all the authors who shared their data with us. We would also like to thank Ira Lesser, Taylor Graeme, and Arvid Heiberg for kindly sharing their articles for the historical review. The review was conducted as part of CFH's PhD studies. We would like to thank the Northwood Trust, UK, for their financial support for this research. Research data are available upon request from the first author.