
    Integrating body movement into attractiveness research

    People judge attractiveness and make trait inferences from the physical appearance of others, and research reveals high agreement among observers making such judgments. Evolutionary psychologists have argued that interest in physical appearance and beauty reflects adaptations that motivate the search for desirable qualities in a potential partner. Although men more than women value the physical appearance of a partner, appearance universally affects social perception in both sexes. Most studies of attractiveness perceptions have focused on third-party assessments of static representations of the face and body. Corroborating evidence suggests that body movement, such as dance, also conveys information about mate quality. Here we review evidence that dynamic cues (e.g., gait, dance) also influence perceptions of mate quality, including personality traits, strength, and overall attractiveness. We recommend that attractiveness research consider the informational value of body movement in addition to static cues, to present an integrated perspective on human social perception.

    New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). This research was supported by the Australian Research Council (http://www.arc.gov.au/) grant DP110100850 to RP and EM and the Australian Research Council Centre of Excellence for Cognition and its Disorders (CE110001021) http://www.ccd.edu.au. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

    The expression and assessment of emotions and internal states in individuals with severe or profound intellectual disabilities

    The expression of emotions and internal states by individuals with severe or profound intellectual disabilities is a comparatively under-researched area. Comprehensive or standardised methods of assessing or understanding the emotions and internal states within this population, whose ability to communicate is significantly compromised, do not exist. The literature base will be discussed and compared to that within the general population. Methods of assessing broader internal states, notably depression, anxiety, and pain within severe or profound intellectual disabilities are also addressed. Finally, this review will examine methods of assessing internal states within genetic syndromes, including hunger, social anxiety, and happiness within Prader-Willi, Fragile-X, and Angelman syndromes. This will then allow for the identification of robust methodologies used in assessing the expression of these internal states, some of which may be useful when considering how to assess emotions within individuals with intellectual disabilities.

    How major depressive disorder affects the ability to decode multimodal dynamic emotional stimuli

    Most studies investigating the processing of emotions in depressed patients reported impairments in the decoding of negative emotions. However, these studies adopted static stimuli (mostly stereotypical facial expressions corresponding to basic emotions) which do not reflect the way people experience emotions in everyday life. For this reason, this work proposes to investigate the decoding of emotional expressions in patients affected by Recurrent Major Depressive Disorder (RMDDs) using dynamic audio/video stimuli. RMDDs’ performance is compared with the performance of patients with Adjustment Disorder with Depressed Mood (ADs) and healthy control (HC) subjects. The experiments involve 27 RMDDs (16 with acute depression - RMDD-A, and 11 in a compensation phase - RMDD-C), 16 ADs and 16 HCs. The ability to decode emotional expressions is assessed through an emotion recognition task based on short audio (without video), video (without audio) and audio/video clips. The results show that AD patients are significantly less accurate than HCs in decoding fear, anger, happiness, surprise and sadness. RMDD-As are significantly less accurate than HCs in decoding happiness, sadness and surprise. Finally, no significant differences were found between HCs and RMDD-Cs. The different communication channels and the types of emotion play a significant role in limiting the decoding accuracy.

    Plug-in to fear: game biosensors and negative physiological responses to music

    The games industry is beginning to embark on an ambitious journey into the world of biometric gaming in search of more exciting and immersive gaming experiences. Whether or not biometric game technologies hold the key to unlock the “ultimate gaming experience” hinges not only on technological advancements alone but also on the game industry’s understanding of physiological responses to stimuli of different kinds, and its ability to interpret physiological data in terms of indicative meaning. With reference to horror genre games and music in particular, this article reviews some of the scientific literature relating to specific physiological responses induced by “fearful” or “unpleasant” musical stimuli, and considers some of the challenges facing the games industry in its quest for the ultimate “plugged-in” experience.

    Linking recorded data with emotive and adaptive computing in an eHealth environment

    Telecare, and particularly lifestyle monitoring, currently relies on the ability to detect and respond to changes in individual behaviour using data derived from sensors around the home. This means that a significant aspect of behaviour, that of an individual's emotional state, is not accounted for in reaching a conclusion as to the form of response required. The linked concepts of emotive and adaptive computing offer an opportunity to include information about emotional state, and the paper considers how current developments in this area have the potential to be integrated within telecare and other areas of eHealth. In doing so, it looks at the development and current state of the art of both emotive and adaptive computing, including their conceptual background, and places them into an overall eHealth context for application and development.

    Effects of aging on identifying emotions conveyed by point-light walkers

    M.G. was supported by EC FP7 HBP (grant 604102), PITN-GA-011-290011 (ABC), FP7-ICT-2013-10/611909 (KOROIBOT), by GI 305/4-1 and KA 1258/15-1, and BMBF, FKZ: 01GQ1002A. K.S.P. was supported by a BBSRC New Investigator Grant. A.B.S. and P.J.B. were supported by an operating grant (528206) from the Canadian Institutes for Health Research. The authors also thank Donna Waxman for her valuable help in data collection for all experiments described here. Peer reviewed. Postprint.

    Recognizing Emotions in a Foreign Language

    Expressions of basic emotions (joy, sadness, anger, fear, disgust) can be recognized pan-culturally from the face, and it is assumed that these emotions can be recognized from a speaker's voice, regardless of an individual's culture or linguistic ability. Here, we compared how monolingual speakers of Argentine Spanish recognize basic emotions from pseudo-utterances ("nonsense speech") produced in their native language and in three foreign languages (English, German, Arabic). Results indicated that vocal expressions of basic emotions could be decoded in each language condition at accuracy levels exceeding chance, although Spanish listeners performed significantly better overall in their native language ("in-group advantage"). Our findings argue that the ability to understand vocally-expressed emotions in speech is partly independent of linguistic ability and involves universal principles, although this ability is also shaped by linguistic and cultural variables.
