Recognizing Emotions in a Foreign Language
Expressions of basic emotions (joy, sadness, anger, fear, disgust) can be recognized pan-culturally from the face, and it is assumed that these emotions can be recognized from a speaker's voice, regardless of an individual's culture or linguistic ability. Here, we compared how monolingual speakers of Argentine Spanish recognize basic emotions from pseudo-utterances ("nonsense speech") produced in their native language and in three foreign languages (English, German, Arabic). Results indicated that vocal expressions of basic emotions could be decoded in each language condition at accuracy levels exceeding chance, although Spanish listeners performed significantly better overall in their native language ("in-group advantage"). Our findings argue that the ability to understand vocally-expressed emotions in speech is partly independent of linguistic ability and involves universal principles, although this ability is also shaped by linguistic and cultural variables.
Reading Your Counterpart: The Benefit of Emotion Recognition Accuracy for Effectiveness in Negotiation
Journal of Nonverbal Behavior, 31(4), 205-223. doi:10.1007/s10919-007-0033-7
Human Perception of Fear in Dogs Varies According to Experience with Dogs
To investigate the role of experience in humans’ perception of emotion using canine visual signals, we asked adults with various levels of dog experience to interpret the emotions of dogs displayed in videos. The video stimuli had been pre-categorized by an expert panel of dog behavior professionals as showing examples of happy or fearful dog behavior. In a sample of 2,163 participants, the level of dog experience strongly predicted identification of fearful, but not of happy, emotional examples. The probability of selecting the “fearful” category to describe fearful examples increased with experience and ranged from .30 among those who had never lived with a dog to greater than .70 among dog professionals. In contrast, the probability of selecting the “happy” category to describe happy emotional examples varied little by experience, ranging from .90 to .93. In addition, the number of physical features of the dog that participants reported using for emotional interpretations increased with experience, and in particular, more-experienced respondents were more likely to attend to the ears. Lastly, more-experienced respondents provided lower difficulty and higher accuracy self-ratings than less-experienced respondents when interpreting both happy and fearful emotional examples. The human perception of emotion in other humans has previously been shown to be sensitive to individual differences in social experience, and the results of the current study extend the notion of experience-dependent processes from the intraspecific to the interspecific domain.
Eyes Are Windows to the Chinese Soul: Evidence from the Detection of Real and Fake Smiles
How do people interpret the meaning of a smile? Previous studies with Westerners have found that both the eyes and the mouth are crucial in identifying and interpreting smiles, yet less is known about Easterners. Here we report that when Chinese participants judged Duchenne and non-Duchenne smiles as either real or fake, their accuracy and sensitivity were negatively correlated with their individualism scores but positively correlated with their collectivism scores. However, such correlations were found only for participants who stated that the eyes were the most useful reference, not for those who favored the mouth. Moreover, participants who favored the eyes were more accurate and sensitive than those who favored the mouth. Our results thus indicate that Chinese participants who follow the typical Eastern decoding process of using the eyes as diagnostic cues to identify and interpret others' facial expressions and social intentions are particularly accurate and sensitive, and increasingly so the more strongly they report collectivistic rather than individualistic values.
The phosphomimetic mutation of syndecan-4 binds and inhibits Tiam1 modulating Rac1 activity in PDZ interaction-dependent manner
The small GTPases of the Rho family, comprising RhoA, Rac1 and Cdc42, function as molecular switches controlling several essential biochemical pathways in eukaryotic cells. Their activity cycles between an active GTP-bound and an inactive GDP-bound conformation, and the exchange of GDP for GTP is catalyzed by guanine nucleotide exchange factors (GEFs). Here we report a novel regulatory mechanism of Rac1 activity, which is controlled by a phosphomimetic (Ser179Glu) mutant of syndecan-4 (SDC4), a ubiquitously expressed transmembrane heparan sulfate proteoglycan. In this study we show that the Ser179Glu mutant strongly binds Tiam1, a Rac1-GEF, reducing Rac1-GTP levels 3-fold in MCF-7 breast adenocarcinoma cells. Mutational analysis reveals that the PDZ interaction between SDC4 and Tiam1 is indispensable for the suppression of Rac1 activity. Neither SDC4 interaction alone is sufficient to block Rac1 activity; on the contrary, the lack of either interaction can increase Rac1 activity, so the observed Rac1 activity is the net result of inhibitory and stimulatory effects. In addition, SDC4 can bind and tether RhoGDI1 (GDP-dissociation inhibitor 1) to the membrane. Expression of the phosphomimetic SDC4 results in the accumulation of the Rac1-RhoGDI1 complex, and co-immunoprecipitation assays (co-IPs) confirm that SDC4 can form complexes with RhoGDI1. Together, these findings indicate that the basal activity of Rac1 is finely tuned and that SDC4 is implicated in its regulation in multiple ways.
Bengali translation and characterisation of four cognitive and trait measures for autism spectrum conditions in India
Background
Autism is characterised by atypical social-communicative behaviour and a restricted range of interests and repetitive behaviours. These features lie on a continuum in the general population. Behavioural measures validated across cultures and languages are required to quantify the dimensional traits of autism in these social and non-social domains. Bengali is the seventh most spoken language in the world. However, there is a serious dearth of data on standard measures of autism-related social and visual cognition in Bengali.
Methods
Bengali translations of two measures related to social-communicative functioning (the Children’s Reading the Mind in the Eyes Test (RMET) and a facial emotion recognition test with stimuli taken from the Karolinska Directed Emotional Faces database), one measure of visual perceptual disembedding (the Embedded Figures Test), and a questionnaire measure (the Children’s Empathy Quotient) were tested in 25 children with autism spectrum conditions (ASC) and 26 control children (mean age = 10.7 years) in Kolkata, India. Group differences were analysed by t test and multiple regression (after accounting for potential effects of gender, IQ, and age).
Results
Behavioural and trait measures were associated with group differences in the expected directions: ASC children scored lower on the Children’s Empathy Quotient and the RMET, as well as on facial emotion recognition, but were faster and more accurate on the Embedded Figures Test. Distributional properties of these measures within groups are similar to those reported in Western countries.
Conclusions
These results provide an empirical demonstration of the cross-cultural generalisability and applicability of these standard behavioural and trait measures related to autism in a major world language.
On the Perception of Religious Group Membership from Faces
Background
The study of social categorization has largely been confined to examining groups distinguished by perceptually obvious cues. Yet many ecologically important group distinctions are less clear, permitting insights into the general processes involved in person perception. Although religious group membership is thought to be perceptually ambiguous, folk beliefs suggest that Mormons and non-Mormons can be categorized from their appearance. We tested whether Mormons could be distinguished from non-Mormons and investigated the basis for this effect to gain insight into how subtle perceptual cues can support complex social categorizations.
Methodology/Principal Findings
Participants categorized Mormons' and non-Mormons' faces or facial features according to their group membership. Individuals could distinguish between the two groups significantly better than chance from full faces, from faces without hair, with the eyes and mouth covered, without the outer face shape, and when inverted 180°, but not from isolated features (i.e., eyes, nose, or mouth). Perceivers' estimations of their accuracy did not match their actual accuracy. Exploration of the remaining features showed that Mormons and non-Mormons significantly differed in perceived health and that these perceptions were related to perceptions of skin quality, as demonstrated in a structural equation model representing the contributions of skin color and skin texture. Other judgments related to health (facial attractiveness, facial symmetry, and structural aspects related to body weight) did not differ between the two groups. Perceptions of health were also responsible for differences in perceived spirituality, explaining folk hypotheses that Mormons are distinct because they appear more spiritual than non-Mormons.
Conclusions/Significance
Subtle markers of group membership can influence how others are perceived and categorized. Perceptions of health from non-obvious and minimal cues distinguished individuals according to their religious group membership. These data illustrate how the non-conscious detection of very subtle differences in others' appearances supports cognitively complex judgments such as social categorization.
Facial expression training optimises viewing strategy in children and adults
This study investigated whether training-related improvements in facial expression categorization are facilitated by spontaneous changes in gaze behaviour in adults and nine-year-old children. Four sessions of a self-paced, free-viewing training task required participants to categorize happy, sad and fearful expressions of varying intensities. No instructions about eye movements were given. Eye movements were recorded in the first and fourth training sessions. New faces were introduced in session four to establish transfer effects of learning. Adults focused most on the eyes in all sessions, and increased expression categorization accuracy after training coincided with a strengthening of this eye bias in gaze allocation. In children, training-related behavioural improvements coincided with an overall shift of gaze focus towards the eyes (resulting in more adult-like gaze distributions) and, for happy faces, towards the mouth in the second fixation. Gaze distributions were not influenced by expression intensity or by the introduction of new faces. It was proposed that training enhanced the use of a uniform, predominantly eye-biased gaze strategy in children in order to optimise the extraction of cues relevant for discriminating between subtle facial expressions.