16 research outputs found
The role of emotion in the linguistic and pragmatic aspects of aphasic performance
Considerations of aphasics' performance typically focus on aspects of linguistic impairment. Similarly, researchers tend to emphasize right brain-damaged subjects' relatively poor performance in response to emotional content or context. The spared or heightened emotional abilities of aphasic communication often go unnoticed. Research will be reviewed which suggests that aphasics can successfully utilize emotion in the comprehension and expression of both linguistic and pragmatic content and contexts. Evidence from a wide range of research on lexical processing, prosody, and discourse will be reviewed which indicates that emotion may play a facilitatory role in the comprehension and production of communication in language-impaired people. A large group study involving 15 left brain-damaged, 12 right brain-damaged, and 16 normal control subjects was carried out to investigate posed and spontaneous emotional expression and perception, including the vocal and verbal, as well as facial, channels for spontaneous expression. Results will be considered with respect to the neuropsychological organization of linguistic and emotional cognitive systems.
Channels of emotional expression in patients with unilateral brain damage
The contribution of facial, intonational, and speech channels to spontaneous emotional expression was examined in right brain-damaged (RBD), left brain-damaged (LBD), and normal control (NC) subjects. Subjects were videotaped while viewing and responding to a series of emotionally laden slides; the videotapes were then rated for the three channels of communication. Overall, RBDs used facial expression and intonation less frequently than the other two groups. When the speech output channel was analyzed, oral expression of feelings in the RBDs, relative to the LBDs and NCs, was less appropriate, more propositional than prosodic, and more descriptive than affective. When the ratings for the three channels of communication were examined, facial expression and intonation were significantly correlated for all subjects.
Effect of emotional context on bucco-facial apraxia
Patients with left- and right-hemisphere cerebrovascular pathology and normal adult controls were videotaped while executing tasks of bucco-facial praxis in emotional and nonemotional conditions. Each praxic movement was assessed for accuracy and motor execution. Left-brain-damaged patients were significantly impaired on these tasks relative to right-brain-damaged patients and controls. When emotional context was provided, apractic performance improved significantly.
Emotional and non-emotional facial behaviour in patients with unilateral brain damage
Aspects of emotional facial expression (responsivity, appropriateness, intensity) were examined in brain-damaged adults with right or left hemisphere cerebrovascular lesions and in normal controls. Subjects were videotaped during experimental procedures designed to elicit emotional facial expression and non-emotional facial movement (paralysis, mobility, praxis). On tasks of emotional facial expression, patients with right hemisphere pathology were less responsive and less appropriate than patients with left hemisphere pathology or normal controls. These results corroborate other research findings that the right cerebral hemisphere is dominant for the expression of facial emotion. Both brain-damaged groups had substantial facial paralysis and impairment in muscular mobility on the hemiface contralateral to the site of the lesion, and the left brain-damaged group had bucco-facial apraxia. Performance measures of emotional expression and non-emotional movement were uncorrelated, suggesting a dissociation between these two systems of facial behaviour.
Conveying Real-Time Ambivalent Feelings through Asymmetric Facial Expressions
Achieving effective facial emotional expressivity within a real-time rendering constraint requires leveraging all possible sources of inspiration, especially observations of real individuals. One such source is the frequent asymmetry of facial expressions of emotion, which allows the expression of complex emotional feelings such as suspicion, a smirk, or an emotion hidden due to social conventions. To achieve this higher degree of facial expressivity, we propose a new model for mapping emotions onto a small set of 1D Facial Part Actions (FPAs) that act on antagonist muscle groups or on individual head-orientation degrees of freedom. The proposed linear model can automatically drive a large number of autonomous virtual humans or support the interactive design of complex facial expressions over time.
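The linear emotion-to-FPA mapping described above can be illustrated with a minimal sketch. All names, weights, and the clamping range here are illustrative assumptions, not the paper's actual parameterization; the sketch only shows the general idea of a linear combination of emotion intensities driving per-side facial part activations, so that asymmetric weights yield asymmetric expressions.

```python
# Hypothetical sketch of a linear emotion-to-FPA mapping (all names and
# weights are assumed for illustration, not taken from the paper).

EMOTIONS = ["joy", "suspicion"]
FPAS = ["brow_left", "brow_right", "mouth_left", "mouth_right"]

# Assumed weight matrix: one row of per-FPA weights per emotion, in [-1, 1].
# Giving the left and right facial parts different weights lets a single
# emotion produce an asymmetric expression such as a smirk.
WEIGHTS = {
    "joy":       [0.3,  0.3, 0.8, 0.8],
    "suspicion": [0.6, -0.2, 0.1, 0.5],
}

def blend(emotion_intensities):
    """Linearly combine emotion intensities (0..1) into per-FPA
    activations, clamped to [-1, 1]."""
    fpa = [0.0] * len(FPAS)
    for emotion, intensity in emotion_intensities.items():
        for i, weight in enumerate(WEIGHTS[emotion]):
            fpa[i] += intensity * weight
    return [max(-1.0, min(1.0, value)) for value in fpa]

# An ambivalent half-joy, half-suspicion blend drives the left and right
# facial parts unequally, producing an asymmetric expression:
print(dict(zip(FPAS, blend({"joy": 0.5, "suspicion": 0.5}))))
```

Because the model is linear, the same weight matrix can be evaluated per frame for many virtual humans at negligible cost, which is consistent with the real-time constraint the abstract emphasizes.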