
    Change blindness: eradication of gestalt strategies

    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task in which there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval, and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial positions of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4) = 2.565, p = 0.185]. This suggests two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) it lends further weight to the argument that objects may be stored in, and retrieved from, a pre-attentional store during this task.

    Function and Dysfunction in Distinct Facets of Empathy

    Empathy is crucial for successful social interactions, and it is impaired in many devastating disorders. Empathy deficits are highly burdensome for affected individuals, caregivers, and significant others, and costly for society as a whole. However, empathy is thought to be a multifaceted construct, including cognitive empathy, affective sharing, and empathic concern components. These constituents may be linked to different behavioural outcomes and neurocognitive substrates, and presentation varies depending on the facets affected. Thus, there is a critical need to determine the behavioural and neurocognitive substrates of different components of empathic responding, and how these are affected in particular disorders. The present work aimed to elucidate the nature of different components of empathy and how they vary as a function of clinical diagnoses and individual differences in subclinical traits, as well as their underlying functional neural mechanisms. Study I used the Multifaceted Empathy Test, a performance-based task tapping cognitive empathy, affective sharing, and empathic concern elicited by realistic emotional images, in patients with behavioural variant frontotemporal dementia (bvFTD). This revealed a global cognitive empathy deficit, deficient affective sharing for negative experiences, and a generalized processing impairment for negative stimuli in bvFTD. In Study II, healthy adults completed the Multifaceted Empathy Test and questionnaire measures of autistic traits, coldhearted psychopathic traits, and trait anxiety. Coldhearted traits were found to disrupt affective sharing and empathic concern, whereas trait anxiety appeared to influence subjective affective experience via generalized arousal. Study III investigated the involvement of action-perception matching (simulation) mechanisms in cognitive versus emotional empathy, using fMRI during cognitive empathy, emotional empathy, and simulation network localizer tasks in healthy adults. Increased activation was observed in identified simulation regions during emotional versus cognitive empathy, providing evidence for greater involvement of simulation mechanisms in emotional empathy. Taken together, this work suggests that cognitive empathy and emotional empathy (including affective sharing and empathic concern) represent aspects of empathy that are distinguishable and differentially linked with certain patient populations, subclinical traits, and neurocognitive mechanisms. These findings are discussed with respect to the nature and conceptualization of empathy and its components, as well as implications for disorders featuring empathy dysfunction.

    Facial Expression Recognition


    Emotion Regulation and Threat Estimation as Mediators of the Relation between Cognitive Functioning and Anxiety in Late Life

    Background: Rates of anxiety are generally thought to decline in typically aging older adults. Some theorize that this decline is a result of age-related improvements in emotion regulation. Emotion regulation may require the use of complex cognitive processes, however, which can be impacted by cognitive decline. Indeed, the prevalence of anxiety is high among older adults with cognitive impairment. The current study examined emotion regulation and threat perception as possible mediators of the relation between cognitive functioning and anxiety. Methods: One hundred adults, aged 60 and older, were recruited from nursing homes, assisted living facilities, and the community. All were asked to complete a cognitive screening measure, along with measures of anxiety, emotion regulation, and threat perception. The relations between these variables were examined. Results: Though cognitive impairment predicted anxiety level, neither emotion regulation nor threat perception mediated the relation. Conclusions: The data suggest that emotion regulation and threat perception may rely on automatic processing, rather than effortful cognitive processing.

    The effect of emotion intensity on time perception: a study with transcranial random noise stimulation

    Emotional facial expressions provide cues for social interactions, and emotional events can distort our sense of time. The present study investigates the effect of facial expressions of anger and sadness on time perception. Moreover, to investigate the causal role of the orbitofrontal cortex (OFC) in emotion recognition, we applied transcranial random noise stimulation (tRNS) over the OFC and tested its effect on participants' emotion recognition as well as on time processing. Participants performed a timing task in which they were asked to categorize as "short" or "long" temporal intervals marked by images of angry, sad, or neutral facial expressions. In addition, they were asked to judge whether the person in the image was expressing anger or sadness. The visual stimuli were facial expressions of anger or sadness at high (80%), medium (60%), and low (40%) intensity, along with neutral faces. In the emotion recognition task, participants were faster and more accurate when emotional intensity was higher. Moreover, tRNS over the OFC interfered with emotion recognition, in line with its proposed role in this process. In the timing task, participants overestimated the duration of angry facial expressions, although neither emotional intensity nor OFC stimulation significantly modulated this effect. Conversely, as emotional intensity increased, participants exhibited a greater tendency to overestimate the duration of sad faces in the sham condition; this tendency disappeared with tRNS. Taken together, our results are partially consistent with previous findings of an overestimation effect for emotionally arousing stimuli and reveal an involvement of the OFC in emotional distortions of time, which needs further investigation.

    KEER2022

    Pre-title: KEER2022. Diversities. Resource description: 25 July 202

    Audio-visual deep learning regression of apparent personality

    Bachelor's thesis, Computer Engineering degree, Facultat de Matemàtiques, Universitat de Barcelona, Year: 2020, Supervisors: Sergio Escalera Guerrero, Cristina Palmero Cantariño and Julio Jacques Junior. [en] Personality perception is based on the relationship between a human being and the individuals in their surroundings. This kind of perception allows conclusions to be drawn from the analysis and interpretation of observable cues, mainly facial expressions, tone of voice, and other nonverbal signals, supporting the construction of an apparent personality (or first impression) of people. Apparent personalities (or first impressions) are subjective, and subjectivity is an inherent property of perception, based exclusively on the point of view of each individual. In this project, we approximate such subjectivity using a multi-modal deep neural network with audiovisual signals as input and a late-fusion strategy for handcrafted features, achieving accurate results. The aim of this work is to analyse the influence on automatic prediction of apparent personality (based on the Big-Five model) of the following characteristics: raw audio, visual information (sequences of face images), and high-level features including Ekman's universal basic emotions, gender, and age. To this end, we defined different modalities, performed combinations of them, and determined how much each contributes to the regression of apparent personality traits.
The most remarkable results obtained through the experiments are as follows: in all modalities, females have a higher average accuracy than males, except in the audio-only modality; for the happy emotion, the best accuracy is found for the Conscientiousness trait; the Extraversion and Conscientiousness traits obtain the highest accuracy scores for almost all emotions; visual information is the modality that most positively influences the results; and the chosen combination of high-level features slightly improves prediction accuracy.
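The late-fusion approach described above can be sketched in a few lines: per-modality feature vectors are extracted independently, concatenated, and passed to a shared regression head that outputs one score per Big-Five trait. This is a minimal NumPy sketch of the idea only; the feature dimensions, the random stand-in features, and the untrained linear head are all assumptions for illustration, not the thesis's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality embeddings for one video clip. In a real system
# these would be the outputs of learned audio/visual subnetworks and a
# hand-crafted descriptor (emotion, gender, age); sizes here are assumptions.
audio_feat = rng.standard_normal(64)    # raw-audio branch output
visual_feat = rng.standard_normal(128)  # face-sequence branch output
high_level = rng.standard_normal(10)    # high-level feature descriptor

# Late fusion: concatenate the modality embeddings into one vector...
fused = np.concatenate([audio_feat, visual_feat, high_level])

# ...then regress the five apparent-personality traits with a linear head
# (stand-in for a trained fusion layer).
W = rng.standard_normal((5, fused.size)) * 0.01
b = np.zeros(5)
traits = W @ fused + b  # one score per Big-Five trait

print(traits.shape)  # (5,)
```

Dropping one modality from the `np.concatenate` call (and shrinking `W` accordingly) gives the single- and paired-modality configurations whose contributions the thesis compares.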

    Facial expression of pain: an evolutionary account.

    This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arise from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery, and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain, present from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over amplitude is incomplete, and observers can better detect pain that the individual attempts to suppress rather than amplify or simulate. In many clinical and experimental settings, the facial expression of pain is incorporated with verbal and nonverbal vocal activity, posture, and movement in an overall category of pain behaviour. This is assumed by clinicians to be under operant control of social contingencies such as sympathy, caregiving, and practical help; thus, strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgments of malingering, and sometimes the withholding of caregiving and help. To the extent that pain expression is influenced by environmental contingencies, however, "amplification" could equally plausibly constitute the release of suppression according to evolved contingent propensities that guide behaviour. Pain has been largely neglected in the evolutionary literature and the literature on expression of emotion, but an evolutionary account can generate improved assessment of pain and reactions to it.