11 research outputs found

    Extrafoveal capture of attention by emotional scenes: affective valence versus visual saliency

    No full text
    Pairs of emotional (pleasant or unpleasant) and neutral scenes were presented peripherally (5 degrees away from fixation) during a central letter-discrimination task. Selective attentional capture was assessed by means of eye movement orienting, i.e., the probability of first fixating a scene and the time until first fixation. Static and dynamic visual saliency values of the scenes were computationally modelled. Results revealed selective orienting to both pleasant and unpleasant relative to neutral scenes. Importantly, such effects remained in the absence of visual saliency differences, even though saliency influenced eye movements. This suggests that selective attention to emotional scenes is genuinely driven by the processing of affective significance in extrafoveal vision.
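
    The saliency modelling step can be illustrated with a short sketch. This is a minimal example, assuming the scenes are available as image files and using OpenCV's spectral-residual static saliency estimator as a stand-in for whatever model the authors actually employed; the file names and the mean-saliency comparison are hypothetical.

        # Minimal sketch: estimate static visual saliency of an emotional and a
        # neutral scene and compare their mean saliency values. Requires
        # opencv-contrib-python; the spectral-residual model is only a stand-in
        # for the saliency model used in the study.
        import cv2

        def mean_saliency(image_path):
            image = cv2.imread(image_path)                     # scene photograph
            if image is None:
                raise FileNotFoundError(image_path)
            model = cv2.saliency.StaticSaliencySpectralResidual_create()
            ok, saliency_map = model.computeSaliency(image)    # map of values in [0, 1]
            if not ok:
                raise RuntimeError("saliency computation failed for " + image_path)
            return float(saliency_map.mean())

        # Hypothetical file names for one emotional/neutral scene pair.
        emotional = mean_saliency("emotional_scene.jpg")
        neutral = mean_saliency("neutral_scene.jpg")
        print(f"mean saliency  emotional: {emotional:.3f}  neutral: {neutral:.3f}")

    Matching scene pairs on such saliency values is one way to rule out low-level visual differences as the source of attentional capture.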

    Selective orienting to pleasant versus unpleasant visual scenes

    No full text
    We investigated the relative attentional capture by positive versus simultaneously presented negative images in extrafoveal vision for female observers. Pairs of task-irrelevant pleasant and unpleasant visual scenes were displayed peripherally (>= 5 degrees away from fixation) during a task-relevant letter-discrimination task at fixation. Selective attentional orienting was assessed by the probability of first fixating each scene and the time until first fixation. Results revealed a higher first fixation probability and shorter entry times, followed by longer dwell times, for pleasant relative to unpleasant scenes. The attentional capture advantage of pleasant scenes occurred in the absence of differences in perceptual properties. Processing of affective scene significance thus occurs early and automatically, through covert attention in peripheral vision. At least in non-threatening conditions, the attentional system is tuned to initially orient to pleasant images when competing with unpleasant ones.
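
    The orienting measures used here and in the related studies below (probability of first fixation, entry time, dwell time) can be computed per trial once each fixation has been assigned to an area of interest. The sketch below is a hypothetical illustration, assuming fixations arrive as (area, onset_ms, duration_ms) tuples; none of the names are taken from the original studies.

        # Sketch of trial-level gaze measures. Each fixation is assumed to be a
        # tuple of (area_of_interest, onset_ms, duration_ms) relative to scene
        # onset; all names are illustrative.
        def gaze_measures(fixations, target="pleasant"):
            scene_fix = [f for f in fixations if f[0] in ("pleasant", "unpleasant")]
            # Was the first fixation on either scene directed at the target scene?
            first_on_target = bool(scene_fix) and scene_fix[0][0] == target
            # Entry time: onset of the first fixation on the target scene.
            entry_time = next((onset for aoi, onset, _ in fixations if aoi == target), None)
            # Dwell time: summed duration of all fixations on the target scene.
            dwell_time = sum(dur for aoi, _, dur in fixations if aoi == target)
            return first_on_target, entry_time, dwell_time

        trial = [("fixation_cross", 0, 180),
                 ("pleasant", 230, 310),
                 ("unpleasant", 610, 260),
                 ("pleasant", 940, 400)]
        print(gaze_measures(trial))   # (True, 230, 710)

    First-fixation probability is then the proportion of trials on which the first scene fixation lands on a given scene category.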

    The contribution of facial regions to judgements of happiness and trustworthiness from dynamic expressions

    No full text
    What expressive facial features and processing mechanisms make a person look trustworthy, relative to happy? Participants judged the un/happiness or un/trustworthiness of people with dynamic expressions in which the eyes and/or the mouth unfolded from neutral to happy or vice versa. Faces with an unfolding smile looked more trustworthy and happier than faces with a neutral mouth, regardless of the eye expression. Unfolding happy eyes increased both trustworthiness and happiness only in the presence of a congruent unfolding smiling mouth. Nevertheless, the contribution of the mouth was greater for happiness than for trustworthiness; and the mouth was especially visually salient for expressions favouring happiness more than trustworthiness. We conclude that the categorisation of facial happiness is more automatically driven by the visual saliency of a single feature, that is, the smiling mouth, while perception of trustworthiness is more strategic, with the eyes being necessarily incorporated into a configural face representation.

    Visual attention mechanisms in happiness versus trustworthiness processing of facial expressions

    No full text
    A happy facial expression makes a person look (more) trustworthy. Do perceptions of happiness and trustworthiness rely on the same face regions and visual attention processes? In an eye-tracking study, eye movements and fixations were recorded while participants judged the un/happiness or the un/trustworthiness of dynamic facial expressions in which the eyes and/or the mouth unfolded from neutral to happy or vice versa. A smiling mouth and happy eyes enhanced perceived happiness and trustworthiness similarly, with a greater contribution of the smile relative to the eyes. This comparable judgement output for happiness and trustworthiness was reached through shared as well as distinct attentional mechanisms: (a) entry times and (b) initial fixation thresholds for each face region were equivalent for both judgements, thereby revealing the same attentional orienting in happiness and trustworthiness processing. However, (c) greater and (d) longer fixation density for the mouth region in the happiness task, and for the eye region in the trustworthiness task, demonstrated different selective attentional engagement. Relatedly, (e) mean fixation duration across face regions was longer in the trustworthiness task, thus showing increased attentional intensity or processing effort.

    Time course of selective attention to face regions in social anxiety: eye-tracking and computational modelling

    No full text
    We investigated the time course of selective attention to face regions during judgments of dis/approval by low (LSA) and high (HSA) social anxiety undergraduates (with clinical levels on questionnaire measures). The viewers’ gaze direction was assessed, and the visual saliency of face regions was computed, for video-clips displaying dynamic facial expressions. Social anxiety was related to perception of disapproval from faces with an ambiguous smile (i.e., with non-happy eyes), but not from those with congruent happy eyes and a smile. HSA observers selectively looked earlier at the eye region, whereas LSA observers preferentially looked at the smiling mouth. Consistently, gaze allocation was less related to the visual saliency of the smile for HSA than for LSA viewers. The attentional bias towards the less salient eye region, which opposes the automatic capture by the smile, suggests that attention is strategically driven in HSA individuals, possibly aimed at detecting negative evaluators.

    Recognition Thresholds for Static and Dynamic Emotional Faces

    No full text
    We investigated the minimum expressive intensity that is required to recognize (above chance) static and dynamic facial expressions of happiness, sadness, anger, disgust, fear, and surprise. To this end, we varied the degree of intensity of emotional expressions unfolding from a neutral face, by means of graphics morphing software. The resulting face stimuli (photographs and short videos) were presented in an expression categorization task for 1 s each, and measures of sensitivity or discrimination (A') were collected to establish thresholds. A number of physical, perceptual, categorical, and affective controls were performed. All six basic emotions were reliably recognized above chance level from low intensities, although recognition thresholds varied across expressions: 20% of intensity for happiness; 40% for sadness, surprise, anger, and disgust; and 50% for fear. The advantage of happy faces may be due to their greater physical change in facial features relative to neutral faces (as shown by automated facial expression measurement), even at low levels of intensity. Recognition thresholds and the pattern of confusions across expressions were nevertheless equivalent for dynamic and static expressions, although dynamic expressions were recognized more accurately and faster.
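
    The sensitivity index A' referred to here is the standard non-parametric discrimination measure computed from the hit rate H and the false-alarm rate F; for H >= F it is usually written (Grier's formula) as

        A' = \frac{1}{2} + \frac{(H - F)(1 + H - F)}{4H(1 - F)}

    where A' = .5 corresponds to chance-level discrimination and 1.0 to perfect discrimination, so a recognition threshold is the lowest expressive intensity at which A' reliably exceeds .5.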

    Confianza en una sonrisa en función de los cambios en la expresión de los ojos

    No full text
    Background: Trusting other people is necessary for satisfactory and successful social interaction. A person’s perceived trustworthiness is related to perceived facial happiness. We investigated how trustworthy someone with a smiling face looks depending on changes in eye expression. Method: Video-clips of dynamic expressions were presented, with different combinations of the mouth (smiling vs. neutral) and the eyes (happy, neutral, surprised, sad, fearful, disgusted, or angry). Participants judged how happy (happiness task) or trustworthy (trustworthiness task) the expressers were. Results: Both happiness and trustworthiness judgments, and their reaction times, varied as a function of small changes from happy to non-happy eyes in a smiling face, and depending on the specific nature of the eye expression, with angry eyes being particularly detrimental. Conclusions: Perception of facial happiness is more dependent on the smiling mouth, whereas trustworthiness relies more on eye expression. Judgments of untrustworthiness are especially sensitive to incongruence between the eyes and the mouth.

    Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database

    No full text
    Prior research using static facial stimuli (photographs) has identified diagnostic face regions (i.e., functional for recognition) of emotional expressions. In the current study, we aimed to determine attentional orienting, engagement, and the time course of fixation on diagnostic regions. To this end, we assessed the eye movements of observers inspecting dynamic expressions that changed from a neutral to an emotional face. A new stimulus set (KDEF-dyn) was developed, which comprises 240 video-clips of 40 human models portraying six basic emotions (happy, sad, angry, fearful, disgusted, and surprised). For validation purposes, 72 observers categorized the expressions while gaze behavior was measured (probability of first fixation, entry time, gaze duration, and number of fixations). Specific visual scanpath profiles characterized each emotional expression: the eye region was looked at earlier and longer for angry and sad faces; the mouth region, for happy faces; the nose/cheek region, for disgusted faces; and the eye and mouth regions attracted attention in a more balanced manner for surprised and fearful faces. These profiles reflected enhanced selective attention to expression-specific diagnostic face regions. The KDEF-dyn stimuli and the validation data will be available to the scientific community as a useful tool for research on emotional facial expression processing.

    Adaptive attunement of selective covert attention to evolutionary-relevant emotional visual scenes

    No full text
    We investigated selective attention to emotional scenes in peripheral vision, as a function of the adaptive relevance of scene affective content for male and female observers. Pairs of emotional and neutral images appeared peripherally, with perceptual stimulus differences controlled, while viewers were fixating on a different stimulus in central vision. Early selective orienting was assessed by the probability of directing the first fixation towards either scene, and the time until first fixation. Emotional scenes selectively captured covert attention even when they were task irrelevant, thus revealing involuntary, automatic processing. Sex of observers and specific emotional scene content (e.g., male-to-female aggression, families and babies) interactively modulated covert attention, depending on adaptive priorities and goals for each sex, both for pleasant and unpleasant content. The attentional system exhibits domain-specific and sex-specific biases and attunements, probably rooted in evolutionary pressures to enhance reproductive and protective success. Emotional cues selectively capture covert attention based on their bio-social significance.

    Selective gaze direction and interpretation of facial expressions in social anxiety

    No full text
    Fear of negative evaluation is the hallmark of social anxiety. We examined the hypothesis that, to facilitate detection of negative evaluators, an anticipatory coping strategy in social anxiety involves selective early gazing at the eyes of other people. Eye fixations were assessed while participants watched video-clips displaying dynamic facial expressions with prototypical (happy eyes and a smile) or ambiguous (a smile but non-happy eyes) smiling faces. High socially anxious (HSA) undergraduates with clinical levels of anxiety on questionnaire measures and low-anxious controls (LSA) judged the un/trustworthiness of the expressers (Experiment 1) or the un/familiarity of the expressions (Experiment 2). Social anxiety was especially associated with reduced trustworthiness evaluation (interpretative bias) of ambiguous, but not of unambiguous, smiling faces. Further, HSA viewers mistrusted faces with novel, unfamiliar expressions more than LSA viewers did. Thus, the interpretative bias for ambiguous expressions could be due to their being unfamiliar. Importantly, HSA viewers selectively looked earlier at the eye region (attentional bias), whereas LSA viewers preferentially looked at the smiling mouth. Presumably, the early attention to the eyes by HSA individuals enhances detection of expressive incongruences, thus leading to untrustworthiness judgments. These biases are functional, in that they would facilitate recognition of untrustworthy expressers (e.g., with fake smiles).