
    Social Context and Tool Use Can Modulate Interpersonal Comfort Space

    Recent research has investigated whether the representation of the space around the body, in terms of reach–action space (imagining reaching another person) and comfort–social space (tolerance of the other’s proximity), reflects a shared sensorimotor basis. Some studies exploiting the motor plasticity induced by tool use have not observed such sensorimotor identity (i.e., the same mechanisms that, based on sensory information, underlie the representation of proximal space in terms of action possibilities, goal-directed motor actions, and anticipation of their sensorimotor consequences), whereas evidence to the contrary has also emerged. Since the data are not fully convergent, here we asked whether combining tool-use-induced motor plasticity with the processing of the social context would modulate both spaces similarly. To this end, we conducted a randomized controlled trial with three groups of participants (N = 62) in which reaching and comfort distances were measured in Pre- and Post-tool-use sessions. The tool-use sessions were conducted under different conditions: (i) in the presence of a social stimulus defining the social context (Tool plus Mannequin group); (ii) without any stimulus (Only Tool group); (iii) in the presence of a box (Tool plus Object group), as a control condition. Results showed an extension of comfort distance in the Post-tool-use session of the Tool plus Mannequin group compared with the other conditions. Conversely, reaching distance was larger after tool use than in the Pre-tool-use session, independently of the experimental condition. Our findings suggest that motor plasticity affects reaching and comfort spaces to different degrees: while reaching space is markedly sensitive to motor plasticity, comfort space requires qualifying information about the social context.

    Interaction with Virtual Humans and Effect of Emotional Expressions: Anger Matters!

    Today we are experiencing a hybrid real-virtual society in which interaction with virtual humans is normal and “quasi-social”. Understanding how we react to interactions with virtual agents, and the impact of emotions on social dynamics in the virtual world, is fundamental. Therefore, in this study we investigated the implicit effect of emotional information by adopting a perceptual discrimination task. Specifically, we devised a task that explicitly required perceptual discrimination of a target while involving distance regulation in the presence of happy, neutral, or angry virtual agents. In two Immersive Virtual Reality experiments, participants were instructed to discriminate a target on the virtual agents’ t-shirts, and they provided their response by stopping the virtual agents (or themselves) at the distance at which they could identify the target. Thus, facial expressions were completely irrelevant to the perceptual task. The results showed that the perceptual discrimination required a longer response time when the t-shirts were worn by angry rather than happy or neutral virtual agents. This suggests that angry faces interfered with the explicit perceptual task people had to perform. From a theoretical standpoint, this anger-superiority effect could reflect an ancestral fear/avoidance mechanism that prompts automatic defensive reactions and bypasses other cognitive processes.

    The influence of emotions on spatial cognition: from route learning to egocentric and allocentric reference systems

    Background: Emotions may influence how individuals represent spatial information. However, some studies show that positive emotions, but not negative emotions, improve spatial memory, while others show the opposite or no significant effect. Aim: To clarify the relationship between emotions and spatial cognition. Method: Study 1: 150 university students (mean age: 20.50) learned a route containing emotional images (from the IAPS) and then had to remember spatial information about the landmarks and the route. Study 2: 120 university students (mean age: 21.72) memorized the locations of triads of 3D geometric stimuli. Crucially, 40 participants viewed emotional images (from the IAPS) before encoding, 40 during retention, and 40 before retrieval of the spatial information. In all conditions, participants judged the distance of a stimulus relative to themselves (egocentric judgment) or relative to another stimulus (allocentric judgment) (i.e., the Ego-Allo task). Study 3: 40 university students (mean age: 20.80) performed the Ego-Allo task with triads of emotional images (from the IAPS). Study 4: 40 university students (mean age: 21.45) performed the Ego-Allo task with geometric stimuli, but in rooms that induced positive or negative moods. Results: Participants were more accurate in spatial judgments when stimuli and rooms had positive rather than negative valence. However, depending on the processing stage of the neutral stimuli, positive and negative images affected egocentric and allocentric spatial judgments differently. Conclusion: How emotions affect spatial cognition depends on the type of object (neutral vs. emotional), the role of the emotional event (distractor vs. target), and the environment in which spatial encoding occurs (pleasant vs. unpleasant).

    The Effect of Facial Expressions on Interpersonal Space: A Gender Study in Immersive Virtual Reality

    In proxemics, interpersonal space is the optimal social distance between individuals. Evidence has shown that emotional facial expressions and gender-related effects can modulate this distance during social interactions. Typically, this distance increases in threatening situations and decreases in safe ones. Moreover, male dyads maintain larger distances than female dyads, whereas the findings on mixed-sex dyads are still unclear. Virtual Reality (VR) technologies are increasingly used in many areas of everyday life and, in the scientific field, for studying social phenomena. This raises the question of how closely VR simulations resemble the actual phenomena, i.e., their ecological validity and their effectiveness for applied purposes. In order to clarify gender-related and emotion-related effects and the ecological validity of VR, we investigated whether real females and males modulated their interpersonal distance differently while male and female virtual confederates with happy, angry, and neutral faces approached them. Results showed that participants preferred larger distances from both male and female virtual confederates showing an angry face rather than a neutral or happy one. Moreover, males preferred a shorter distance, particularly when facing smiling virtual females, while females preferred a larger distance from angry virtual males. These results suggest that gender differences can affect the impact of emotional facial expressions on the modulation of interpersonal space. Moreover, they confirm previous behavioural studies and add further support to the ecological validity of immersive VR (IVR) simulations, and thus to their potential effectiveness as a tool for research and applied interventions.