    Look at Me: Early Gaze Engagement Enhances Corticospinal Excitability During Action Observation

    Direct gaze is a powerful social cue able to capture the onlooker's attention. Besides gaze, head and limb movements can also provide relevant sources of information for social interaction. This study investigated the joint role of direct gaze and hand gestures in onlookers' corticospinal excitability (CE). In two experiments we manipulated the temporal and spatial aspects of observed gaze and hand behavior to assess their role in affecting motor preparation. To do this, transcranial magnetic stimulation (TMS) over the primary motor cortex (M1) coupled with electromyography (EMG) recording was used in both experiments. In the crucial manipulation, we showed participants four video clips of an actor who initially displayed eye contact while starting a social request gesture, and then completed the action while directing his gaze toward an object salient for the interaction. In this way, the observed gaze potentially expressed the intention to interact. Eye-tracking data confirmed that the gaze manipulation was effective in drawing observers' attention to the actor's hand gesture. In an attempt to reveal possible time-locked modulations, we tracked CE at the onset and offset of the request gesture. Neurophysiological results showed an early CE modulation when the actor was about to start the request gesture while looking straight at the participants, compared to when his gaze was averted from the gesture. This effect was time-locked to the kinematics of the actor's arm movement. Overall, data from the two experiments seem to indicate that the joint contribution of direct gaze and early kinematic information, gained while a request gesture is on the verge of beginning, increases the subjective experience of involvement and allows observers to prepare for an appropriate social interaction. Conversely, the separation of gaze cues and body kinematics can have adverse effects on social motor preparation. CE is highly susceptible to biological cues, such as averted gaze, which can automatically capture and divert the observer's attention. This points to the existence of heuristics based on early action and gaze cues that would allow observers to interact appropriately.

    MetaSpace II: Object and full-body tracking for interaction and navigation in social VR

    MetaSpace II (MS2) is a social Virtual Reality (VR) system where multiple users can not only see and hear but also interact with each other, grasp and manipulate objects, walk around in space, and get tactile feedback. MS2 allows walking in physical space by tracking each user's skeleton in real time, and allows users to feel by employing passive haptics, i.e., when users touch or manipulate an object in the virtual world, they simultaneously touch or manipulate a corresponding object in the physical world. To enable these elements in VR, MS2 creates a correspondence in spatial layout and object placement by building the virtual world on top of a 3D scan of the real world. Through this association between the real and virtual worlds, users are able to walk freely while wearing a head-mounted device, avoid obstacles like walls and furniture, and interact with people and objects. Most current VR environments are designed for a single-user experience where interactions with virtual objects are mediated by hand-held input devices or hand gestures. Additionally, users are only shown a representation of their hands in VR, floating in front of the camera as seen from a first-person perspective. We believe that representing each user as a full-body avatar controlled by the natural movements of the person in the real world (see Figure 1d) can greatly enhance believability and a user's sense of immersion in VR.
    Comment: 10 pages, 9 figures. Video: http://living.media.mit.edu/projects/metaspace-ii

    Affording illusions? Natural Information and the Problem of Misperception

    There are two related points at which J.J. Gibson’s ecological theory of visual perception remains remarkably underspecified. Firstly, the notion of information for perception is not explicated in much detail beyond the claim that it “specifies” the environment for perception and, thus being an objective affair, enables an organism to perceive action possibilities or “affordances.” Secondly, misperceptions of affordances and perceptual illusions are not clearly distinguished from each other. Although the first claim seems to suggest that any perceptual illusion amounts to the misperception of affordances, there might be some relevant differences between various ways of getting things wrong. In this essay, Gibson’s notion of “specifying” information shall be reconstructed along the lines of Fred Dretske’s relational theory of information. This refined notion of information for perception will then be used to carve out the distinction between perceptual illusions and the misperception of affordances, with some help from the “Empirical Strategy” (developed by Purves et al.). It will be maintained that there are cases where perceptual illusions actually help an organism to correctly perceive an affordance. In such cases, the prima facie misrendered informational relations involved are kept intact by a set of appropriate transformation rules. Two of Gibson’s intuitions shall thus be preserved: the objectivity of informational relations and the empowerment of the organism as an active perceiver who uses those objective relations to its specific ends.