
    Cue-target contingencies modulate voluntary orienting of spatial attention: dissociable effects for speed and accuracy

    Voluntary orienting of spatial attention is typically investigated with visually presented directional cues, which are called predictive when they indicate where the target is more likely to appear. In this study we investigated the nature of the potential link between cue predictivity (the proportion of valid trials) and the strength of the resulting covert orienting of attention. Participants judged the orientation of a unilateral Gabor grating preceded by a centrally presented, non-directional colour cue that arbitrarily prompted a leftward or rightward shift of attention. Unknown to the participants, cue predictivity was manipulated across blocks, such that the cue was predictive for only the first or the second half of the experiment. Our results show that cueing effects were strongly influenced by the change in predictivity, and this influence emerged differently in response speed and accuracy. The speed difference between valid and invalid trials was significantly larger when cues were predictive, and the amplitude of this effect was modulated at the single-trial level by recent trial history. Complementing these findings, accuracy revealed a robust effect of block history and a different time course compared to speed, as if it mainly mirrored voluntary processes. These findings, obtained with a new manipulation and with arbitrary non-directional cueing, demonstrate that cue-target contingencies strongly modulate the way attention is deployed in space.
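    The cueing (validity) effect described here is simply the invalid-minus-valid difference, computed separately for speed and accuracy. A minimal sketch with hypothetical trial records (the field names and numbers are illustrative, not from the study):

```python
def cueing_effects(trials):
    """Return (RT effect in ms, accuracy effect): invalid minus valid for RT,
    valid minus invalid for accuracy, so both are positive when cueing helps."""
    valid = [t for t in trials if t["valid"]]
    invalid = [t for t in trials if not t["valid"]]
    mean = lambda xs: sum(xs) / len(xs)
    # RT is conventionally computed on correct trials only.
    rt_effect = (mean([t["rt"] for t in invalid if t["correct"]])
                 - mean([t["rt"] for t in valid if t["correct"]]))
    acc_effect = (mean([t["correct"] for t in valid])
                  - mean([t["correct"] for t in invalid]))
    return rt_effect, acc_effect

# Toy data: valid trials are faster and more accurate.
trials = (
    [{"valid": True, "rt": 400, "correct": True}] * 8
    + [{"valid": True, "rt": 450, "correct": False}] * 2
    + [{"valid": False, "rt": 470, "correct": True}] * 7
    + [{"valid": False, "rt": 500, "correct": False}] * 3
)
rt_effect, acc_effect = cueing_effects(trials)  # 70.0 ms, 0.1
```

    Tracking these two quantities separately, block by block, is what allows the dissociation between speed and accuracy reported in the abstract.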

    Differential Effects in Bimodal Directional Stroop Interference

    The directional Stroop task (e.g., Cannon, 1998) creates interference between a directional word and a directional cue, such as an arrow. This study was conducted to replicate directional Stroop interference using bimodal stimulus pairs and to determine whether interference occurs when the word is replaced with a sound. In Experiment 1, an arrow pointing up or down was paired with a directional word (UP or DOWN). Subjects were faster at responding to the direction of the arrow when the pairs were congruent than when they were incongruent, indicating interference. In Experiment 2, the visual word was replaced with a spoken word. Incongruent trials produced longer RTs, but there was no statistical difference between conditions. In Experiment 3, the auditory word was replaced with the sound of a slide whistle going either up or down. Although response times were longer for incongruent pairs and the effect size was moderate, there was no significant interference between the arrow and a direction-related sound. Experiment 4 used the same design as Experiment 3, but subjects responded to the direction of the sound instead of the arrow. Performance across conditions was virtually identical, indicating that the visual directional cue (i.e., the arrow) had no impact on identifying the direction of the sound. Together, the results replicate previous research with a visual directional task but do not extend these findings to auditory-visual cross-modal tasks. The initial results from Experiments 3 and 4, however, suggest that auditory cues may influence visual directional cues, whereas visual cues do not influence auditory directional cues.

    Normal and impaired reflexive orienting of attention after central nonpredictive cues

    Recent studies suggest that stimuli with directional meaning can trigger lateral shifts of visuospatial attention when centrally presented as noninformative cues. We investigated covert orienting in healthy participants and in a group of 17 right brain-damaged patients (9 with hemispatial neglect), comparing arrows, eye gaze, and digits as central nonpredictive cues in a detection task. Orienting effects elicited by arrows and eye gaze were overall consistent in healthy participants and in right brain-damaged patients, whereas digit cues were ineffective. Moreover, patients with neglect showed, at the shortest delay between cue and target, a disengage deficit for arrow cueing whose magnitude was predicted by neglect severity. We conclude that the peculiar form of attentional orienting triggered by the directional meaning of arrow cues presents some features previously thought to characterize only the stimulus-driven (exogenous) orienting to noninformative peripheral cues.

    Domain general learning: Infants use social and non-social cues when learning object statistics.

    Previous research has shown that infants can learn from social cues. But is a social cue more effective at directing learning than a non-social cue? This study investigated whether 9-month-old infants (N = 55) could learn a visual statistical regularity in the presence of a distracting visual sequence when attention was directed by either a social cue (a person) or a non-social cue (a rectangle). The results show that both social and non-social cues can guide infants' attention to a visual shape sequence (and away from a distracting sequence). The social cue directed attention more effectively than the non-social cue during the familiarization phase, but it did not result in significantly stronger learning. The findings suggest that domain-general attention mechanisms allow for the comparable learning seen in both conditions.

    Formation of visual memories controlled by gamma power phase-locked to alpha oscillations

    Neuronal oscillations provide a window for understanding the brain dynamics that organize the flow of information from sensory to memory areas. While it has been suggested that gamma power reflects feedforward processing and alpha oscillations reflect feedback control, it remains unknown how these oscillations dynamically interact. Magnetoencephalography (MEG) data were acquired from healthy subjects who were cued either to remember or not to remember presented pictures. Our analysis revealed that, in anticipation of a picture to be remembered, alpha power decreased while the cross-frequency coupling between gamma power and alpha phase increased. A measure of directionality between alpha phase and gamma power predicted individual ability to encode memory: stronger control of alpha phase over gamma power was associated with better memory. These findings demonstrate that encoding of visual information is reflected by a state determined by the interaction between alpha and gamma activity.
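    The phase-amplitude coupling reported here can be quantified in several ways; one common choice (not necessarily the pipeline used in this study) is the mean-vector-length modulation index: filter into the two bands, take the alpha phase and gamma amplitude via the Hilbert transform, and measure how strongly amplitude clusters at a preferred phase. A self-contained sketch on a synthetic signal:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter between lo and hi Hz."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_amplitude_coupling(x, fs, phase_band=(8, 12), amp_band=(40, 90)):
    """Mean-vector-length coupling between alpha phase and gamma amplitude,
    normalized by total amplitude so the result lies in [0, 1]."""
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(x, *amp_band, fs)))
    return float(np.abs(np.sum(amp * np.exp(1j * phase))) / np.sum(amp))

# Synthetic 10 s signal: 60 Hz gamma whose amplitude is locked to the
# 10 Hz alpha trough (coupled) versus constant-amplitude gamma (uncoupled).
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
alpha = np.sin(2 * np.pi * 10 * t)
gamma = np.sin(2 * np.pi * 60 * t)
coupled = alpha + 0.4 * (1 - alpha) * gamma + 0.1 * rng.standard_normal(t.size)
uncoupled = alpha + 0.4 * gamma + 0.1 * rng.standard_normal(t.size)
```

    On this synthetic data the coupled signal yields a clearly larger index than the uncoupled one, which is the kind of contrast the study tracks between conditions.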

    Robot Navigation in Unseen Spaces using an Abstract Map

    Human navigation in built environments depends on symbolic spatial information which has unrealised potential to enhance robot navigation capabilities. Information sources such as labels, signs, maps, planners, spoken directions, and navigational gestures communicate a wealth of spatial information to the navigators of built environments, yet robots typically ignore this information. We present a robot navigation system that uses the same symbolic spatial information employed by humans to purposefully navigate in unseen built environments with a level of performance comparable to humans. The navigation system uses a novel data structure called the abstract map to imagine malleable spatial models for unseen spaces from spatial symbols. Sensorimotor perceptions from a robot are then employed to provide purposeful navigation to symbolic goal locations in the unseen environment. We show how a dynamic system can be used to create malleable spatial models for the abstract map, and provide an open source implementation to encourage future work in the area of symbolic navigation. Symbolic navigation performance of humans and a robot is evaluated in a real-world built environment. The paper concludes with a qualitative analysis of human navigation strategies, providing further insights into how the symbolic navigation capabilities of robots in unseen built environments can be improved in the future. Comment: 15 pages, published in IEEE Transactions on Cognitive and Developmental Systems (http://doi.org/10.1109/TCDS.2020.2993855); see https://btalb.github.io/abstract_map/ for access to software.
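    One generic way to realise a "malleable spatial model" driven by a dynamic system, sketched here purely for illustration (the paper's actual formulation may differ), is to treat symbolic places as point masses and each symbolic hint as a damped spring with a rest length, then relax the system to a consistent layout. All place names, rest lengths, and constants below are hypothetical:

```python
import numpy as np

def relax(positions, springs, steps=2000, dt=0.01, damping=0.9):
    """Damped spring relaxation over 2-D point masses.
    springs: list of (node_a, node_b, rest_length, stiffness)."""
    pos = {k: np.asarray(v, dtype=float) for k, v in positions.items()}
    vel = {k: np.zeros(2) for k in pos}
    for _ in range(steps):
        force = {k: np.zeros(2) for k in pos}
        for a, b, rest, stiffness in springs:
            d = pos[b] - pos[a]
            dist = np.linalg.norm(d) + 1e-9
            f = stiffness * (dist - rest) * (d / dist)  # pulls together if stretched
            force[a] += f
            force[b] -= f
        for k in pos:
            vel[k] = damping * (vel[k] + dt * force[k])
            pos[k] = pos[k] + dt * vel[k]
    return pos

# Hypothetical symbolic hints: both rooms are "near" the lobby (rest length 1).
places = {"lobby": (0.0, 0.0), "room_a": (3.0, 0.0), "room_b": (0.0, 4.0)}
springs = [("lobby", "room_a", 1.0, 5.0), ("lobby", "room_b", 1.0, 5.0)]
layout = relax(places, springs)
```

    The appeal of such a model is its malleability: as new signs or observations arrive, they become additional springs and the layout deforms smoothly rather than being recomputed from scratch.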

    Object-guided Spatial Attention in Touch: Holding the Same Object with Both Hands Delays Attentional Selection

    Previous research has shown that attention to a specific location on a uniform visual object spreads throughout the entire object. Here we demonstrate that, similar to the visual system, spatial attention in touch can be object guided. We measured event-related brain potentials to tactile stimuli arising from objects held by observers' hands, when the hands were placed either near each other or far apart holding two separate objects, or when they were far apart but holding a common object. Observers covertly oriented their attention to the left, to the right, or to both hands, following bilaterally presented tactile cues indicating likely tactile target location(s). Attentional modulations for tactile stimuli at attended compared to unattended locations were present in the time range of early somatosensory components only when the hands were far apart, but not when they were near. This was found to reflect enhanced somatosensory processing at attended locations rather than suppressed processing at unattended locations. Crucially, holding a common object with both hands delayed attentional selection, similar to when the hands were near. This shows that the proprioceptive distance effect on tactile attentional selection arises when distant event locations can be treated as separate and unconnected sources of tactile stimulation, but not when they form part of the same object. These findings suggest that, similar to visual attention, both space- and object-based attentional mechanisms can operate when we select between tactile events on our body surface.

    Directional Sensitivity of Echolocation System in Bats Producing Frequency-Modulated Signals

    1. Radiation patterns of the 55, 75 and 95 kHz components in frequency-modulated sounds emitted by the grey bat (Myotis grisescens) were studied. FM sounds similar to species-specific orientation sounds were elicited by electrical stimuli applied to the midbrain while the head of the animal was immobilized by a nail cemented to its skull. The main beam was emitted 5-10° below the eye-nostril line. The radiation angle at one half of maximum amplitude was 38° lateral, 18° up and 50° down at 55 kHz; 34° lateral, 8° up and 32° down at 75 kHz; and 30° lateral, 5° up and 25° down at 95 kHz. At 95 kHz, two prominent side lobes were present. 2. The directional sensitivity of the auditory system (DSA), measured in terms of the potential evoked in the lateral lemniscus, was studied in the grey bat (M. grisescens) and the little brown bat (M. lucifugus). The maximally sensitive direction moved toward the median plane as frequency increased from 35 to 95 kHz. The slope of the DSA curve increased from 0.3 to 0.6 dB/degree with frequency. 3. The directional sensitivity of the echolocation system (DSE) was calculated from both the DSA curve and the radiation pattern of the emitted sound. The maximally sensitive direction of the echolocation system was 15° lateral to the median plane at 55 kHz and 2.5° lateral at 95 kHz. The slope of the DSE curve increased from 0.6 to 1.0 dB/degree with frequency. Thus, the higher the frequency of sound, the sharper the directional sensitivity of the echolocation system. 4. The interaural pressure difference (IPD), which appeared to be the essential cue for echolocation in Myotis, changed linearly with azimuth angle from 0° to 30° lateral regardless of the frequency of sound, at respective rates of 0.4, 0.7, 0.3 and 0.4 dB/degree for 35, 55, 75 and 95 kHz sounds. Beyond 30°, the change in IPD differed considerably with frequency. For 75 and 95 kHz sounds, the IPD stayed nearly the same between 30° and 90°.
Thus, the 75 and 95 kHz components in FM orientation sounds were not superior to the 35 and 55 kHz components in terms of the IPD cue for echolocation. 5. Assuming the just-detectable IPD and interaural time difference (ITD) to be 0.5 dB and 5 µs respectively, as in man, the just-detectable azimuth difference of Myotis around the median plane would be 0.7-1.7° with the IPD cue and 11° with the ITD cue.
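    The 0.7-1.7° figure in point 5 follows directly from dividing the assumed 0.5 dB IPD detection threshold by the IPD slopes listed in point 4. A quick check of that arithmetic:

```python
ipd_threshold_db = 0.5  # assumed just-detectable IPD, "as in man" (point 5)

# IPD slope near the median plane, in dB/degree per frequency (point 4).
ipd_slope_db_per_deg = {35: 0.4, 55: 0.7, 75: 0.3, 95: 0.4}

# Just-detectable azimuth difference = detection threshold / slope.
azimuth_deg = {f: ipd_threshold_db / s for f, s in ipd_slope_db_per_deg.items()}
# Best case 0.5/0.7 ≈ 0.7° (55 kHz); worst case 0.5/0.3 ≈ 1.7° (75 kHz),
# reproducing the 0.7-1.7° range quoted in the abstract.
```

    The steeper the IPD-versus-azimuth slope, the smaller the angular change needed to produce a detectable pressure difference, which is why 55 kHz gives the finest acuity here.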

    The role of perspective taking on attention: a review of the special issue on the Reflexive Attentional Shift Phenomenon

    Attention is a process that alters how cognitive resources are allocated, allowing individuals to efficiently process information at the attended location. The presence of visual or auditory cues in the environment can direct the focus of attention towards certain stimuli, even if the cued stimuli are not the individual’s primary target. Samson et al. [1] demonstrated that seeing another person in the scene (i.e. a person-like cue) caused a delay in responding to target stimuli not visible to that person: “altercentric intrusion”. This phenomenon, they argue, depends on the fact that the cue used resembled a person as opposed to a more generic directional indicator. The characteristics of the cue are at the core of the debate in this special issue. Some maintain that the perceptual-directional characteristics of the cue are sufficient to generate the bias, whilst others argue that the cueing is stronger when the cue has social characteristics (relates to what another individual can perceive). The research contained in this issue confirms that human attention is biased by the presence of a directional cue. We discuss and compare the different studies. The pattern that emerges suggests that the social relevance of the cue is necessary in some contexts but not in others, depending on the cognitive demand of the experimental task. One possibility is that social mechanisms are involved in perspective taking when the task is cognitively demanding, whilst they may not play a role in automatic attention allocation.