
    On reading texts


    Between college and work in the Further Education and Training College sector

Students studying Civil Engineering (CE) at Further Education and Training (FET) colleges spend periods of time in the classroom and workshop as well as in the workplace during experiential learning. The overall purpose of education and training in the college sector is generally understood as preparing students for employability, and the difficulties colleges face in performing this role are well known. In this article, these difficulties are examined in a novel way. The everyday perspectives of lecturers and supervisors about student learning in their college programmes and their work experience are translated into more theoretical language, using activity theory. A theoretical argument is made which suggests that different sites of learning create different purposes, and that these different purposes derive from a distinction between knowledge and practice, which in turn has historical roots. The study concludes by suggesting that a new, common object of integrating theory and practice at all the sites would better link the college and workplace education and training systems, and tentatively suggests how this new object could be put into practice. Key words: activity theory; civil engineering; further education and training; theory and practice

    Auditory and visual capture during focused visual attention

It is well known that auditory and visual onsets presented at a particular location can capture a person’s visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies have not differentiated between capture by onsets presented at a nontarget (invalid) location and possible performance benefits occurring when the target location is (validly) cued. In this study, the authors modulated the degree of attentional focus by presenting endogenous cues with varying reliability and by displaying placeholders indicating the precise areas where the target stimuli could occur. By using not only valid and invalid exogenous cues but also neutral cues that provide temporal but no spatial information, they found performance benefits as well as costs when attention was not strongly focused. These benefits disappeared when the attentional focus was increased. These results indicate that there is bottom-up capture of visual attention by irrelevant auditory and visual stimuli that cannot be suppressed by top-down attentional control.

    Publications received by the regional editor (from Jan 2010 to Dec 2011)

    Publications in Indian Studies

    Priming T2 in a visual and auditory attentional blink task

Participants performed an attentional blink (AB) task with digits as targets and letters as distractors in the visual and auditory domains. Prior to the rapid serial visual presentation, a visual or auditory prime was presented in the form of a digit that was identical to the second target (T2) on 50% of the trials. In addition to the "classic" AB effect, an overall drop in performance on T2 was observed on trials in which the stream was preceded by an identical prime from the same modality. No cross-modal priming was evident, suggesting that the observed inhibitory priming effects are modality specific. We argue that the present findings represent a special type of negative priming operating at a low feature level.

    Pip and pop: Nonspatial auditory signals improve spatial visual search

Searching for an object within a cluttered, continuously changing environment can be a very time-consuming process. The authors show that a simple auditory pip drastically decreases search times for a synchronized visual object that is normally very difficult to find. This effect occurs even though the pip contains no information on the location or identity of the visual object. The experiments also show that the effect is not due to general alerting (because it does not occur with visual cues), nor is it due to top-down cuing of the visual change (because it still occurs when the pip is synchronized with distractors on the majority of trials). Instead, we propose that the temporal information of the auditory signal is integrated with the visual signal, generating a relatively salient emergent feature that automatically draws attention. Phenomenally, the synchronous pip makes the visual object pop out from its complex environment, providing a direct demonstration of spatially nonspecific sounds affecting competition in spatial visual processing. Keywords: attention, visual search, multisensory integration, audition, vision