
    Mind your step: the effects of mobile phone use on gaze behavior in stair climbing

    Stair walking is a hazardous activity and a common cause of fatal and non-fatal falls. Previous studies have assessed the role of eye movements in stair walking by asking people to repeatedly go up and down stairs in quiet and controlled conditions, while the role of peripheral vision was examined by giving participants specific fixation instructions or working memory tasks. We here extend this research to stair walking in a natural environment, with other people present on the stairs and a now common secondary task: using one's mobile phone. Results show that using the mobile phone strongly draws attention away from the stairs, but that the distribution of gaze locations away from the phone is little influenced by phone use. Phone use also increased the time needed to walk the stairs, but handrail use remained low. These results indicate that limited foveal vision suffices for adequate stair walking in normal environments, but that mobile phone use has a strong influence on attention, which may pose problems when unexpected obstacles are encountered.

    Comment on empirical evidence for the design of public lighting

    A recent article (Peña-García et al., 2015) presented conclusions regarding the benefits of road lighting for pedestrians. Here it is demonstrated that those conclusions were drawn from incomplete evidence: in one case because the experimental design leads only to a trivial solution, and in a second case because of an incomplete search of the literature.

    Appraising the intention of other people: Ecological validity and procedures for investigating effects of lighting for pedestrians

    One of the aims of lighting outdoor public spaces such as pathways and subsidiary roads is to help pedestrians evaluate the intentions of other people. This paper discusses how a pedestrian's appraisal of another person's intentions in artificially lit outdoor environments can be studied. We review the visual cues that might be used, and the experimental designs with which effects of changes in lighting could be investigated so as to best resemble the pedestrian experience in artificially lit urban environments. Proposals are made for appropriate operationalisation of the identified visual cues, choice of methods, and measurements representing critical situations. It is concluded that the intentions of other people should be evaluated using facial emotion recognition; eye tracking data suggest a tendency to make these observations at an interpersonal distance of 15 m and for a duration of 500 ms. Photographs are considered suitable for evaluating the effect of changes in light level and spectral power distribution. To support investigation of changes in spatial distribution, further work with 3D targets is needed. Further data are also required to examine the influence of glare.

    Control of attention and gaze in complex environments

    Thesis (Ph.D.), University of Rochester, Dept. of Brain and Cognitive Sciences, 2008.

    Dealing with natural, complex scenes in everyday behavior, where one is surrounded by a variety of potentially relevant stimuli, poses an important problem for our visual system. Given the constraints set by attentional and working memory limitations on acquiring and retaining information, how does our visual system select appropriate information when it is needed, in the context of visually guided behavior? Though an incomplete measure, overt fixations carry much information about the current attentional state and are a revealing indicator of this selection. What controls the allocation of gaze and attention in natural environments? Traditionally, attention was thought to be attracted exogenously by the properties of the stimulus. Studies using 2D experimental displays or scene viewing showed that properties such as contrast or chromatic salience can explain some regularities in fixation patterns. These, however, account for only a modest proportion of the variance. Further, the experimental contexts examined may not reflect the challenges of natural visually guided behavior. The complexity of the environment and of the ongoing behavior makes it necessary to study natural behavior when investigating the control of gaze. Recent work in natural tasks has demonstrated that the observer's cognitive goals play a critical role in the distribution of gaze during ongoing natural behavior. The goal of this thesis is to understand the mechanisms that control the deployment of gaze in natural environments. Though fixation patterns in natural behavior are largely determined by the momentary task, it is not clear how effective top-down control is in dynamic environments, because of the difficulty of dealing with unexpected events.
To address this problem, we studied gaze patterns in both real and virtual walking environments where subjects were occasionally exposed to potentially colliding pedestrians. Our results indicate that potential collisions do not automatically attract attention and are usually detected by active search, rather than by reacting to looming. If, however, a collider is detected, fixations on all pedestrians increase over the subsequent few seconds, indicating that subjects learn the structure and dynamic properties of the world in order to fixate critical regions at the right time. We also investigated whether the addition of another perceptually demanding task interferes with the detection of potential collisions. When subjects walked while also following a leader pedestrian, detection of colliders decreased significantly, indicating that subjects learn how to allocate attention and gaze to satisfy competing demands. In a real environment, we investigated whether manipulating the probability of a potential collision with pedestrians in predetermined roles is accompanied by a corresponding change in gaze allocation. We demonstrated that fixation patterns adjust very quickly to changes in the probabilistic structure of the environment that indicate different priorities for gaze allocation. Based on our results, it appears that observers learn to represent sufficient structure of the visual environment to guide eye movements proactively, in anticipation of events that are likely to occur in the scene. To investigate the importance of behavioral relevance, we compared fixation durations when walkers stopped instead of going on a collision path. Other than a reduction in fixation probabilities of about 20%, the pattern remained the same.
This supports the idea that gaze behavior takes into account the risk (or reward) value of particular information, and it is consistent with reinforcement learning models of gaze as well as with neurophysiological findings on the importance of reward. Finally, we compared performance in real and virtual environments in order to evaluate the validity of the latter. The results from the virtual-reality walking experiment strengthen those from the real-world walking experiment, validating virtual environments as useful paradigms for the study of natural behavior.

    The where, what and when of gaze allocation in the lab and the natural environment

    How do people distribute their visual attention in the natural environment? We and our colleagues have usually addressed this question by showing pictures, photographs or videos of natural scenes under controlled conditions and recording participants' eye movements as they view them. In the present study, we investigated whether people distribute their gaze in the same way when they are immersed and moving in the world as when they view video clips taken from the perspective of a walker. Participants wore a mobile eye tracker while walking to buy a coffee, a trip that required a short walk outdoors through the university campus. They subsequently watched first-person videos of the walk in the lab. Our analysis focused on where people directed their eyes and their head, what objects were gazed at, and when attention-grabbing items were selected. Eye movements were more centralised in the real world, and locations around the horizon were selected with head movements. Other pedestrians, the path, and objects in the distance were looked at often in both the lab and the real world. However, there were some subtle differences in how and when these items were selected; for example, pedestrians close to the walker were fixated more often when viewed on video than in the real world. These results provide a crucial test of the relationship between real behaviour and eye movements measured in the lab.