5 research outputs found

    Towards a Driving Training System to Support Cognitive Flexibility

    Driving under unfamiliar conditions, such as an unfamiliar traffic system or vehicle configuration during an overseas holiday, can cause fatalities, injuries, or property damage. In such cases, a driver must apply prior knowledge to a new driving situation in order to drive safely; this ability is called cognitive flexibility. Prior research has found that left-/mixed-handed people show superior cognitive flexibility compared with right-handed people in tasks requiring this ability. This paper explores the relationships among cognitive flexibility, handedness, and the types of errors drivers make, specifically at roundabouts and intersections under unfamiliar driving conditions. We conducted an experiment using a right-hand-drive simulator with a left-hand simulated traffic scenario to collect data related to driving at roundabouts and intersections; no participant was familiar with this condition. We found that left-/mixed-handed drivers show significantly superior cognitive flexibility at a turn-left roundabout and intersection, and that they make significantly fewer errors than right-handed drivers when entering the roundabout and approaching the intersection.

    Designing a user-defined gesture vocabulary for an in-vehicle climate control system

    Hand gestures are a suitable medium for in-vehicle interfaces: they are intuitive and natural to perform, and less visually demanding while driving. This paper analyses human gestures to define a preliminary gesture vocabulary for in-vehicle climate control using a driving simulator. We conducted a user-elicitation experiment with 22 participants performing two driving scenarios with different levels of cognitive load; the participants were filmed while performing natural gestures for manipulating the air conditioning inside the vehicle. Comparisons are drawn between the proposed approach, which defines a vocabulary of 9 new gestures (GestDrive), and previously suggested methods. The outcomes demonstrate that GestDrive successfully describes the employed gestures in detail.

    Using a Bayesian Framework to Develop 3D Gestural Input Systems Based on Expertise and Exposure in Anesthesia

    Interactions with a keyboard and mouse fall short of human capabilities, and what is lacking in the technological revolution is a surge of new and natural ways of interacting with computers. In-air gestures are a promising input modality: they are expressive, easy and quick to use, and natural for users. It is known that gestural systems should be developed within a particular context, as gesture choice depends on the context; however, there is little research investigating other individual factors that may influence gesture choice, such as expertise and exposure. Anesthesia providers' hands have been linked to bacterial transmission; therefore, this research investigates the context of gestural technology for anesthetic tasks. The objective of this research is to understand how expertise and exposure influence gestural behavior, and to develop Bayesian statistical models that can accurately predict how users would choose intuitive gestures in anesthesia based on expertise and exposure. Expertise and exposure may influence gesture responses, yet there is limited to no work investigating how these factors influence intuitive gesture choice or how to use this information to predict intuitive gestures for system design. If researchers can capture users' gesture variability within a particular context based on expertise and exposure, then statistical models can be developed to predict how users may gesturally respond to a computer system, and those predictions can be used to design a gestural system that anticipates a user's response and thus affords intuitiveness to multiple user groups. This allows designers to understand the end user more completely and to implement intuitive gesture systems based on expected natural responses.
Ultimately, this dissertation investigates the human-factors challenges associated with gestural system development within a specific context and offers statistical approaches to understanding and predicting human behavior in a gestural system. Two experimental studies and two Bayesian analyses were completed. The first experimental study investigated the effect of expertise within the context of anesthesiology. Its main finding was that domain expertise is influential when developing 3D gestural systems, as novices and experts differ both in their intuitive gesture-function mappings and in their reaction times to generate an intuitive mapping. The second study investigated the effect of exposure for controlling a computer-based presentation and found a learning effect of gestural control: participants were significantly faster at generating intuitive mappings as they gained exposure to the system. The two Bayesian analyses took the form of Bayesian multinomial logistic regression models in which intuitive gesture choice was predicted from the contextual task and either expertise or exposure. These analyses generated posterior predictive probabilities for all combinations of task, expertise level, and exposure level, and showed that gesture choice can be predicted to some degree. This work provides further insights into how 3D gestural input systems should be designed and how Bayesian statistics can be used to model human behavior.
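The idea of turning elicited gesture counts into posterior predictive probabilities per user group can be illustrated with a deliberately simplified sketch. The snippet below is not the dissertation's Bayesian multinomial logistic regression; it uses a conjugate Dirichlet-multinomial model instead, and the gesture labels and elicitation data are hypothetical, invented for illustration only.

```python
# Simplified sketch: posterior predictive gesture-choice probabilities per
# (task, expertise) group via a conjugate Dirichlet-multinomial model.
# Assumption: this stands in for, and is much simpler than, the Bayesian
# multinomial logistic regression described in the abstract.
from collections import Counter

import numpy as np

GESTURES = ["swipe", "point", "grab"]  # hypothetical gesture labels


def posterior_predictive(observed, alpha=1.0):
    """Dirichlet(alpha) prior + observed multinomial counts ->
    posterior predictive probability of each gesture."""
    counts = Counter(observed)  # missing gestures count as 0
    raw = np.array([counts[g] + alpha for g in GESTURES], dtype=float)
    return raw / raw.sum()


# Hypothetical elicitation data for one anesthesia task:
novice_choices = ["swipe", "swipe", "point", "swipe", "grab"]
expert_choices = ["point", "point", "grab", "point", "point"]

p_novice = posterior_predictive(novice_choices)  # [0.5, 0.25, 0.25]
p_expert = posterior_predictive(expert_choices)  # [0.125, 0.625, 0.25]
print(dict(zip(GESTURES, p_novice.round(3))))
print(dict(zip(GESTURES, p_expert.round(3))))
```

Because the Dirichlet prior is conjugate to the multinomial likelihood, the posterior predictive is just the smoothed relative frequency, which makes the group differences (novices favor one gesture, experts another) directly visible as probabilities.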