
    Eye Gaze Tracking for Human Computer Interaction

    With a growing number of computer devices around us and the increasing time we spend interacting with them, there is strong interest in new interaction methods that ease the use of computers or increase interaction efficiency. Eye tracking seems to be a promising technology for achieving this goal. This thesis researches interaction methods based on eye-tracking technology. After a discussion of the limitations of the eyes regarding accuracy and speed, including a general discussion of Fitts’ law, the thesis follows three different approaches to utilizing eye tracking for computer input. The first approach researches eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people’s ability to perform gestures with the eyes for computer input and the separation of gaze gestures from natural eye movements. The third approach deals with the information inherent in the movement of the eyes and its application to assist the user. The thesis presents a usability tool for recording interaction and gaze activity, and describes algorithms for reading detection. All approaches present results based on user studies conducted with prototypes developed for the purpose.
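The Fitts’ law discussion mentioned in this abstract concerns the trade-off between target distance and target size in pointing tasks, which bounds how fast any pointing device (including gaze) can acquire a target. As a minimal sketch, not taken from the thesis, the widely used Shannon formulation can be written as follows; the function names and the constants `a` and `b` are illustrative assumptions:

```python
import math

def fitts_index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' law: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1)

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time MT = a + b * ID.
    a and b are device- and user-dependent constants (here: illustrative
    placeholder values in seconds, fitted empirically in practice)."""
    return a + b * fitts_index_of_difficulty(distance, width)

# A small, distant target is harder to acquire than a large, near one.
print(fitts_index_of_difficulty(512, 16))  # ~5.04 bits
print(fitts_index_of_difficulty(64, 32))   # ~1.58 bits
```

The formulation is relevant to gaze input because saccades do not follow the same speed–accuracy trade-off as hand movements, which is part of why the thesis discusses the limits of the eyes as a pointing device.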

    To Study Effects of Using Human Presenter in Product Image: Applying an Eye-tracker VS Facial Expression Translation

    Eye tracking is the process of measuring either the point of gaze or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movements. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, in marketing, as an input device for human-computer interaction, and in product design. A previous study applied an eye tracker to investigate the effects of using a human presenter in product images and concluded that eye-tracker data can be collected and analysed for further statistical conclusions [8]. The result indicates that the product image with a positive-emotion female presenter received the highest fixation duration, though not significantly higher than that of other types of product images. However, eye tracking with a professional eye tracker is not an affordable research method for most researchers. Facial expression translation is a new function of “Youdao translate officer”, which can be downloaded from the Apple App Store for free; it rates human facial expressions along eight dimensions (i.e., happiness, anger, fear, contempt, disgust, calm, surprise, sadness). We propose to use this free technique to investigate the effects of using human presenters in product images and to compare the results with the previous eye-tracker studies. This study could thereby establish an affordable alternative research method for the field.

    Eye movements in real and simulated driving and navigation control - Foreword to the Special Issue

    The control of technological systems by human operators has been the object of study for many decades. The increasing complexity of such systems in the digital age has made optimizing the interaction between system and human operator particularly necessary. The present thematic issue presents ten exemplary articles, ranging from observational field studies to experimental work in highly complex navigation simulators. For the human operator, the processes of attention play a crucial role; in the contributions to this thematic issue they are captured by eye-tracking devices.

    Eye tracking during car driving has been investigated extensively for many decades (e.g., Lappi & Lehtonen, 2013; Grüner & Ansorge, 2017). In the present special issue, Cvahte Ojsteršek & Topolšek (2019) provide a literature review and scientometric analysis of 139 eye-tracking studies investigating driver distraction. For future studies, the authors recommend a wider variety of distractor stimuli, larger participant samples, and greater interdisciplinarity among researchers. Whereas most studies investigate bottom-up processes of covert attention, Tuhkanen, Pekkanen, Lehtonen & Lappi (2019) include the experimental control of top-down processes of overt attention in an active visuomotor steering task. The results indicate a bottom-up process of biasing the optic flow of the stimulus input in interaction with the top-down saccade planning induced by the steering task.

    An expanding area of technological development is autonomous driving, where the actions of the human operator interact directly with the programmed reactions of the vehicle. Autonomous driving requires, however, a broader exploration of the entire visual input and less gaze directed towards the road centre. Schnebelen, Charron & Mars (2021) conducted experimental research in this area and concluded that gaze dynamics played the most important role in distinguishing between manual and automated driving. Combining advanced gaze-tracking systems with the latest vehicle environment sensors, Bickerdt, Wendland, Geisler, Sonnenberg & Kasneci (2021) conducted a study with 50 participants in a driving simulator and propose a novel way to determine perceptual limits that are applicable to realistic driving scenarios.

    Eye-Computer Interaction (ECI) is an interactive method of directly controlling a technological device by means of ocular parameters. In this context, Niu, Gao, Xue, Zhang & Yang (2020) conducted two experiments to explore the optimum target size and gaze-triggering dwell time in ECI. Their results have exemplary application value for future interface design.

    Aircraft training and pilot selection are commonly performed on simulators, which makes it possible to study human capabilities and their limitations in interaction with the simulated technological system. Based on their methodological developments and experimental results, Vlačić, Knežević, Mandal, Rođenkov & Vitsas (2020) propose a network approach with three target measures describing the individual saccade strategies of the participants in their study. In their analysis of pilots’ cognitive load, Babu, JeevithaShree, Prabhakar, Saluja, Pashilkar & Biswas (2019) investigated the ocular parameters of 14 pilots in a simulator and during test flights in an aircraft during air-to-ground attack training. Their results showed that ocular parameters differ significantly across flying conditions and correlate significantly with altitude gradients during air-to-ground dive training tasks.

    In maritime training, the use of simulations is mandatory per international regulations. Mao, Li, Hildre & Zhang (2019) performed a study of crane lifting and compared novice and expert operators; similarities and dissimilarities in eye behavior between novices and experts are outlined and discussed. The study of Atik & Arslan (2019) captures and analyses eye movement data of ship officers with sea experience in simulation exercises for assessing competency. Significant differences were found between the electronic navigation competencies of expert and novice ship officers, and the authors demonstrate that eye-tracking technology is a valuable tool for the assessment of electronic navigation competency. The focus of the study by Atik (2020) is the assessment and training of the situational awareness of ship officers in naval Bridge Resource Management. The study shows that eye tracking provides the assessor with important novel data in simulator-based maritime training, such as focus of attention, a decisive factor for the effectiveness of Bridge Resource Management training.

    The research presented in the articles of this special thematic issue covers many different areas of application and involves specialists from different fields, but the articles converge on repeated demonstrations of the usefulness of measuring attentional processes by eye movements and of using gaze parameters to control complex technological devices. Together, they share the common goal of improving the potential and safety of technology in the digital age by fitting it to human capabilities and limitations.

    References
    Atik, O. (2020). Eye tracking for assessment of situational awareness in bridge resource management training. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.7
    Atik, O., & Arslan, O. (2019). Use of eye tracking for assessment of electronic navigation competency in maritime training. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.2
    Babu, M. D., JeevithaShree, D. V., Prabhakar, G., Saluja, K. P. S., Pashilkar, A., & Biswas, P. (2019). Estimating pilots’ cognitive load from ocular parameters through simulation and in-flight studies. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.3
    Cvahte Ojsteršek, T., & Topolšek, D. (2019). Eye tracking use in researching driver distraction: A scientometric and qualitative literature review approach. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.5
    Grüner, M., & Ansorge, U. (2017). Mobile eye tracking during real-world night driving: A selective review of findings and recommendations for future research. Journal of Eye Movement Research, 10(2). https://doi.org/10.16910/jemr.10.2.1
    Lappi, O., & Lehtonen, E. (2013). Eye-movements in real curve driving: Pursuit-like optokinesis in vehicle frame of reference, stability in an allocentric reference coordinate system. Journal of Eye Movement Research, 6(1). https://doi.org/10.16910/jemr.6.1.4
    Mao, R., Li, G., Hildre, H. P., & Zhang, H. (2019). Analysis and evaluation of eye behavior for marine operation training - A pilot study. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.6
    Niu, Y.-f., Gao, Y., Xue, C.-q., Zhang, Y.-t., & Yang, L.-x. (2020). Improving eye–computer interaction interface design: Ergonomic investigations of the optimum target size and gaze-triggering dwell time. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.8
    Schnebelen, D., Charron, C., & Mars, F. (2021). Model-based estimation of the state of vehicle automation as derived from the driver’s spontaneous visual strategies. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.10
    Tuhkanen, S., Pekkanen, J., Lehtonen, E., & Lappi, O. (2019). Effects of an active visuomotor steering task on covert attention. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.1
    Vlačić, S. I., Knežević, A. Z., Mandal, S., Rođenkov, S., & Vitsas, P. (2020). Improving the pilot selection process by using eye-tracking tools. Journal of Eye Movement Research, 12(3). https://doi.org/10.16910/jemr.12.3.
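The gaze-triggering dwell time explored by Niu et al. (2020) refers to the standard dwell-selection mechanism in eye-computer interaction: a target activates only after gaze has rested on it continuously for a threshold duration, which distinguishes deliberate selection from passing glances. A minimal sketch of that mechanism follows; the threshold values, target model, and function names are illustrative assumptions, not the authors' implementation:

```python
# Sketch of dwell-time target selection. Thresholds are placeholders;
# choosing them well is exactly the ergonomic question Niu et al. studied.

DWELL_TIME_MS = 600   # assumed trigger threshold
TARGET_RADIUS = 40    # assumed circular target size in pixels

def hit(target, x, y):
    """True if the gaze sample (x, y) falls inside the circular target."""
    tx, ty = target
    return (x - tx) ** 2 + (y - ty) ** 2 <= TARGET_RADIUS ** 2

def select_by_dwell(samples, target):
    """samples: iterable of (timestamp_ms, x, y) gaze samples.
    Returns the timestamp at which the target triggers, or None.
    The dwell timer resets whenever gaze leaves the target."""
    dwell_start = None
    for t, x, y in samples:
        if hit(target, x, y):
            if dwell_start is None:
                dwell_start = t          # gaze entered the target
            if t - dwell_start >= DWELL_TIME_MS:
                return t                 # dwell threshold reached: select
        else:
            dwell_start = None           # gaze left the target: reset
    return None
```

A sustained fixation on the target (e.g., ~800 ms of 60 Hz samples) triggers selection, while a brief glance that leaves the target before the threshold does not; too short a threshold causes accidental activations (the "Midas touch" problem), too long a threshold slows interaction.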

    GazeDrone: Mobile Eye-Based Interaction in Public Space Without Augmenting the User

    Get PDF
    Gaze interaction holds a lot of promise for seamless human-computer interaction. At the same time, current wearable mobile eye trackers require user augmentation that negatively impacts natural user behavior, while remote trackers require users to position themselves within a confined tracking range. We present GazeDrone, the first system that combines a camera-equipped aerial drone with a computational method to detect sidelong glances for spontaneous (calibration-free) gaze-based interaction with surrounding pervasive systems (e.g., public displays). GazeDrone does not require augmenting each user with on-body sensors and allows interaction from arbitrary positions, even while moving. We demonstrate that drone-supported gaze interaction is feasible and accurate for certain movement types. It is well perceived by users, in particular while interacting from a fixed position as well as while moving orthogonally or diagonally to a display. We present design implications and discuss opportunities and challenges for drone-supported gaze interaction in public.