51 research outputs found

    Eye Gaze Tracking for Human Computer Interaction

    Get PDF
    With a growing number of computer devices around us, and the increasing time we spend interacting with such devices, we are strongly interested in finding new interaction methods which ease the use of computers or increase interaction efficiency. Eye tracking seems to be a promising technology to achieve this goal. This thesis researches interaction methods based on eye-tracking technology. After a discussion of the limitations of the eyes regarding accuracy and speed, including a general discussion of Fitts’ law, the thesis follows three different approaches to utilizing eye tracking for computer input. The first approach researches eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people’s ability to perform gestures with the eyes for computer input and the separation of gaze gestures from natural eye movements. The third approach deals with the information inherent in the movement of the eyes and its application to assist the user. The thesis presents a usability tool for recording interaction and gaze activity. It also describes algorithms for reading detection. All approaches present results based on user studies conducted with prototypes developed for the purpose.
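The Fitts’ law discussion the abstract mentions can be illustrated with a minimal sketch of the Shannon formulation commonly used in HCI. The regression constants `a` and `b` below are illustrative placeholders, not values from the thesis:

```python
import math

def fitts_movement_time(distance, width, a=0.0, b=0.1):
    """Predict movement time (s) with the Shannon formulation of Fitts' law.

    distance: distance to the target; width: target width (same units).
    a, b are device-dependent regression constants (illustrative values here).
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A far, small target has a higher index of difficulty than a near, large one,
# so its predicted movement time is longer.
print(fitts_movement_time(512, 16))  # ID = log2(33) ≈ 5.04 bits
```

In eye-gaze pointing, the practical limit is usually the `width` term: tracker accuracy bounds how small a target can reliably be, which is why the thesis pairs gaze with a touch sensor for selection.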

    Open Observing Users to Gain Insight in Lanna Mural Paintings with Responsive Website for Education

    Get PDF
    One of the major effects of Covid-19 upon the entire world from December 2019 onwards was the severe restriction placed upon global and domestic movement. Adhering to lockdown measures, most individuals around the world were unable to travel, either within their own country or abroad. During this unprecedented period, however, the researcher addressed the issue and devised an alternative opportunity, based on previous research, for those wishing to visit the temple Wat Phumin in Nan province, Thailand. That earlier work was funded by the Office of Contemporary Art and Culture (OCAC), Thailand in 2017 to develop Lanna mural paintings and the narratives they depicted into moving images, with a focus on the tales of the Jataka (a body of literature relating to the Buddha’s previous lives). Building on those findings, the researcher combined the images of Lanna mural paintings and the moving images of the Jataka tales Khatthana Kumara Jataka and Nimi Jataka with responsive websites to foster a better understanding of the Lanna mural paintings at Wat Phumin in Nan province across different platforms. In the context of distance-education design, the use of responsive websites to facilitate the exploration and understanding of the Lanna mural paintings at Wat Phumin presents a unique and innovative approach. Methodologically, open observation and a qualitative study were used, interviewing selected participants. This research has determined that responsive websites are an effective alternative tool for individuals worldwide to explore and deepen their understanding of the Lanna mural paintings at Wat Phumin in Nan province. The accessibility of these websites has become especially valuable during the period of limited travel and restricted mobility experienced in recent years.
By utilizing responsive websites, visitors can now gain valuable insights into the rich artistic heritage of Lanna mural paintings, transcending physical limitations and geographical boundaries. This educational resource opens up new avenues for cultural exploration and promotes global appreciation of Wat Phumin's remarkable artistic treasures.

    VOICE RECOGNIZATION FOR MOUSE CONTROL USING HCI

    Get PDF
    One of the most important research areas in the field of Human-Computer Interaction (HCI) is gesture recognition, as it provides a natural and intuitive way for people and machines to communicate. Voice-based HCI applications range from desktop computer applications to virtual/augmented reality and are now being explored in other fields. This work proposes the implementation of an absolute virtual mouse based on the interpretation of voice recognition commands. The system controls the mouse pointer, moving it up/down/left/right, opening files, and dragging files. This virtual device is designed specifically as an alternative non-contact pointer for people with mobility impairments in the upper extremities. The voice-controlled virtual mouse simplifies HCI for disabled persons, especially for people without the use of their hands and arms, and also serves as an alternative mouse-cursor positioning system for laptops.
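As a rough illustration of the voice-to-pointer idea (not the paper's actual implementation), a recognized command word can be dispatched to a cursor update. The command vocabulary and step size below are assumptions for the sketch:

```python
# Minimal sketch: map recognized voice commands to cursor movements.
# The command names and step size are assumptions, not from the paper.
STEP = 20  # pixels moved per spoken movement command

def apply_command(command, x, y):
    """Return the new (x, y) cursor position after one spoken command."""
    moves = {
        "up":    (0, -STEP),
        "down":  (0,  STEP),
        "left":  (-STEP, 0),
        "right": ( STEP, 0),
    }
    if command in moves:
        dx, dy = moves[command]
        return x + dx, y + dy
    return x, y  # unrecognized words leave the pointer unchanged

pos = (100, 100)
for word in ["up", "up", "right"]:  # e.g. output of a speech recognizer
    pos = apply_command(word, *pos)
print(pos)  # (120, 60)
```

A real system would feed this dispatch from a speech-recognition engine and forward the resulting coordinates to the operating system's pointer API.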

    The Effects of Eye Gaze Based Control on Operator Performance in Monitoring Multiple Displays

    Get PDF
    This study investigated the utility and efficacy of using eye-tracking technology as a method for selecting control of a camera within a multiple-display configuration. A task analysis with a Keystroke-Level Model (KLM) was conducted to acquire an estimated time for switching between cameras. KLM estimates suggest that response times are faster using an eye tracker than manual control, indicating a time saving. To confirm these estimates and test other hypotheses, a 2 × 2 within-subjects factorial design was used to examine the effects of Control (eye tracker or manual) under different Task Loads (Low, High). Dependent variables included objective performance (accuracy and response times during an identification task) and subjective workload measured by the NASA-TLX. Under the specific experimental conditions, the eye tracker was not significantly better or worse; however, further research may support that use of the eye tracker could surpass the manual method in terms of operator performance, given the time-saving data from our initial KLM task analysis. Overall, this study provided great insight into using an eye tracker in a multiple-display monitoring system.
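A KLM estimate of the kind described is simply a sum of standard operator times. The operator times below are the classic Card, Moran and Newell values; the two operator sequences are illustrative assumptions, not the sequences analysed in the study:

```python
# A minimal KLM sketch using the classic Card, Moran & Newell operator times.
# The operator sequences for the two switching methods are illustrative
# assumptions, not those used in the study.
OPERATOR_TIMES = {
    "K": 0.20,  # keystroke / button press
    "P": 1.10,  # point with a pointing device
    "H": 0.40,  # home hands onto a device
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Total predicted task time (s) for a string of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

manual = klm_estimate("HMPK")  # home to mouse, decide, point, click
gaze   = klm_estimate("MK")    # decide, confirm (gaze replaces pointing)
print(round(manual, 2), round(gaze, 2))  # 3.05 1.55
```

Comparing such sums is how the task analysis predicted a time saving for gaze-based switching before the empirical study was run.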

    Performance, Characteristics, and Error Rates of Cursor Control Devices for Aircraft Cockpit Interaction

    Get PDF
    This document is the Accepted Manuscript version of the following article: Peter R. Thomas, 'Performance, Characteristics, and Error Rates of Cursor Control Devices for Aircraft Cockpit Interaction', International Journal of Human-Computer Studies, Vol. 109: 41-53, available online 31 August 2017. Under embargo. Embargo end date: 31 August 2018. Published by Elsevier. © 2017 Elsevier Ltd. All rights reserved. This paper provides a comparative performance analysis of a hands-on-throttle-and-stick (HOTAS) cursor control device (CCD) with other CCDs suitable for an aircraft cockpit: an isotonic thumbstick, a trackpad, a trackball, and touchscreen input. The performance and characteristics of these five CCDs were investigated in terms of throughput, movement accuracy, and error rate using the ISO 9241-9 standard task. Results show statistically significant differences (p < 0.001) between three groupings of the devices, with the HOTAS having the lowest throughput (0.7 bits/s) and the touchscreen the highest (3.7 bits/s). Errors for all devices were shown to increase with decreasing target size (p < 0.001) and, to a lesser extent, with increasing target distance (p < 0.01). The trackpad was found to be the most accurate of the five devices, being significantly better than the HOTAS fingerstick and touchscreen (p < 0.05), with the touchscreen performing poorly on selecting smaller targets (p < 0.05). These results would be useful to cockpit human-machine interface designers and provide evidence of the need to move away from, or significantly augment the capabilities of, this type of HOTAS CCD in order to improve pilot task throughput in increasingly data-rich cockpits. Peer reviewed. Final Accepted Version.
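The throughput figures quoted (0.7 vs 3.7 bits/s) come from the ISO 9241-9 methodology, where throughput is the effective index of difficulty divided by movement time, with target width replaced by an "effective width" derived from endpoint scatter. A hedged sketch of that computation, with invented sample data:

```python
import math
import statistics

# Sketch of ISO 9241-9 effective throughput, the metric used to compare the
# cursor control devices. The sample data below are invented for illustration.
def effective_throughput(distance, endpoint_offsets, movement_time):
    """Throughput in bits/s for one distance condition.

    endpoint_offsets: selection-coordinate deviations along the task axis.
    We = 4.133 * SDx is the 'effective width' correction from the standard.
    """
    effective_width = 4.133 * statistics.stdev(endpoint_offsets)
    id_e = math.log2(distance / effective_width + 1)  # effective ID, bits
    return id_e / movement_time

# 200-pixel movements, small endpoint scatter, 1.2 s average movement time.
tp = effective_throughput(200, [-3, 1, 4, -2, 0], 1.2)
print(round(tp, 2))
```

Because the effective width grows with endpoint scatter, a fast but inaccurate device is penalized, which is what lets a single bits/s figure rank devices as different as a HOTAS fingerstick and a touchscreen.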

    The Application of a System of Eye Tracking in Laparoscopic Surgery: A New Didactic Tool to Visual Instructions

    Get PDF
    Introduction: Laparoscopic surgery is an increasingly used technique, but it requires a high degree of learning, and communication between the operating room crew is considerably difficult. The use of eye tracking has been proposed as a didactic and evaluation tool in several settings, including laparoscopy in simulators. Objectives: This study aimed to evaluate the usefulness of an eye-tracking system (Tobii Glasses 2) in laparoscopic surgery as a didactic and assessment tool to improve communication in the operating room and improve patient safety. Methodology: An anonymous survey was sent to the students and medical teachers of a faculty of medicine and to practicing doctors and residents. The message contained an explanation of the use of the Tobii glasses, a link to a video showing their use in a laparoscopic surgery, and the survey to complete after watching the video. Results: The survey was answered by 113 participants (51.3% medical students, 27.4% medical teachers, 18.6% practicing doctors, and 2.7% medicine residents). Eighty-three percent agreed on the usefulness of the Tobii glasses in the operating room for improving communication between the main surgeon and the assistant, for learning complex surgery techniques, for obtaining didactic videos, and for indicating anatomical structures. The item scored worst was the price of the glasses. Conclusions: It is possible to record and project expert gaze patterns in the operating room in real time using the Tobii glasses. This device improves communication among the surgical crew, supports the learning of residents, and improves the safety of surgical patients.

    Eye Tracking in User Interfaces

    Get PDF
    This MSc thesis was performed during a study stay at the University of Eastern Finland, Joensuu, Finland. It presents the utilization of eye-tracking technology in Human-Computer Interaction (HCI). The proposed and implemented system maps coordinates in the plane of a scene camera, which correspond to the coordinates of the point of gaze, into coordinates in the plane of a display device. In addition, the system compensates for the user's motions and thus removes one of the main problems of using eye tracking in HCI. This is achieved by determining a transformation between the projective space of the scene and the projective space of the display. The method is based on detecting and describing interest points using SURF, matching corresponding points, and computing a homography. The system has been tested using test points spread over the display area.
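The final mapping step the abstract describes is the application of the estimated 3×3 homography to a gaze point in scene-camera coordinates. A minimal sketch of that projective transform; the matrix here is an invented example (pure scaling plus translation), whereas the thesis estimates it from SURF correspondences:

```python
# Apply a 3x3 homography H (nested lists) to map a point of gaze from
# scene-camera coordinates into display coordinates via homogeneous coords.
def map_gaze_to_display(H, x, y):
    """Return the display-plane point corresponding to scene point (x, y)."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w  # divide out the homogeneous scale

# Invented example homography: scale by 2, then translate by (10, 20).
H = [[2.0, 0.0, 10.0],
     [0.0, 2.0, 20.0],
     [0.0, 0.0, 1.0]]
print(map_gaze_to_display(H, 50, 50))  # (110.0, 120.0)
```

Because the homography is re-estimated as the scene camera moves, the gaze-to-display mapping stays valid under head motion, which is the compensation the thesis highlights.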

    Design, Implementation, and Performance Study of an Open Source Eye-Control System to Pilot a Parrot AR.Drone Quadrocopter

    Full text link
    Natural user interface is a fairly new concept in the field of human-computer interaction. It is the idea of using everyday natural human behaviors and actions to control a device. An example of a natural user interface is touch control technology in smartphones, tablets, and new laptops. The interaction is more direct when compared to artificial input devices like a keyboard and mouse. Though natural user interface devices might not perform as well as standard input devices for certain applications, for other applications they are now the de facto standard. A new user interface that is poised to be the next natural user interface in human-computer interaction is eye-control, the ability to control an interface with just the user’s eyes, using long-established technology called eye trackers. For much of their existence, the main problem with eye trackers has been cost: most modern commercial eye trackers cost anywhere between $10,000 and $40,000, too expensive for regular consumers to buy and use. In this paper, we build a low-cost system for eye-control using an open source program called ITU Gaze Tracker. In the process, we developed an interface which allows a user to pilot a Parrot AR.Drone quadrocopter using just their gaze. In this exploratory study, we compare the performance of this eye-control system to keyboard control in the operation of an AR.Drone around an obstacle course. We collected performance metrics such as lap completion time.
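One plausible way such an interface turns gaze into flight commands is by dividing the control screen into regions and dispatching on where the user is looking. The screen layout and command names below are assumptions for illustration; the study's interface built on ITU Gaze Tracker may differ:

```python
# Illustrative sketch: map a gaze point on the control screen to a drone
# command via screen-edge regions. Layout and command names are assumptions.
SCREEN_W, SCREEN_H = 1280, 720

def gaze_to_command(x, y, margin=0.25):
    """Return a flight command for the gaze point (x, y) in screen pixels."""
    if x < SCREEN_W * margin:
        return "yaw_left"
    if x > SCREEN_W * (1 - margin):
        return "yaw_right"
    if y < SCREEN_H * margin:
        return "forward"
    if y > SCREEN_H * (1 - margin):
        return "backward"
    return "hover"  # looking at the centre keeps the drone in place

print(gaze_to_command(100, 360))  # yaw_left
print(gaze_to_command(640, 360))  # hover
```

A dead-zone in the centre (the "hover" region) matters in practice: raw gaze jitters constantly, and without it the drone would receive a stream of spurious commands.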

    Semi-automated Usability Analysis through Eye Tracking

    Get PDF
    Usability of software is a crucial aspect of successful applications and could give one application a competitive edge over another. Eye tracking is a popular approach to usability evaluation, but is time consuming and requires expert analysis. This paper proposes a semi-automated process for identifying usability problems in applications with a task-based focus, such as business applications, without the need for expert analysis. The approach is demonstrated on the eye tracking data from a mobile procurement application involving 33 participants. With the recent inclusion of built-in eye tracking hardware in mobile devices, the proposed approach introduces the possibility of conducting remote, large-scale usability studies for improving user experience in mobile applications
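A semi-automated check of the kind described could, for example, flag interface elements that attract disproportionately long total fixation time during a task step. The threshold and data shape below are assumptions for the sketch, not the paper's actual criteria:

```python
# Sketch of one plausible automated usability check: flag elements whose
# total fixation time exceeds a threshold. Thresholds and field names are
# illustrative assumptions, not the paper's method.
def flag_problem_areas(fixations, threshold_ms=1500):
    """fixations: list of (element_id, duration_ms) pairs.

    Returns the sorted ids of elements fixated longer than the threshold,
    a possible sign that users struggled to understand or operate them.
    """
    totals = {}
    for element, duration in fixations:
        totals[element] = totals.get(element, 0) + duration
    return sorted(e for e, total in totals.items() if total > threshold_ms)

data = [("search_box", 400), ("submit_btn", 900),
        ("submit_btn", 800), ("price_field", 300)]
print(flag_problem_areas(data))  # ['submit_btn']
```

Rules like this replace part of the expert's manual inspection: the output is a shortlist of suspect elements rather than a full gaze-replay review.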

    Pocket transfers: Interaction techniques for transferring content from situated displays to mobile devices

    Get PDF
    We present Pocket Transfers: interaction techniques that allow users to transfer content from situated displays to a personal mobile device while keeping the device in a pocket or bag. Existing content transfer solutions require direct manipulation of the mobile device, making interaction slower and less flexible. Our introduced techniques employ touch, mid-air gestures, gaze, and a multimodal combination of gaze and mid-air gestures. We evaluated the techniques in a novel user study (N=20), where we considered dynamic scenarios in which the user approaches the display, completes the task, and leaves. We show that all pocket transfer techniques are fast and seen as highly convenient. Mid-air gestures are the most efficient touchless method for transferring a single item, while the multimodal method is the fastest touchless method when multiple items are transferred. We provide guidelines to help researchers and practitioners choose the most suitable content transfer techniques for their systems.