
    Computer Game Controlled by Eye Movements

    Nowadays, games are played not only for fun but also in education, medicine, and research. Video games, for example, play an important role in child development. Some fifteen years ago, parents educated their children outside school by teaching them directly or buying them books; today, technology and video games make this much easier. Through games, children can learn languages and distinguish shapes and colors from early childhood, without coercion and while having fun [1].

    Human-computer interface (HCI) technologies for controlling video games are developing rapidly, and large companies have explored very different ways of capturing user input for better game interaction. Three popular approaches are: (i) analyzing data from hardware accelerometers; (ii) speech recognition; and (iii) analyzing optical input. Most of these approaches, however, require specific hardware, and some demand extra physical effort that users are not accustomed to. Gamers would feel more comfortable if better interaction tools existed. Eye gaze estimation is one of the most important HCI tools being developed as an alternative to keyboard and mouse input [2-4].

    For eye gaze tracking systems, accuracy, tracking frequency, and hardware dependencies are the key issues. Accuracy determines whether targets such as images and buttons can be selected reliably, so it is used, together with speed, to benchmark eye gaze systems [5].

    Various techniques for eye gaze estimation have been described in the literature. Nevertheless, the vast majority of these algorithms require additional special hardware as well as manual initialization of the pupils. In Witzner's algorithm [6], the iris is modeled as an ellipse, but that technique requires a high-quality image taken very close to the user's eye.
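    A quick back-of-the-envelope check shows why accuracy dominates target selection. The angular error of a tracker spans a disc on the display whose diameter sets the minimum usable button size. A minimal sketch (all numbers are illustrative assumptions, not values from the paper):

    ```python
    import math

    def min_target_diameter_px(accuracy_deg, viewing_cm, px_per_cm):
        """Smallest on-screen target a gaze tracker can reliably select.

        An angular error of `accuracy_deg` at `viewing_cm` from the screen
        spans a disc of diameter 2 * d * tan(err) on the display, so buttons
        and images must be at least that large. The example numbers below
        are assumptions for illustration only.
        """
        err_rad = math.radians(accuracy_deg)
        return 2 * viewing_cm * math.tan(err_rad) * px_per_cm

    # e.g. 1 degree of accuracy at 60 cm, on a ~96 DPI screen (~38 px/cm):
    # a target needs to be roughly 2 * 60 * tan(1 deg) * 38, i.e. about 80 px wide.
    ```

    This is why low-accuracy trackers are typically paired with coarse interaction targets (large buttons, screen regions) rather than pixel-precise pointing.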
    Other systems, described by Noureddin [7] and Park [8], require views from two cameras at different angles. The snake algorithm of Kass [9] has also been used for pupil detection, but its accuracy is very low and needs improvement. Williams [10] proposed minimizing the energy functional with a greedy algorithm, Choi [11] proposed an improved segmentation method, and Lai [12] later used multiple snakes to increase the completeness of the method. However, no version of the snake algorithm is efficient at detecting only pupils, since these methods do not specifically aim to find circles. The Hough Transform was improved by Marcin and Ignacy [13] to detect circles: the image is first converted to a binary image, and then the circle equation, with different radii, is applied at every point to detect circles.

    Our aim is to develop a computer game controlled by a low-cost, robust eye gaze tracking system and an eye gesture detection algorithm, using visual data captured from the user by a simple webcam. In this paper, we propose a game that takes its input from a video camera, detecting the direction in which the user is looking and recognizing eye gestures for click functions. We use a webcam as the video input device because it is easily accessible. We use eye movements as the human-computer interaction (HCI) tool, in place of a mouse. In general, this provides much easier and faster interaction with the computer for everyone, including elderly people, children, and disabled people who cannot use a mouse. We also show that eye tracking and eye gaze estimation can be a very easy and fast way to achieve HCI.

    References

    1. D. Moursund, Introduction to Using Games in Education: A Guide for Teachers and Parents, 2nd edn. (2011).
    2. R. J. K. Jacob, K. S. Karn, Eye tracking in human-computer interaction and usability research: ready to deliver the promises, in The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research (Elsevier Science, Amsterdam, 2003), pp. 573-605.
    3. S. Zhai, What's in the eyes for attentive input. Commun. ACM 46(3), 34-39 (2003). doi:10.1145/636772.636795
    4. M. Kumar, J. Klingner, R. Puranik, T. Winograd, A. Paepcke, Improving the accuracy of gaze input for interaction, in ETRA (ACM, New York, 2008), pp. 65-68.
    5. A. Bulling, H. Gellersen, Toward mobile eye-based human-computer interaction. IEEE Pervasive Comput. 9(4), 8-12 (2010).
    6. D. Witzner Hansen, A. E. C. Pece, Eye tracking in the wild. Comput. Vis. Image Understand. 98(1), 182-210 (2005). doi:10.1016/j.cviu.2004.07.014
    7. B. Noureddin, P. D. Lawrence, C. F. Man, A non-contact device for tracking gaze in a human computer interface. Comput. Vis. Image Understand. 98(1), 52-82 (2005). doi:10.1016/j.cviu.2004.07.005
    8. K. R. Park, J. Chang, M. C. Whang, J. S. Lim, D. W. Rhee, H. K. Park, Y. Cho, Practical gaze point detecting system, in DAGM-Symposium, pp. 512-519 (2004).
    9. M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models. Int. J. Comput. Vis. 1, 321-331 (1988).
    10. I. F. Ince, T.-C. Yang, A new low-cost eye tracking and blink detection approach: extracting eye features with blob extraction. Lecture Notes in Computer Science (LNCS), vol. 5754 (Springer, 2009), pp. 526-533.
    11. W. P. Choi, K. M. Lam, W. C. Siu, An adaptive active contour model for highly irregular boundaries. Pattern Recognition 34, 323-331 (2001).
    12. K. F. Lai, R. T. Chin, A region extraction method using multiple active contour models, in Proc. IEEE Conf. on Computer Vision and Pattern Recognition, vol. I, pp. 64-69 (2000).
    13. D. J. Williams, M. Shah, A fast algorithm for active contours and curvature estimation. CVGIP: Image Understanding 55(1), 14-26 (1992).
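    To make the Hough circle detection scheme summarized in the introduction concrete (binary image first, then the circle equation applied at every point over a range of radii), here is a minimal, illustrative sketch. It is not the paper's implementation, and all parameter values (angle sampling, radius range) are assumptions:

    ```python
    import numpy as np

    def hough_circles(binary, radii):
        """Find the strongest circle in a binary edge image.

        For every 'on' pixel and every sampled angle, vote for the center
        (a, b) such that the pixel lies on a circle of radius r around (a, b);
        the best-supported (a, b, r) across all radii wins.
        """
        h, w = binary.shape
        ys, xs = np.nonzero(binary)            # coordinates of edge pixels
        thetas = np.linspace(0, 2 * np.pi, 180, endpoint=False)
        best_votes, best_circle = 0, None
        for r in radii:
            acc = np.zeros((h, w), dtype=np.int32)   # accumulator for this radius
            for t in thetas:
                # Circle equation: candidate center = edge point shifted by r.
                a = np.round(xs - r * np.cos(t)).astype(int)
                b = np.round(ys - r * np.sin(t)).astype(int)
                ok = (a >= 0) & (a < w) & (b >= 0) & (b < h)
                np.add.at(acc, (b[ok], a[ok]), 1)    # cast votes
            if acc.max() > best_votes:
                best_votes = int(acc.max())
                by, bx = np.unravel_index(acc.argmax(), acc.shape)
                best_circle = (int(bx), int(by), int(r))
        return best_circle  # (x_center, y_center, radius), or None if no edges
    ```

    For pupil detection, `binary` would come from thresholding a dark pupil region or from an edge detector; the per-radius accumulator is what makes the method robust to the partial occlusion by eyelids that defeats general-purpose snakes.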