3 research outputs found

    Tool for the Analysis of Human Interaction with Two-Dimensional Printed Imagery

    The study of human vision can include our interaction with objects. Such studies span behavior modeling, understanding visual attention and motor guidance, and enhancing user experiences, but they share one requirement: to analyze the data in detail, researchers typically must examine video data frame by frame. Real-world interaction data often comprises data from both eye and hand, and analyzing it frame by frame is tedious and time-consuming: a calibrated scene video from an eye tracker captured at 120 Hz for 3 minutes contains over 21,000 frames (120 frames/s × 180 s = 21,600) to be analyzed. Automating the process is crucial to allow interaction research to proceed. Research in object recognition over the last decade now allows eye-movement data to be analyzed automatically to determine what a subject is looking at and for how long. I describe my research, in which I developed a pipeline to help researchers analyze interaction data covering both eye and hand. Inspired by a semi-automated pipeline for analyzing eye-tracking data, I have created a pipeline for analyzing hand grasp along with gaze; putting the two pipelines together can help researchers analyze interaction data. The hand-grasp pipeline detects skin to locate the hands, then determines what object (if any) the hand is over and where the thumb and fingers occlude that object. I also compare identification with recognition throughout the pipeline. The current pipeline operates on independent frames; future work will extend it to take advantage of the dynamics of natural interactions.
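    The abstract names a sequence of per-frame stages: skin detection to locate the hands, a lookup of which object the hand is over, and occlusion analysis for the thumb and fingers. Below is a minimal sketch of the first two stages using a classical HSV skin threshold; the threshold values, the contour-area cutoff, and the externally supplied object mask are illustrative assumptions, not the author's actual parameters.

    import cv2
    import numpy as np

    # Hypothetical HSV skin range; real systems calibrate this per subject/lighting.
    SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)
    SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)

    def detect_hands(frame_bgr):
        """Segment skin-colored pixels; return the mask plus hand-sized contours."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
        # Morphological opening removes speckle so stray skin-toned pixels
        # are not mistaken for hands.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return mask, [c for c in contours if cv2.contourArea(c) > 500]

    def hand_over_object(hand_mask, object_mask):
        """Count the object pixels the hand covers; a nonzero overlap means the
        hand is over (and partially occluding) that object."""
        overlap = cv2.bitwise_and(hand_mask, object_mask)
        return cv2.countNonZero(overlap), overlap

    Running hand_over_object against a mask for each recognized object in the frame answers both questions at once: which object the hand is over, and which of that object's pixels the fingers occlude.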

    Efficient analysis of gaze-behavior in 3D environments

    Pfeiffer T, Renner P, Pfeiffer-Leßmann N. Efficient analysis of gaze-behavior in 3D environments. Cognitive Processing. 2014;15(Suppl. 1):S127-S129.

    We present an approach to identify the 3D point of regard and the fixated object in real time from 2D gaze videos, without the need for manual annotation. The approach requires no additional hardware beyond the mobile eye tracker. It is currently applicable to scenarios with static target objects and requires instrumenting the environment with markers. The system has already been tested in two different studies. Possible applications include visual-world paradigms in complex 3D environments, research on visual attention, and human-human or human-agent interaction studies.
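    The core of such a marker-based approach is to recover the eye tracker's scene-camera pose from the fiducial markers and then cast each 2D gaze sample as a ray in world coordinates. Below is a minimal sketch of that idea, assuming OpenCV's ArUco module (the OpenCV >= 4.7 API) as a stand-in for the paper's markers; the camera intrinsics, the zero-distortion model, the 5 cm marker size, and the function name gaze_ray_world are placeholder assumptions.

    import cv2
    import numpy as np

    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])      # placeholder scene-camera intrinsics
    DIST = np.zeros(5)                   # assume negligible lens distortion
    HALF = 0.05 / 2                      # assumed marker edge length: 5 cm

    # 3D corners of a marker lying in the z = 0 plane, in ArUco corner order
    # (top-left, top-right, bottom-right, bottom-left).
    MARKER_OBJ = np.array([[-HALF,  HALF, 0], [ HALF,  HALF, 0],
                           [ HALF, -HALF, 0], [-HALF, -HALF, 0]],
                          dtype=np.float32)

    def gaze_ray_world(frame, gaze_px):
        """Return (origin, direction) of the gaze ray in marker coordinates."""
        detector = cv2.aruco.ArucoDetector(
            cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
        corners, ids, _ = detector.detectMarkers(frame)
        if ids is None:
            return None
        # Camera pose relative to the first detected marker.
        ok, rvec, tvec = cv2.solvePnP(MARKER_OBJ, corners[0].reshape(4, 2),
                                      K, DIST)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)
        origin = (-R.T @ tvec).ravel()   # camera center in marker/world frame
        # Back-project the 2D gaze point into a viewing direction, then rotate
        # it into world coordinates.
        d_cam = np.linalg.inv(K) @ np.array([gaze_px[0], gaze_px[1], 1.0])
        d_world = R.T @ d_cam
        return origin, d_world / np.linalg.norm(d_world)

    Intersecting the returned ray with the known geometry of the static target objects then yields both the 3D point of regard and the fixated object, with no per-frame manual annotation.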

    Latvijas Universitātes 2014. gada publiskais pārskats (University of Latvia 2014 Annual Public Report)

    This report combines: 1. the annual public report (Section 14, Paragraph 3 of the Law on Budget and Financial Management of the Republic of Latvia; Cabinet of Ministers Regulation No. 413 of 05.05.2010, "Regulations on Annual Public Reports"); 2. the annual public report of a scientific institution (Section 40 of the Law on Scientific Activity of the Republic of Latvia; Cabinet of Ministers Regulation No. 397 of 16.05.2006, "Regulations on the Annual Public Report of a Scientific Institute Registered in the Register of Scientific Institutions"); 3. the yearbook (Section 75 of the Law on Higher Education Institutions of the Republic of Latvia). The report was prepared using data from the reports submitted by the faculties, scientific institutes, and study centres of the University of Latvia (LU) and by LU agencies, data from LUIS, and materials prepared by LU's administrative and independent academic units.