
    A Preliminary Investigation into Eye Gaze Data in a First Person Shooter Game

    This paper describes a study in which the eye gaze data of several users playing a simple First Person Shooter (FPS) game was recorded. The work covers the design and implementation of a simple game and shows how its execution can be synchronized with an eye tracking system. The motivation behind this work is to determine the existence of visual psycho-perceptual phenomena that may be of use in setting appropriate information limits for distributed interactive media compression algorithms. Only about 2 of the 140 degrees of human vision have a high level of detail. It may therefore be possible to determine the areas of the screen that a user is focusing on and render them in high detail, or pay particular attention to their contents so as to set appropriate dead reckoning limits. Our experiments show that eye tracking may allow for improvements in rendering and for new compression algorithms to be created for an online FPS game.
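    As a rough illustration of how gaze data might be used to set dead reckoning limits, the sketch below relaxes a per-object positional error bound as the object moves away from the current point of regard, on the assumption that only the central ~2 degrees are seen in full detail. This is not the paper's implementation; the function names, the linear falloff and all numeric values are illustrative assumptions (Python).

        import math

        FOVEA_DEG = 2.0     # approximate high-acuity region of human vision
        FIELD_DEG = 140.0   # approximate total field of view

        def angular_offset_deg(gaze_px, obj_px, screen_px, fov_deg):
            """Rough angular distance (degrees) between gaze point and object on screen."""
            dx = (obj_px[0] - gaze_px[0]) / screen_px[0]
            dy = (obj_px[1] - gaze_px[1]) / screen_px[1]
            return math.hypot(dx, dy) * fov_deg

        def dead_reckoning_threshold(offset_deg, base_err=0.05, max_err=1.0):
            """Allow a larger positional error for objects far from the gaze point."""
            if offset_deg <= FOVEA_DEG:
                return base_err   # foveal region: keep the error bound tight
            # Linearly relax the bound towards max_err at the edge of the visual field.
            t = min((offset_deg - FOVEA_DEG) / (FIELD_DEG / 2 - FOVEA_DEG), 1.0)
            return base_err + t * (max_err - base_err)

        # Example: an object roughly 40 degrees from the gaze point (1920x1080, 90-degree FOV).
        offset = angular_offset_deg((960, 540), (1700, 300), (1920, 1080), 90.0)
        print(round(offset, 1), round(dead_reckoning_threshold(offset), 3))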

    Increasing the sense of presence in a simulation environment using image generators based on visual attention

    Flight simulator systems generally use a separate image-generator component. The host is responsible for the positional data updates of the entities and the image generator is responsible for the rendering process. In such systems, the sense of presence is decreased by model flickering. This study presents a method by which the host can minimize model flickering in the image-generator output. The method is based on preexisting algorithms, such as visibility culling and level of detail management of 3D models. The flickering is minimized for the visually important entities at the expense of increasing the flickering of the entities that are out of the user's focus using a new perception-based approach. It is shown through user studies that the new proposed approach increases the participants' sense of presence. © 2011 by the Massachusetts Institute of Technology
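    A minimal sketch of the kind of perception-based scheduling the abstract describes: given a limited per-frame budget of entity updates the host can push to the image generator, the most visually important entities are refreshed every frame while peripheral ones are allowed to lag (and so to flicker). The importance weighting, field names and budget below are assumptions, not the authors' method.

        from dataclasses import dataclass

        @dataclass
        class Entity:
            name: str
            angle_from_focus: float   # degrees from the assumed point of attention
            apparent_size: float      # projected size on screen, 0..1

        def visual_importance(e: Entity) -> float:
            """Larger and more central entities are assumed to be more important."""
            centrality = max(0.0, 1.0 - e.angle_from_focus / 70.0)
            return 0.7 * centrality + 0.3 * e.apparent_size

        def pick_updates(entities, budget):
            """Choose which entities receive a fresh positional update this frame."""
            return sorted(entities, key=visual_importance, reverse=True)[:budget]

        scene = [Entity("tanker", 5.0, 0.4), Entity("wingman", 12.0, 0.2),
                 Entity("distant-truck", 55.0, 0.02), Entity("far-building", 65.0, 0.05)]
        for e in pick_updates(scene, budget=2):
            print("update", e.name)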

    Automatic Addition of Physics Components to Procedural Content

    While the field of procedural content generation is growing, there has been somewhat less work on developing procedural methods to animate the resulting models. We present a technique for generating procedural models of trees and buildings via formal grammars (L-Systems and wall grammars) that are ready to be animated using physical simulation. The grammars and their interpretations are augmented to provide direct control over the physical animation, for example by specifying object mass and joint stiffness. Example animations produced by our system include trees swaying in a gentle wind or being rocked by a gale, and buildings collapsing, imploding or exploding. In user testing, we had test subjects (n = 20) compare our animations with video of trees and buildings undergoing similar effects, as well as with animations in games they had played. Results show that our animations appear physically accurate, with a few minor instances of unrealistic behaviour. Users considered the animations to be more realistic than those used in current video games.
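    The sketch below illustrates, under stated assumptions, how an L-System interpretation could be augmented with physics attributes such as mass and joint stiffness so the generated structure is ready for physical simulation. The grammar, the depth-based parameter rules and all values are invented for illustration and are not taken from the paper.

        def rewrite(axiom, rules, iterations):
            """Apply L-System production rules to the axiom a fixed number of times."""
            s = axiom
            for _ in range(iterations):
                s = "".join(rules.get(ch, ch) for ch in s)
            return s

        def interpret(lsystem_string, base_mass=1.0, base_stiffness=50.0):
            """Turn 'F' symbols into branch segments; deeper branches are lighter and softer."""
            depth, segments = 0, []
            for ch in lsystem_string:
                if ch == "F":
                    segments.append({"depth": depth,
                                     "mass": base_mass / (2 ** depth),
                                     "joint_stiffness": base_stiffness / (depth + 1)})
                elif ch == "[":
                    depth += 1
                elif ch == "]":
                    depth -= 1
            return segments

        tree = rewrite("F", {"F": "F[+F]F[-F]"}, iterations=2)
        for seg in interpret(tree)[:3]:
            print(seg)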

    Visual Attention-Based Polygon Level of Detail Management

    Modern real-time graphics systems are required to render millions of polygons to the screen per second. However, even with this high polygon rendering bandwidth, there are still applications that tax this rendering capability. In this paper we introduce a technique that adaptively allocates polygons to objects in a scene according to their visual importance. Using this technique, we expect an improvement in the perceptual quality of the rendered image for the same overall number of polygons rendered. We present both a theoretical basis and a complete design for a visual attention-based level of detail management technique, along with a preliminary assessment of output from the system. Applications for this technique are expected in the areas of entertainment, visualisation and simulation.
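    As a hedged sketch of the general idea, the code below splits a fixed polygon budget across scene objects in proportion to an assumed visual-importance score and snaps each object to its nearest affordable level of detail; the scoring, the object list and the polygon counts are illustrative and not the paper's algorithm.

        def allocate_polygons(objects, budget):
            """objects: list of (name, importance, available_lod_polycounts)."""
            total = sum(imp for _, imp, _ in objects) or 1.0
            plan = {}
            for name, imp, lods in objects:
                share = budget * imp / total
                # Pick the highest LOD that fits within this object's share of the budget.
                affordable = [p for p in sorted(lods) if p <= share]
                plan[name] = affordable[-1] if affordable else min(lods)
            return plan

        scene = [("player-focus-target", 0.6, [500, 2000, 8000]),
                 ("nearby-crate",        0.3, [200, 1000, 4000]),
                 ("distant-tree",        0.1, [100, 400, 1600])]
        print(allocate_polygons(scene, budget=10000))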