    Construction and Evaluation of an Ultra Low Latency Frameless Renderer for VR.

    Latency, the delay between a user's action and the response to that action, is known to be detrimental to virtual reality. Latency is typically considered to be a discrete value characterising a delay, constant in time and space, but this characterisation is incomplete. Latency changes across the display during scan-out, and how it does so depends on the rendering approach used. In this study, we present an ultra-low latency real-time ray-casting renderer for virtual reality, implemented on an FPGA. Our renderer has a latency of 1 ms from tracker to pixel. Its frameless nature means that the region of the display with the lowest latency immediately follows the scan-beam. This is in contrast to frame-based systems such as those using typical GPUs, for which the latency increases as scan-out proceeds. Using a series of high and low speed videos of our system in use, we confirm its latency of 1 ms. We examine how the renderer performs when driving a traditional sequential scan-out display on a readily available HMD, the Oculus Rift DK2. We contrast this with an equivalent apparatus built using a GPU. Using captured human head motion and a set of image quality measures, we assess the ability of these systems to faithfully recreate the stimuli of an ideal virtual reality system: one with a zero-latency tracker, renderer and display running at 1 kHz. Finally, we examine the results of these quality measures, and how each rendering approach is affected by velocity of movement and display persistence. We find that our system, with a lower average latency, can more faithfully draw what the ideal virtual reality system would. Further, we find that with low display persistence, the sensitivity to velocity of both systems is lowered, but that it is much lower for ours.
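    The latency behaviour described above can be illustrated with a small sketch (not from the paper; the refresh rate, render time, and pipeline delay below are arbitrary example values): in a frame-based renderer a pixel's latency is the age of the tracker sample used for the whole frame when that pixel's scanline is finally emitted, whereas a frameless renderer produces pixels just ahead of the scan beam and keeps the tracker-to-pixel delay roughly constant.

        # Illustrative sketch: per-row latency, in milliseconds, for a frame-based
        # renderer versus a frameless "chase the beam" renderer. Values are examples.

        def frame_based_latency_ms(row, rows=1080, refresh_hz=75.0, render_ms=13.0):
            """Frame-based: the whole frame is rendered from one tracker sample,
            so latency grows as scan-out proceeds down the display."""
            scanout_ms = 1000.0 / refresh_hz          # time to scan out one full frame
            return render_ms + scanout_ms * (row / rows)

        def frameless_latency_ms(row, pipeline_ms=1.0):
            """Frameless: pixels are generated just ahead of the scan beam, so
            every row sees roughly the same small tracker-to-pixel delay."""
            return pipeline_ms

        if __name__ == "__main__":
            for row in (0, 540, 1079):
                print(f"row {row:4d}: frame-based {frame_based_latency_ms(row):5.1f} ms, "
                      f"frameless {frameless_latency_ms(row):4.1f} ms")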

    Characterizing the Effects of Local Latency on Aim Performance in First Person Shooters

    Real-time games such as first-person shooters (FPS) are sensitive to even small amounts of lag. The effects of network latency have been studied, but less is known about local latency, that is, the lag caused by local sources such as input devices, displays, and the application. While local latency is important to gamers, we do not know how it affects aiming performance and whether we can reduce its negative effects. To explore these issues, we tested local latency in a variety of real-world gaming systems and carried out a controlled study focusing on targeting and tracking activities in an FPS game with varying degrees of local latency. In addition, we tested the ability of a lag compensation technique (based on aim assistance) to mitigate the negative effects. To motivate the need for these studies, we also examined how aim in FPS differs from pointing in standard 2D tasks, showing significant differences in performance metrics. Our studies found local latencies in real-world systems ranging from 23 to 243 ms, which cause significant and substantial degradation in performance (even for latencies as low as 41 ms). The studies also showed that our compensation technique worked well, reducing the problems caused by lag in the case of targeting, and removing the problem altogether in the case of tracking. Our work shows that local latency is a real and substantial problem, but game developers can mitigate it with appropriate compensation methods.
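    The abstract does not spell out the compensation technique, so the following is only a generic aim-assistance sketch of the kind it alludes to (the window and strength parameters are invented for illustration): when the crosshair direction is within a small angular window of a target, the aim is pulled part of the way onto it, cancelling some of the error introduced by local latency.

        # Hypothetical aim-assistance sketch (not the paper's exact technique):
        # nudge the horizontal aim angle toward a nearby target.

        def assist_aim(aim_deg, target_deg, window_deg=3.0, strength=0.5):
            """Return an adjusted aim angle pulled toward the target when it lies
            within the assistance window; otherwise leave the aim untouched."""
            error = target_deg - aim_deg
            if abs(error) <= window_deg:
                return aim_deg + strength * error   # pull aim part-way onto the target
            return aim_deg

        if __name__ == "__main__":
            print(assist_aim(aim_deg=10.0, target_deg=11.5))   # inside window -> 10.75
            print(assist_aim(aim_deg=10.0, target_deg=20.0))   # far away -> 10.0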

    Low Latency Displays for Augmented Reality

    The primary goal for Augmented Reality (AR) is bringing the real and virtual together into a common space. Maintaining this illusion, however, requires preserving spatially and temporally consistent registration despite changes in user or object pose. The greatest source of registration error is latency, the delay between when something moves and when the display changes in response, which breaks temporal consistency. Furthermore, the real world varies greatly in brightness, ranging from bright sunlight to deep shadows. Thus, a compelling AR system must also support High Dynamic Range (HDR) to keep its virtual objects' appearance both spatially and temporally consistent with the real world. This dissertation presents new methods, implementations, results (both visual and performance), and future steps for low latency displays, primarily in the context of Optical See-Through Augmented Reality (OST-AR) Head-Mounted Displays, focusing on temporal consistency in registration, HDR color support, and spatial and temporal consistency in brightness: 1. For registration temporal consistency, the primary insight is breaking the conventional display paradigm, in which computers render imagery frame by frame and then transmit it to the display for emission. Instead, the display must also contribute to rendering by performing a post-rendering, post-transmission warp of the computer-supplied imagery in the display hardware. By compensating in the display for system latency using the latest tracking information, much of the latency can be short-circuited. Furthermore, the low latency display must support ultra-high frequency (multiple kHz) refreshing to minimize pose displacement between updates. 2. For HDR color support, the primary insight is developing new display modulation techniques. DMDs, a type of ultra-high frequency display, emit binary output, which requires modulation to produce multiple brightness levels. Conventional modulation breaks low latency guarantees, and modulating bright LED illuminators at the frequencies needed for kHz-order HDR exceeds their capabilities. Thus one must directly synthesize the necessary variation in brightness. 3. For spatial and temporal brightness consistency, the primary insight is integrating HDR light sensors into the display hardware: the same processes which both compensate for latency and generate HDR output can also modify it in response to the spatially sensed brightness of the real world.
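    As a rough illustration of the post-rendering warp idea in point 1 (a drastic simplification, not the dissertation's hardware implementation; the pixels-per-degree scale is an arbitrary example), an already rendered frame can be re-shifted using the newest head-yaw sample just before emission, hiding part of the render-and-transmit latency:

        # Minimal late-warp sketch: shift a rendered frame by the yaw change that
        # occurred since it was rendered. A real system would apply a full
        # perspective reprojection in display hardware at multi-kHz rates.
        import numpy as np

        def late_warp(frame, yaw_at_render_deg, yaw_now_deg, px_per_degree=20.0):
            """Return the frame shifted horizontally to account for head rotation
            after rendering (the world appears to move opposite the head)."""
            shift_px = int(round((yaw_now_deg - yaw_at_render_deg) * px_per_degree))
            return np.roll(frame, -shift_px, axis=1)

        if __name__ == "__main__":
            frame = np.zeros((4, 8), dtype=np.uint8)
            frame[:, 3] = 255                       # a single bright column
            print(late_warp(frame, yaw_at_render_deg=0.0, yaw_now_deg=0.1))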

    Challenges in passenger use of mixed reality headsets in cars and other transportation

    This paper examines key challenges in supporting passenger use of augmented and virtual reality headsets in transit. These headsets will allow passengers to break free from the restraints of physical displays placed in constrained environments such as cars, trains and planes. Moreover, they have the potential to allow passengers to make better use of their time by making travel more productive and enjoyable, supporting both privacy and immersion. However, there are significant barriers to headset usage by passengers in transit contexts. These barriers range from impediments that would entirely prevent safe usage and function (e.g. motion sickness) to those that might impair their adoption (e.g. social acceptability). We identify the key challenges that need to be overcome and discuss the resolutions and research required to facilitate adoption and realize the potential advantages of using mixed reality headsets in transit.

    Reducing the effect of network delay on tightly-coupled interaction

    Tightly-coupled interaction is shared work in which each person's actions immediately and continuously influence the actions of others. Tightly-coupled interaction is a hallmark of expert behaviour in face-to-face activity, but becomes extremely difficult to accomplish in distributed groupware. The main cause of this difficulty is network delay (even amounts as small as 100 ms), which disrupts people's ability to synchronize their actions with another person. To reduce the effects of delay on tightly-coupled interaction, I introduce a new technique called Feedback-Feedthrough Synchronization (FFS). FFS causes visual feedback from an action to occur at approximately the same time for both the local and the remote person, preventing one person from getting ahead of the other in the coordinated interaction. I tested the effects of FFS on group performance in several delay conditions, and my study showed that FFS substantially improved users' performance: accuracy was significantly improved at all levels of delay, without a noticeable increase in perceived effort or frustration. Techniques like FFS that support the requirements of tightly-coupled interaction provide new means for improving the usability of groupware that operates on real-world networks.
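    The core idea, as described above, is to keep local and remote feedback roughly simultaneous; a minimal sketch of one way to achieve that (a simplified reading, not the thesis's full technique; the 50 ms estimate is an example value) is to hold back local visual feedback by the estimated one-way network latency while the action itself is sent to the peer immediately:

        # Sketch: delay local feedback so it appears at about the same time as the
        # remote peer's feedthrough.
        import heapq

        class FeedbackScheduler:
            def __init__(self, estimated_one_way_ms):
                self.delay_ms = estimated_one_way_ms
                self._queue = []                      # (display_time_ms, action)

            def on_local_action(self, action, now_ms):
                # The action would be sent to the remote peer here, immediately;
                # only the local visual feedback is held back.
                heapq.heappush(self._queue, (now_ms + self.delay_ms, action))

            def due_feedback(self, now_ms):
                """Return actions whose delayed local feedback should be drawn now."""
                due = []
                while self._queue and self._queue[0][0] <= now_ms:
                    due.append(heapq.heappop(self._queue)[1])
                return due

        if __name__ == "__main__":
            sched = FeedbackScheduler(estimated_one_way_ms=50)
            sched.on_local_action("drag-start", now_ms=0)
            print(sched.due_feedback(now_ms=10))      # [] - not yet
            print(sched.due_feedback(now_ms=60))      # ['drag-start']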

    Dead Reckoning Using Play Patterns in a Simple 2D Multiplayer Online Game

    In today’s gaming world, a player expects the same play experience whether playing on a local network or online with many geographically distant players on congested networks. Because of delay and loss, there may be discrepancies in the simulated environment from player to player, likely resulting in incorrect perception of events. It is desirable to develop methods that minimize this problem. Dead reckoning is one such method. Traditional dead reckoning schemes typically predict a player’s position linearly by assuming players move with constant force or velocity. In this paper, we consider team-based 2D online action games. In such games, player movement is rarely linear. Consequently, we implemented such a game to act as a test harness, which we used to collect a large amount of data from play sessions involving a large number of experienced players. From analyzing this data, we identified play patterns, which we used to create three dead reckoning algorithms. We then used an extensive set of simulations to compare our algorithms with the IEEE standard dead reckoning algorithm and with the recent “Interest Scheme” algorithm. Our results are promising, especially with respect to the average export error and the number of hits.
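    For reference, the conventional linear predictor that the play-pattern-based algorithms are compared against works roughly as sketched below (a generic constant-velocity scheme; the error threshold is an arbitrary example, not a value from the paper):

        # Constant-velocity dead reckoning: each peer extrapolates the last known
        # state, and the owning client sends a correction when the prediction
        # error (the "export error") exceeds an agreed threshold.

        def predict_position(last_pos, last_vel, dt):
            """Linear extrapolation: assume the player keeps the last known velocity."""
            return (last_pos[0] + last_vel[0] * dt,
                    last_pos[1] + last_vel[1] * dt)

        def needs_update(true_pos, predicted_pos, threshold=5.0):
            """Decide whether the divergence is large enough to warrant a correction."""
            dx = true_pos[0] - predicted_pos[0]
            dy = true_pos[1] - predicted_pos[1]
            return (dx * dx + dy * dy) ** 0.5 > threshold

        if __name__ == "__main__":
            pred = predict_position(last_pos=(100.0, 50.0), last_vel=(30.0, 0.0), dt=0.2)
            print(pred)                                          # (106.0, 50.0)
            print(needs_update(true_pos=(103.0, 58.0), predicted_pos=pred))  # True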

    Analyzing the Impact of Spatio-Temporal Sensor Resolution on Player Experience in Augmented Reality Games

    Along with automating everyday tasks of human life, smartphones have become one of the most popular devices for playing video games because of their interactivity. Smartphones are embedded with various sensors, such as motion and location sensors, that enable new interaction techniques for video games and enhance usability. However, despite their mobility and embedded sensing capacity, smartphones are limited in processing power and display area compared to desktop computers and consoles. When evaluating Player Experience (PX), players might not have as compelling an experience because the rich graphical environments that a desktop computer can provide are absent on a smartphone. A plausible alternative is substituting the virtual game world with a real-world game board, perceived through the device camera by rendering digital artifacts over the camera view. This technology is widely known as Augmented Reality (AR). Smartphone sensors (e.g. GPS, accelerometer, gyroscope, compass) have enhanced the capability for deploying Augmented Reality technology, and AR has been applied to a large number of smartphone games, including shooters, casual games, and puzzles. Because AR play environments are viewed through the camera, rendering digital artifacts consistently and accurately is crucial: the digital characters must move with respect to the sensed orientation, so the accelerometer and gyroscope need to provide sufficiently accurate and precise readings to make the game playable. In particular, determining the pose of the camera in space is vital, as it determines the angle from which the rendered digital characters are viewed and thus how well players can interact with them. Depending on the Quality of Service (QoS) of these sensors, Player Experience (PX) may vary, as the rendering of digital characters is affected by noisy sensors causing a loss of registration. Confronting this problem while developing AR games is difficult in general, as it requires creating a wide variety of game types, narratives, and input modalities, as well as user testing. Moreover, AR game developers currently have no specific guidelines for developing AR games, and concrete guidelines outlining the trade-offs between QoS and PX for different genres and interaction techniques are required. My dissertation provides a complete view (a taxonomy) of the spatio-temporal sensor resolution dependency of existing AR games. Four user experiments have been conducted and one experiment is proposed to validate the taxonomy and demonstrate the differential impact of sensor noise on gameplay across different genres of AR games and different aspects of PX. This analysis is performed in the context of a novel instrumentation technology, which allows the controlled manipulation of QoS on position and orientation sensors. The experimental outcomes demonstrate how the QoS of input sensor noise impacts PX differently when playing AR games of different genres; the key elements creating this differential impact are the input modality, narrative, and game mechanics. Finally, concrete guidelines are derived for regulating sensor QoS as a complete set of instructions for developing different genres of AR games.
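    The controlled QoS manipulation mentioned above can be illustrated with a small sketch (not the dissertation's instrumentation; the noise magnitudes are arbitrary examples): a pose stream is degraded by adding zero-mean Gaussian noise to the position and orientation readings, which in turn perturbs where AR content is registered.

        # Sketch: simulate a lower-quality sensor by injecting Gaussian noise
        # into a (x, y, z, yaw) pose reading.
        import random

        def degrade_pose(pose, pos_sigma_m=0.02, yaw_sigma_deg=1.5, rng=random):
            """Return a noisier copy of the pose, standing in for a sensor with
            reduced Quality of Service."""
            x, y, z, yaw = pose
            return (x + rng.gauss(0.0, pos_sigma_m),
                    y + rng.gauss(0.0, pos_sigma_m),
                    z + rng.gauss(0.0, pos_sigma_m),
                    yaw + rng.gauss(0.0, yaw_sigma_deg))

        if __name__ == "__main__":
            random.seed(42)
            print(degrade_pose((0.0, 1.2, -0.5, 90.0)))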

    Visual Perception in Simulated Reality


    Consistency Algorithms and Protocols for Distributed Interactive Applications

    The Internet has a major impact not only on how people retrieve information but also on how they communicate. Distributed interactive applications support the communication and collaboration of people through the sharing and manipulation of rich multimedia content via the Internet. Aside from shared text editors, meeting support systems, and distributed virtual environments, shared whiteboards are a prominent example of distributed interactive applications. They allow the presentation and joint editing of documents in video conferencing scenarios. The design of such a shared whiteboard application, the multimedia lecture board (mlb), is a main contribution of this thesis. Like many other distributed interactive applications, the mlb has a replicated architecture where each user runs an instance of the application. This has the distinct advantage that the application can be deployed in a lightweight fashion, without relying on a supporting server infrastructure. But at the same time, this peer-to-peer architecture raises a number of challenging problems: First, application data needs to be distributed among all instances. For this purpose, we present the network protocol RTP/I for the standardized communication of distributed interactive applications, and a novel application-level multicast protocol that realizes efficient group communication while taking application-level knowledge into account. Second, consistency control mechanisms are required to keep the replicated application data synchronized. We present the consistency control algorithms “local lag”, “Timewarp”, and “state request”, show how they can be combined, and discuss how to provide visual feedback so that the session members are able to handle conflicting actions. Finally, late-joining participants need to be initialized with the current application state before they are able to participate in a collaborative session. We propose a novel late-join algorithm, which is both flexible and scalable. All algorithms and protocols presented in this dissertation solve the aforementioned problems in a generic way. We demonstrate how they can be employed for the mlb as well as for other distributed interactive applications.
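    Of the consistency mechanisms named above, local lag is the simplest to sketch (a generic illustration of the idea, not the thesis's implementation; the 100 ms lag is an example value): every operation is timestamped with an execution time slightly in the future, and every replica, including the issuer's own, applies it only when that time is reached, so replicas with synchronized clocks execute it in the same order and state.

        # Local-lag sketch: operations are executed at issue_time + lag on every
        # replica, so local and remote state stay consistent despite network delay.
        import heapq

        LOCAL_LAG_MS = 100                            # should exceed typical network delay

        class Replica:
            def __init__(self):
                self._pending = []                    # (execute_at_ms, op)

            def issue(self, op, now_ms):
                execute_at = now_ms + LOCAL_LAG_MS
                self.receive(op, execute_at)          # schedule on the local replica...
                return op, execute_at                 # ...and send (op, time) to peers

            def receive(self, op, execute_at_ms):
                heapq.heappush(self._pending, (execute_at_ms, op))

            def ready_ops(self, now_ms):
                """Pop every operation whose scheduled execution time has arrived."""
                ops = []
                while self._pending and self._pending[0][0] <= now_ms:
                    ops.append(heapq.heappop(self._pending)[1])
                return ops

        if __name__ == "__main__":
            a, b = Replica(), Replica()
            op, t = a.issue("draw stroke #1", now_ms=0)
            b.receive(op, t)                          # arrives after ~40 ms of network delay
            print(a.ready_ops(now_ms=100), b.ready_ops(now_ms=100))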