
    Viewing the Future? Virtual Reality In Journalism

    Journalism underwent a flurry of virtual reality content creation, production, and distribution starting in the final months of 2015. The New York Times distributed more than 1 million cardboard virtual reality viewers and released an app showing a spherical video short about displaced refugees. The Los Angeles Times landed people next to a crater on Mars. USA TODAY took visitors on a ride-along in the "Back to the Future" car on the Universal Studios lot and on a spin through Old Havana in a bright pink '57 Ford. ABC News went to North Korea for a spherical view of a military parade and to Syria to see artifacts threatened by war. The Emblematic Group, a company that creates virtual reality content, followed a woman navigating a gauntlet of anti-abortion demonstrators at a family planning clinic and allowed people to witness a murder-suicide stemming from domestic violence. In short, the period from October 2015 through February 2016 was one of significant experimentation with virtual reality (VR) storytelling. These efforts are part of an initial foray into determining whether VR is a feasible way to present news. The year 2016 is shaping up as a period of further testing and careful monitoring of potential growth in the use of virtual reality among consumers.

    Visual Distortions in 360-degree Videos.

    Omnidirectional (or 360°) images and videos are emergent signals being used in many areas, such as robotics and virtual/augmented reality. In particular, for virtual reality applications, they allow an immersive experience in which the user, wearing a head-mounted display, can interactively navigate through a scene with three degrees of freedom. Current approaches for capturing, processing, delivering, and displaying 360° content, however, present many open technical challenges and introduce several types of distortions in the visual signal. Some of the distortions are specific to the nature of 360° images and often differ from those encountered in classical visual communication frameworks. This paper provides a first comprehensive review of the most common visual distortions that alter 360° signals going through the different processing elements of the visual communication pipeline. While their impact on viewers' visual perception and the immersive experience at large is still unknown (and thus an open research topic), this review serves the purpose of proposing a taxonomy of the visual distortions that can be encountered in 360° signals. Their underlying causes in the end-to-end 360° content distribution pipeline are identified. This taxonomy is essential as a basis for comparing different processing techniques, such as visual enhancement, encoding, and streaming strategies, and for enabling the effective design of new algorithms and applications. It is also a useful resource for the design of psycho-visual studies aiming to characterize human perception of 360° content in interactive and immersive applications.
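    One distortion the abstract alludes to is projection distortion: storing a sphere in a flat equirectangular frame oversamples the poles. A minimal sketch (function names and parameters are illustrative, not from the paper) of the pixel-to-sphere mapping and the resulting horizontal stretch factor:

    ```python
    import math

    def equirect_to_sphere(u, v, width, height):
        """Map equirectangular pixel coordinates (u, v) to spherical angles.

        Returns (longitude, latitude) in radians, with longitude in
        [-pi, pi] and latitude in [-pi/2, pi/2].
        """
        lon = (u / width - 0.5) * 2 * math.pi
        lat = (0.5 - v / height) * math.pi
        return lon, lat

    def horizontal_stretch(v, height):
        """Oversampling factor of the equirectangular projection at image row v.

        Every row spans the full 360 degrees, but the corresponding circle
        on the sphere shrinks by cos(latitude), so pixels near the poles
        are stretched horizontally by 1/cos(lat).
        """
        _, lat = equirect_to_sphere(0, v, 1, height)
        return 1.0 / max(math.cos(lat), 1e-9)
    ```

    For a 180-row frame, the stretch factor is 1.0 at the equator and 2.0 already at 60° latitude, which is one reason encoding and quality metrics designed for planar video behave differently on 360° content.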

    A Measurement Study of Live 360 Video Streaming Systems

    360-degree live video streaming is becoming increasingly popular. While providing viewers with an enriched experience, 360-degree live video streaming is challenging to achieve since it requires significantly higher bandwidth and a powerful computation infrastructure. A deeper understanding of this emerging system would benefit both viewers and system designers. Although prior works have extensively studied regular video streaming and 360-degree video-on-demand streaming, we investigate, for the first time, the performance of 360-degree live video streaming. We conduct a systematic measurement of YouTube's 360-degree live video streaming using various metrics in multiple practical settings. Our research insights will help to build a clear understanding of today's 360-degree live video streaming and lay a foundation for future research on this emerging yet relatively unexplored area. To further understand the delay measured in YouTube's 360-degree live video streaming, we conduct a second measurement study on a 360-degree live video streaming platform. While live 360-degree video streaming provides an enriched viewing experience, it is challenging to guarantee the user experience against the negative effects introduced by start-up delay, event-to-eye delay, and low frame rate. It is therefore imperative to understand how different computing tasks of a live 360-degree streaming system contribute to these three delay metrics. Our measurements provide insights for future research directions towards improving the user experience of live 360-degree video streaming. Based on our measurement results, we propose a motion-based trajectory transmission method for 360-degree video streaming. First, we design a testbed for 360-degree video playback that can collect users' viewing data in real time. Then we analyze the trajectories of the moving targets in the 360-degree videos. Specifically, we utilize optical flow algorithms and a Gaussian mixture model to pinpoint the trajectories. We then choose the trajectories to be delivered based on the size of the moving targets. Experimental results indicate that our method substantially reduces bandwidth consumption.
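    The trajectory-pinpointing step described above combines a Gaussian-mixture background model with dense optical flow and then filters detections by target size. A minimal sketch of that pipeline using OpenCV (the function name, thresholds, and Farneback parameters are my assumptions, not the paper's actual implementation):

    ```python
    import cv2
    import numpy as np

    def find_moving_targets(frames, min_area=50, min_motion=0.5):
        """Locate moving targets per frame via a GMM background model
        (MOG2) gated by dense optical-flow magnitude, keeping only
        targets above a minimum area (size-based selection)."""
        subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
        prev_gray = None
        detections = []
        for frame in frames:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            mask = subtractor.apply(frame)  # GMM foreground mask
            if prev_gray is not None:
                flow = cv2.calcOpticalFlowFarneback(
                    prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
                # keep only foreground pixels that are actually moving
                mag = np.linalg.norm(flow, axis=2)
                moving = (mag > min_motion).astype(np.uint8) * 255
                mask = cv2.bitwise_and(mask, moving)
            contours, _ = cv2.findContours(
                mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            boxes = [cv2.boundingRect(c) for c in contours
                     if cv2.contourArea(c) >= min_area]
            detections.append(boxes)
            prev_gray = gray
        return detections
    ```

    Linking per-frame boxes into trajectories and ranking them by target size, as the abstract describes, would sit on top of this loop.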

    Live delivery of neurosurgical operating theater experience in virtual reality

    A system for assisting in microneurosurgical training and for delivering an interactive mixed reality surgical experience live was developed and trialed on hospital premises. An interactive experience from the neurosurgical operating theater was presented together with associated medical content on the virtual reality eyewear of remote users. Details of the stereoscopic 360-degree capture, surgery imaging equipment, signal delivery, and display systems are presented, and the presence-experience and visual-quality questionnaire results are discussed. The users reported positive scores on the questionnaire on topics related to the user experience achieved in the trial. Peer reviewed.

    Capture4VR: From VR Photography to VR Video


    Towards an Integration of 360-Degree Video in Higher Education. Workflow, challenges and scenarios

    Today, video is being used in different facets supporting the e-learning experience. With a resurging interest in, and reduced barriers of entry to, virtual and augmented reality applications, 360-degree video technology is becoming relevant as an option to produce and consume content for VR/AR applications. 360-degree video offers new features, which can prove useful in teaching and learning scenarios that call for self-directed control of view direction, immersion, and a feeling of presence. Current adoptions of 360-degree videos are integrated manually for specialized activity-oriented learning scenarios. However, in order to adopt 360-degree video on a larger scale, sufficient technical integration is required and knowledge of application scenarios needs to be communicated. To approach this challenge, workflow steps are analyzed, challenges are identified, and scenarios are described in the context of creating 360-degree video content for higher education. We identify open gaps that need to be addressed in order to integrate 360-degree video technology into an automated video processing tool chain.

    Foveated Video Streaming for Cloud Gaming

    Good user experience with interactive cloud-based multimedia applications, such as cloud gaming and cloud-based VR, requires low end-to-end latency and large amounts of downstream network bandwidth at the same time. In this paper, we present a foveated video streaming system for cloud gaming. The system adapts video stream quality by adjusting the encoding parameters on the fly to match the player's gaze position. We conduct measurements with a prototype that we developed for a cloud gaming system in conjunction with eye-tracker hardware. Evaluation results suggest that such foveated streaming can reduce bandwidth requirements by more than 50%, depending on the parametrization of the foveated video coding, and that it is feasible from the latency perspective. Comment: Submitted to the IEEE 19th International Workshop on Multimedia Signal Processing.
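    The core idea of adapting encoding parameters to gaze can be illustrated with a per-tile quantization map: tiles near the gaze keep full quality, while peripheral tiles get a higher quantization parameter (QP) offset and thus fewer bits. This is a minimal sketch under my own assumptions (the tile grid, radius, and linear falloff are illustrative, not the paper's parametrization):

    ```python
    import math

    def foveated_qp_offsets(gaze_x, gaze_y, cols=8, rows=8,
                            inner_radius=0.15, max_offset=12):
        """Return a rows x cols grid of QP offsets around a normalized
        gaze point (gaze_x, gaze_y in [0, 1]).

        Tiles whose centers lie within inner_radius of the gaze keep
        full quality (offset 0); the offset grows linearly with distance
        beyond that, capped at max_offset QP steps in the periphery.
        """
        offsets = []
        for r in range(rows):
            row = []
            for c in range(cols):
                cx, cy = (c + 0.5) / cols, (r + 0.5) / rows
                dist = math.hypot(cx - gaze_x, cy - gaze_y)
                excess = max(0.0, dist - inner_radius)
                row.append(min(max_offset,
                               round(excess * max_offset / 0.5)))
            offsets.append(row)
        return offsets
    ```

    In a real system the grid would be recomputed whenever a fresh gaze sample arrives from the eye tracker and fed to the encoder's region-of-interest or per-tile QP interface; the low-latency requirement the abstract mentions comes from keeping this gaze-to-encoder loop tight.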