
    Influence of Narrative Elements on User Behaviour in Photorealistic Social VR

    Social Virtual Reality (VR) applications are becoming the next big revolution in the field of remote communication. Social VR lets participants explore and interact with virtual environments and objects while providing a full sense of immersion and of being together. Understanding how user behaviour is influenced by the shared virtual space and its elements is therefore key to designing and optimizing novel immersive experiences that take into account the interaction between users and virtual objects. This paper presents a behavioural analysis of user navigation trajectories in a 6-degrees-of-freedom social VR movie. We analysed 48 user trajectories from a photorealistic telepresence experiment in which subjects watch a crime movie together in VR. We investigate how users are affected by salient agents (i.e., virtual characters) and by the narrative elements of the VR movie (i.e., dialogues versus interactive parts). We complete our assessment by conducting a statistical analysis of the collected data. Results indicate that user behaviour is affected by different narrative and interactive elements. We present our observations and draw conclusions on future paths for social VR experiences.
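    As an illustration of the kind of trajectory analysis described above, the sketch below computes a user's distance to a salient agent over time from 6-DoF head positions. It is a minimal example, not the authors' code; the column names (x, y, z, t, segment), the file name, and the agent position are assumptions.

```python
# Minimal sketch (not the authors' code): distance between a user's 6-DoF head
# position and a salient agent over time. Column names and file are assumptions.
import numpy as np
import pandas as pd

def distance_to_agent(traj: pd.DataFrame, agent_xyz: np.ndarray) -> pd.Series:
    """Euclidean distance from the user's head position to a static agent."""
    positions = traj[["x", "y", "z"]].to_numpy()
    return pd.Series(np.linalg.norm(positions - agent_xyz, axis=1),
                     index=traj["t"].to_numpy(), name="dist_to_agent")

# Hypothetical usage: compare mean distance in dialogue vs. interactive segments.
# traj = pd.read_csv("user_01.csv")
# dist = distance_to_agent(traj, np.array([0.0, 1.6, 2.0]))
# print(dist[traj["segment"].to_numpy() == "dialogue"].mean())
```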

    Data2MV - A user behaviour dataset for multi-view scenarios

    The Data2MV dataset contains gaze fixation data obtained through experimental procedures from a total of 45 participants using an Intel RealSense F200 camera module and seven different video playlists. Each of the playlists had an approximate duration of 20 minutes and was viewed at least 17 times, with raw tracking data recorded at a 0.05-second interval. The Data2MV dataset encompasses a total of 1,000,845 gaze fixations, gathered across a total of 128 experiments. It is also composed of 68,393 image frames, extracted from each of the 6 videos selected for these experiments, and an equal quantity of saliency maps, generated from aggregate fixation data. Software tools to obtain saliency maps and generate complementary plots are also provided as an open-source software package. The Data2MV dataset was publicly released to the research community on Mendeley Data and constitutes an important contribution to reducing the current scarcity of such data, particularly in immersive, multi-view streaming scenarios.
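    The released software package is the authoritative reference; purely as an illustration of the general technique, the sketch below accumulates gaze fixations into a pixel grid and applies Gaussian smoothing to obtain a normalised saliency map. The resolution and sigma values are assumptions.

```python
# Sketch of the general technique (not the released Data2MV tool): build a
# saliency map by accumulating gaze fixations into a grid and smoothing it.
import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_map(fixations, width=1920, height=1080, sigma=30.0):
    """fixations: iterable of (x, y) pixel coordinates; returns a map in [0, 1]."""
    acc = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            acc[int(y), int(x)] += 1.0
    blurred = gaussian_filter(acc, sigma=sigma)   # spread each fixation point
    return blurred / blurred.max() if blurred.max() > 0 else blurred
```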

    Audiovisual Database with 360 Video and Higher-Order Ambisonics Audio for Perception, Cognition, Behavior, and QoE Evaluation Research

    Research into multi-modal perception, human cognition, behavior, and attention can benefit from high-fidelity content that may recreate real-life-like scenes when rendered on head-mounted displays. Moreover, aspects of audiovisual perception, cognitive processes, and behavior may complement questionnaire-based Quality of Experience (QoE) evaluation of interactive virtual environments. Currently, there is a lack of high-quality open-source audiovisual databases that can be used to evaluate such aspects or systems capable of reproducing high-quality content. With this paper, we provide a publicly available audiovisual database consisting of twelve scenes capturing real-life nature and urban environments with a video resolution of 7680x3840 at 60 frames per second and with 4th-order Ambisonics audio. These 360° video sequences, with an average duration of 60 seconds, represent real-life settings for systematically evaluating various dimensions of uni-/multi-modal perception, cognition, behavior, and QoE. The paper provides details of the scene requirements, recording approach, and scene descriptions. The database provides high-quality reference material with a balanced focus on auditory and visual sensory information. The database will be continuously updated with additional scenes and further metadata such as human ratings and saliency information. (6 pages, 2 figures; accepted and presented at the 2022 14th International Conference on Quality of Multimedia Experience (QoMEX). Database publicly accessible at https://qoevave.github.io/database)
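    For context (a standard full-sphere Ambisonics fact, not stated in the abstract): an N-th order Ambisonics stream carries (N+1)^2 channels, so the 4th-order audio in this database corresponds to 25 audio channels.

```latex
% Channel count for N-th order full-sphere Ambisonics (general fact):
C = (N+1)^2, \qquad C\big|_{N=4} = (4+1)^2 = 25 \ \text{channels}
```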

    Exploring the impact of 360° movie cuts in users' attention

    Virtual Reality (VR) has grown since the first devices for personal use became available on the market. However, the production of cinematographic content in this new medium is still in an early exploratory phase. The main reason is that cinematographic language in VR is still under development, and we still need to learn how to tell stories effectively in it. A key element in traditional film editing is the use of different cutting techniques in order to transition seamlessly from one sequence to another. A fundamental aspect of these techniques is the placement of and control over the camera. However, VR content creators do not have full control of the camera. Instead, users in VR can freely explore the 360° of the scene around them, which potentially leads to very different experiences. While this is desirable in certain applications such as VR games, it may hinder the experience in narrative VR. In this work, we perform a systematic analysis of users' viewing behavior across cut boundaries while watching professionally edited, narrative 360° videos. We extend previous metrics for quantifying user behavior in order to support more complex and realistic footage, and we introduce two new metrics that allow us to measure users' exploration in a variety of complex scenarios. From this analysis, (i) we confirm that insights previously derived for simple content hold for professionally edited content, and (ii) we derive new insights that could influence VR content creation, informing creators about the impact of different cuts on the audience's behavior.
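    As a rough illustration of how viewing behaviour around cuts can be quantified (not the metrics introduced in the paper), the sketch below computes the great-circle angle between a viewer's gaze direction and a region of interest, e.g. shortly after a cut. Yaw/pitch inputs in radians are assumed.

```python
# Illustrative sketch (not the paper's metric): angular distance between the
# user's viewing direction and a region of interest (ROI) after a cut.
import numpy as np

def angular_distance(yaw1, pitch1, yaw2, pitch2):
    """Great-circle angle (radians) between two viewing directions."""
    v1 = np.array([np.cos(pitch1) * np.cos(yaw1),
                   np.cos(pitch1) * np.sin(yaw1),
                   np.sin(pitch1)])
    v2 = np.array([np.cos(pitch2) * np.cos(yaw2),
                   np.cos(pitch2) * np.sin(yaw2),
                   np.sin(pitch2)])
    return np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0))

# e.g. how far (in radians) a viewer is from the ROI 2 s after the cut:
# angular_distance(user_yaw, user_pitch, roi_yaw, roi_pitch)
```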

    Influence of narrative elements on user behaviour in photorealistic social VR

    Social Virtual Reality (VR) applications represent a big step forward in the field of remote communication. Social VR lets participants explore and interact with virtual environments and objects while providing a full sense of immersion and of being together. Understanding how user behaviour is influenced by the shared virtual space and its elements becomes the key to designing and optimising novel immersive experiences. This paper presents a behavioural analysis of users navigating in a 6-degrees-of-freedom social VR movie. Specifically, we analyse 48 user trajectories from a photorealistic telepresence experiment, in which subjects watch a crime movie together in VR. We investigate how users are affected by salient agents (i.e., virtual characters) and by narrative elements of the VR movie (i.e., dialogues versus interactive parts). We complete our assessment by conducting a statistical analysis of the collected data. Results indicate that user behaviour is affected by different narrative and interactive elements. We conclude by presenting our observations and drawing conclusions on future paths for social VR experiences. This work has been supported by the Royal Society under grant IES R1180128 and by Cisco under the Cisco Research Center Donation scheme.

    Do Users Behave Similarly in VR? Investigation of the User Influence on the System Design

    With the overarching goal of developing user-centric Virtual Reality (VR) systems, a new wave of studies focused on understanding how users interact in VR environments has recently emerged. Despite these intense efforts, however, the current literature still does not provide the right framework to fully interpret and predict users' trajectories while navigating in VR scenes. This work advances the state of the art in both the study of user behaviour in VR and user-centric system design. In more detail, we complement current datasets by presenting a publicly available dataset that provides navigation trajectories acquired for heterogeneous omnidirectional videos and different viewing platforms, namely head-mounted display, tablet, and laptop. We then present an exhaustive analysis of the collected data to better understand navigation in VR across users, content, and, for the first time, viewing platforms. The novelty lies in the user-affinity metric, proposed in this work to investigate users' similarities when navigating within the content. The analysis reveals useful insights on the effect of device and content on navigation, which are valuable considerations from a system-design perspective. As a case study of the importance of studying user behaviour when designing VR systems, we finally propose a user-centric server optimisation. We formulate an integer linear program that seeks the best stored set of omnidirectional content, minimising encoding and storage cost while maximising the user's experience. This is posed while taking into account network dynamics, type of video content, and user population interactivity. Experimental results prove that our solution outperforms common company recommendations not only in terms of experienced quality but also in terms of encoding and storage, achieving savings of up to 70%. More importantly, we highlight a strong correlation between the storage cost and the user-affinity metric, showing the impact of the latter on system architecture design.
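    A toy sketch of the server-side selection idea follows; it is a heavily simplified stand-in for the paper's integer linear program (written with the PuLP library, with made-up representations, costs, and user groups), intended only to show the flavour of the optimisation, not its actual objective or constraints.

```python
# Toy sketch (not the authors' formulation): choose which encoded
# representations to store so that storage cost is minimised while every
# user group is served by at least one representation it can consume.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

reps = {"4K_tiled": 10, "4K_full": 8, "HD_full": 3}           # storage cost (assumed units)
groups = {"hmd": ["4K_tiled", "4K_full"], "tablet": ["4K_full", "HD_full"]}

x = {r: LpVariable(f"store_{r}", cat=LpBinary) for r in reps}  # store r or not
prob = LpProblem("storage_selection", LpMinimize)
prob += lpSum(reps[r] * x[r] for r in reps)                    # minimise storage
for g, usable in groups.items():                               # serve every group
    prob += lpSum(x[r] for r in usable) >= 1
prob.solve()
print({r: int(x[r].value()) for r in reps})
```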

    A Survey on Mobile Edge Computing for Video Streaming: Opportunities and Challenges

    5G communication brings substantial improvements in the quality of service provided to various applications by achieving higher throughput and lower latency. However, interactive multimedia applications (e.g., ultra-high-definition video conferencing, 3D and multiview video streaming, crowd-sourced video streaming, cloud gaming, and virtual and augmented reality) are becoming more ambitious, with high-volume, low-latency video streams putting strict demands on already congested networks. Mobile Edge Computing (MEC) is an emerging paradigm that extends cloud computing capabilities to the edge of the network, i.e., to the base station level. To meet latency requirements and avoid end-to-end communication with remote cloud data centers, MEC allows video content to be stored and processed (e.g., caching, transcoding, pre-processing) at the base stations. Both video on demand and live video streaming can utilize MEC to improve existing services and develop novel use cases, such as video analytics and targeted advertisements. MEC is expected to reshape the future of video streaming by providing ultra-reliable and low-latency streaming (e.g., in augmented reality, virtual reality, and autonomous vehicles), pervasive computing (e.g., in real-time video analytics), and blockchain-enabled architectures for secure live streaming. This paper presents a comprehensive survey of recent developments in MEC-enabled video streaming that bring unprecedented improvements and enable novel use cases. A detailed review of the state of the art is presented, covering novel caching schemes, optimal computation offloading, cooperative caching and offloading, and the use of artificial intelligence (i.e., machine learning, deep learning, and reinforcement learning) in MEC-assisted video streaming services.
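    As a minimal illustration of the edge-caching concept surveyed here (not code from any cited work), the sketch below keeps recently requested video segments at a base station and falls back to the remote cloud on a miss; the class name and capacity handling are assumptions.

```python
# Minimal illustration of edge caching at a base station: an LRU cache of
# video segments, with a cloud fetch on a cache miss.
from collections import OrderedDict

class EdgeSegmentCache:
    def __init__(self, capacity_segments: int):
        self.capacity = capacity_segments
        self.cache = OrderedDict()                # segment_id -> segment bytes

    def get(self, segment_id, fetch_from_cloud):
        if segment_id in self.cache:              # edge hit: low latency
            self.cache.move_to_end(segment_id)
            return self.cache[segment_id]
        data = fetch_from_cloud(segment_id)       # miss: end-to-end fetch
        self.cache[segment_id] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)        # evict least recently used
        return data
```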

    Understanding user interactivity for the next-generation immersive communication: design, optimisation, and behavioural analysis

    Recent technological advances have opened the gate to a novel way to communicate remotely while still feeling connected. In these immersive communications, humans are at the centre of virtual or augmented reality, with a full sense of immersion and the possibility to interact with the new environment as well as with other humans virtually present. These next-generation communication systems hold huge potential that can impact major economic sectors. However, they also pose many new technical challenges, mainly due to the new role of the final user: from merely passive to fully active in requesting and interacting with the content. Thus, we need to go beyond traditional quality of experience research and develop user-centric solutions, in which the whole multimedia experience is tailored to the final interactive user. With this goal in mind, a better understanding of how people interact with immersive content is needed, and it is the focus of this thesis. In this thesis, we study the behaviour of interactive users in immersive experiences and its impact on next-generation multimedia systems. The thesis covers a deep literature review on immersive services and user-centric solutions, before developing three main research strands. First, we implement novel tools for the behavioural analysis of users navigating in a 3-DoF Virtual Reality (VR) system. In detail, we study behavioural similarities among users by proposing a novel clustering algorithm. We also introduce information-theoretic metrics for quantifying similarities for the same viewer across contents. As a second direction, we show the impact and advantages of taking user behaviour into account in immersive systems. Specifically, we formulate optimal user-centric solutions i) from a server-side perspective and ii) through a navigation-aware adaptation logic for VR streaming platforms. We conclude by exploiting the aforementioned behavioural studies towards a more interactive immersive technology: 6-DoF VR. Overall, experimental results based on real navigation trajectories show the key advantages of understanding hidden patterns of user interactivity, which can eventually be exploited in engineering user-centric solutions for immersive systems.
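    One way to quantify cross-content similarity for the same viewer, sketched below, is an information-theoretic comparison of viewport distributions. This uses the Jensen-Shannon distance over assumed yaw histograms and is not necessarily the exact metric developed in the thesis.

```python
# Hedged sketch of an information-theoretic similarity measure (not necessarily
# the thesis's metric): Jensen-Shannon distance between viewport distributions.
import numpy as np
from scipy.spatial.distance import jensenshannon

def viewport_histogram(yaw_samples, bins=36):
    """Histogram of longitudinal viewing angles, normalised to a distribution."""
    hist, _ = np.histogram(yaw_samples, bins=bins, range=(-np.pi, np.pi))
    return hist / hist.sum()

# Similarity = 1 - JS distance: identical distributions give 1, disjoint ones ~0.
p = viewport_histogram(np.random.uniform(-np.pi, np.pi, 1000))   # content A (synthetic)
q = viewport_histogram(np.random.normal(0.0, 0.5, 1000).clip(-np.pi, np.pi))  # content B
print(1.0 - jensenshannon(p, q, base=2))
```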