
    A Study of User Perception of the Quality of Video Content Rendered Inside a 3-D Virtual Environment

    Get PDF
    © 2016 IEEE. This paper reports on the results of a user study to assess the impact of video resolution and frame rate on users' quality of experience when the video is rendered inside a 3-D virtual space and consequently viewed from arbitrary perspectives. A mathematical model for video rate is presented that expresses the total rate as the product of separate functions of spatial and temporal resolution. Results from the user study are combined with the model to predict the rate parameters that will result in perceptually acceptable quality using the 3-D features of the virtual environment. The results show that by exploiting the insensitivity of users to controlled quality degradation, the downstream network load for the client can be significantly reduced with little or no perceptual impact on the client.
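
    A minimal sketch of such a separable rate model, assuming power-law factor functions (the exponents a and b and the normalisation are illustrative assumptions, not taken from the paper):

        R(s, t) = R_{\max} \, f_s(s) \, f_t(t), \qquad
        f_s(s) = \left(\tfrac{s}{s_{\max}}\right)^{a}, \quad
        f_t(t) = \left(\tfrac{t}{t_{\max}}\right)^{b}

    Here s and t denote spatial and temporal resolution, R_{\max} is the rate at full resolution and frame rate, and a, b would be fitted to measured rates; the user-study results then bound how far s and t can be lowered before the degradation becomes noticeable.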

    Minimisation of video downstream bit rate for large scale immersive video conferencing by utilising the perceptual variations of quality

    Get PDF
    © 2014 IEEE. This paper aims at minimising the video downstream bit rate of immersive video conferencing (IVC) applications by judiciously modifying the video quality based on the relative virtual positions of participants in the virtual environment. The paper reports on the results of a user study to assess the influence of participants' perspectives on the perceptual impact of relevant video parameters, such as resolution and frame rate. A mathematical model for video rate is proposed that expresses the total rate as the product of separate functions of spatial resolution and frame rate. Results from the user study are combined with the proposed model to predict the rate parameters that will result in perceptually acceptable quality for a given user perspective. The simulation results show that by exploiting the proposed method, the downstream network load for the client can be significantly reduced with little or no impact on the perceived quality.
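
    A minimal Python sketch of the selection step described above, assuming the same separable rate model and a hypothetical acceptability threshold that relaxes with a participant's virtual distance; the candidate sets, exponents and threshold rule are illustrative, not the paper's method:

        # Candidate encoding parameters per remote participant (illustrative values).
        RESOLUTIONS = [(1920, 1080), (1280, 720), (960, 540), (640, 360)]
        FRAME_RATES = [30, 24, 15, 10]

        def predicted_rate(width, height, fps, r_max=4_000_000, a=0.8, b=0.6):
            """Assumed separable rate model: R = R_max * (s/s_max)^a * (t/t_max)^b."""
            s = (width * height) / (1920 * 1080)
            t = fps / 30
            return r_max * (s ** a) * (t ** b)

        def acceptable(width, height, fps, distance):
            """Hypothetical rule: participants rendered farther away tolerate lower
            spatial and temporal resolution before the degradation is noticeable."""
            s = (width * height) / (1920 * 1080)
            t = fps / 30
            threshold = max(0.2, 1.0 / (1.0 + distance))  # relaxes with virtual distance
            return min(s, t) >= threshold

        def choose_parameters(distance):
            """Return the cheapest (resolution, fps) pair still judged acceptable."""
            candidates = [(w, h, f) for (w, h) in RESOLUTIONS for f in FRAME_RATES]
            candidates.sort(key=lambda c: predicted_rate(*c))  # cheapest first
            for w, h, f in candidates:
                if acceptable(w, h, f, distance):
                    return (w, h), f
            return RESOLUTIONS[0], FRAME_RATES[0]  # fall back to full quality

        # Example: a participant positioned far away in the virtual meeting room.
        print(choose_parameters(distance=5.0))

    Summing predicted_rate over all remote participants gives the total downstream load, which the approach reduces by assigning cheaper parameters to perceptually less critical streams.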

    Geometry-based spherical JND modeling for 360° display

    Full text link
    360° videos have received widespread attention due to their realistic and immersive experience for users. To date, how to accurately model user perception on a 360° display is still a challenging issue. In this paper, we exploit the visual characteristics of 360° projection and display and extend the popular just noticeable difference (JND) model to a spherical JND (SJND). First, we propose a quantitative 2D-JND model by jointly considering spatial contrast sensitivity, luminance adaptation and the texture masking effect. In particular, our model introduces an entropy-based region classification and uses different parameters for different types of regions for better modeling performance. Second, we extend our 2D-JND model to SJND by jointly exploiting latitude projection and field of view during 360° display. With this operation, SJND reflects both the characteristics of the human visual system and the 360° display. Third, our SJND model is more consistent with user perception in subjective tests and also tolerates more distortion at lower bit rates during 360° video compression. To further examine the effectiveness of our SJND model, we embed it in Versatile Video Coding (VVC) compression. Compared with the state of the art, our SJND-VVC framework significantly reduces the bit rate with negligible loss in visual quality.
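
    Read as a construction, the model can be sketched as follows (the multiplicative weighting functions are assumptions for illustration; the paper's exact formulation may differ):

        JND_{2D}(x, y) = f\big(CS(x, y),\, LA(x, y),\, TM(x, y)\big)
        SJND(x, y) = JND_{2D}(x, y) \cdot w_{lat}\big(\theta(y)\big) \cdot w_{fov}(x, y)

    Here CS, LA and TM denote spatial contrast sensitivity, luminance adaptation and texture masking, \theta(y) is the latitude of row y in the equirectangular projection, w_{lat} grows toward the poles (e.g. proportional to 1/\cos\theta) because those rows are over-sampled on the sphere, and w_{fov} raises the threshold for content outside the current field of view.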

    Perceptual Quality Assessment of Omnidirectional Audio-visual Signals

    Full text link
    Omnidirectional videos (ODVs) play an increasingly important role in application fields such as medicine, education, advertising and tourism. Assessing the quality of ODVs is important for service providers seeking to improve the user's Quality of Experience (QoE). However, most existing quality assessment studies for ODVs focus only on the visual distortions of videos, ignoring that the overall QoE also depends on the accompanying audio signals. In this paper, we first establish a large-scale audio-visual quality assessment dataset for omnidirectional videos, which includes 375 distorted omnidirectional audio-visual (A/V) sequences generated from 15 high-quality pristine omnidirectional A/V contents, together with the corresponding perceptual audio-visual quality scores. Then, we design three baseline methods for full-reference omnidirectional audio-visual quality assessment (OAVQA), which combine existing state-of-the-art single-mode audio and video QA models via multimodal fusion strategies. We validate the effectiveness of the A/V multimodal fusion method for OAVQA on our dataset, which provides a new benchmark for omnidirectional QoE evaluation. Our dataset is available at https://github.com/iamazxl/OAVQA.
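
    The abstract does not spell out the fusion strategies; a minimal late-fusion sketch in this spirit, with simple stand-ins (PSNR and SNR) for the single-mode models and an illustrative fixed weight, might look like:

        import numpy as np

        def video_score(ref_frames, dist_frames, peak=255.0):
            """PSNR over all frames, a stand-in for a full-reference video QA model."""
            mse = np.mean((ref_frames.astype(np.float64) - dist_frames.astype(np.float64)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        def audio_score(ref_audio, dist_audio):
            """Signal-to-noise ratio, a stand-in for a full-reference audio QA model."""
            noise = ref_audio - dist_audio
            return 10.0 * np.log10(np.sum(ref_audio ** 2) / (np.sum(noise ** 2) + 1e-12))

        def fused_score(ref_frames, dist_frames, ref_audio, dist_audio, w_video=0.7):
            """Weighted late fusion of the two single-mode scores; in practice the weight
            (or a small regressor) would be fitted to the subjective scores in the dataset."""
            v = video_score(ref_frames, dist_frames)
            a = audio_score(ref_audio, dist_audio)
            return w_video * v + (1.0 - w_video) * a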

    On the influence of individual characteristics and personality traits on the user experience with multi-sensorial media: an experimental insight

    Get PDF
    Recent studies encourage the development of sensorially-enriched media to enhance the user experience by stimulating senses other than sight and hearing. Sensory effects such as odor, wind, vibration and light, as well as enhanced audio quality, have been found to favour media enjoyment and to have a positive influence on the sense of Presence and on the perceived quality, relevance and reality of a multimedia experience. In particular, sports is among the genres that could benefit the most from these solutions. Several works have also demonstrated the technical feasibility of implementing and deploying end-to-end solutions that integrate sensory effects into a legacy system. Thus, multi-sensorial media emerges as a means to deliver a new form of immersive experience to the mass market in a non-disruptive manner. However, many questions remain concerning issues such as which sensory effects best complement a given audiovisual content, or how best to integrate and combine them to enhance the user experience of a target audience segment. The work presented in this paper aims to gain insight into the impact of binaural audio and sensory (light and olfactory) effects on the sports media experience, both at the overall level (average effect) and as a function of users' characteristics (heterogeneous effects). To this aim, we conducted an experimental study exploring the influence of these immersive elements on the quality and Presence dimensions of the media experience. Along the quality dimension, we look for possible variations in the quality scores assigned to the overall media experience and to the media components: content, image, audio and sensory effects. The potential impact on Presence is analyzed in terms of Spatial Presence and Engagement. The users' characteristics considered encompass specific personal affective, cognitive and behavioral attributes. We found that, on average, participants preferred binaural audio over standard stereo audio. The audio quality was found to have a heterogeneous impact on the quality of experience, on the perceived quality of content and image, and on the levels of Spatial Presence and Engagement. Furthermore, the presence of sensory effects significantly increased the level of Spatial Presence. Additionally, highly conscientious participants reported significantly higher image quality when sensory effects were present compared to conditions in which sensory effects were not administered. Personal characteristics explained most of the variation in the dependent variables, with individuals' preferences regarding the content, knowledge of the technologies involved, tendency to emotional involvement, and conscientiousness being among the user variables with the most generalized influence.