Streaming and User Behaviour in Omnidirectional Videos
Omnidirectional videos (ODVs) have gone beyond the passive paradigm of traditional video,
offering higher degrees of immersion and interaction. The revolutionary novelty of this technology is the possibility for users to interact with the surrounding environment, and to feel a
sense of engagement and presence in a virtual space. Users are clearly the main driving force of
immersive applications, and consequently the services need to be properly tailored to them.
In this context, this chapter highlights the importance of the new role of users in ODV streaming applications, and thus the need for understanding their behaviour while navigating within
ODVs. A comprehensive overview of the research efforts aimed at advancing ODV streaming
systems is also presented. In particular, the state-of-the-art solutions under examination in this
chapter are distinguished in terms of system-centric and user-centric streaming approaches: the
former is a fairly straightforward extension of well-established solutions for the 2D video
pipeline, while the latter benefits from understanding users' behaviour to enable more
personalised ODV streaming.
BOLA360: Near-optimal View and Bitrate Adaptation for 360-degree Video Streaming
Recent advances in omnidirectional cameras and AR/VR headsets have spurred
the adoption of 360-degree videos that are widely believed to be the future of
online video streaming. 360-degree videos allow users to wear a head-mounted
display (HMD) and experience the video as if they are physically present in the
scene. Streaming high-quality 360-degree videos at scale is an unsolved problem
that is more challenging than traditional (2D) video delivery. The data rate
required to stream 360-degree videos is an order of magnitude more than
traditional videos. Further, the penalty for rebuffering events where the video
freezes or displays a blank screen is more severe as it may cause
cybersickness. We propose an online adaptive bitrate (ABR) algorithm for
360-degree videos called BOLA360 that runs inside the client's video player and
orchestrates the download of video segments from the server so as to maximize
the quality-of-experience (QoE) of the user. BOLA360 conserves bandwidth by
downloading only those video segments that are likely to fall within the
field-of-view (FOV) of the user. In addition, BOLA360 continually adapts the
bitrate of the downloaded video segments so as to enable a smooth playback
without rebuffering. We prove that BOLA360 is near-optimal with respect to an
optimal offline algorithm that maximizes QoE. Further, we evaluate BOLA360 on a
wide range of network and user head movement profiles and show that it provides
more QoE than state-of-the-art algorithms. While ABR
algorithms for traditional (2D) videos have been well-studied over the last
decade, our work is the first ABR algorithm for 360-degree videos with both
theoretical and empirical guarantees on its performance.
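The abstract does not state BOLA360's exact objective, but the classical BOLA family scores each candidate bitrate by a Lyapunov-style trade-off between playback utility and buffer occupancy. As a hedged illustration only, the sketch below weights that utility by a predicted probability that a tile falls within the FOV; the function name, bitrate ladder, utility values, and parameters are all our assumptions, not the paper's.

```python
# Hypothetical sketch of FOV-aware, BOLA-style bitrate selection for one
# 360-degree video tile. For each candidate bitrate we compute
#   (V * (p * utility + gamma) - buffer) / bitrate
# and pick the maximizer: low buffer pushes toward low bitrates, a full
# buffer toward high ones, and a low FOV probability discounts utility.

BITRATES_KBPS = [500, 1000, 2500, 5000]   # assumed bitrate ladder
UTILITY = [1.0, 2.0, 3.3, 4.3]            # assumed (roughly log-scale) utilities

def choose_bitrate(fov_probability, buffer_level, v=0.9, gamma=5.0):
    """Return the index of the bitrate maximizing the BOLA-style score
    for a tile with the given predicted FOV probability and buffer (s)."""
    best_idx, best_score = 0, float("-inf")
    for i, (rate, util) in enumerate(zip(BITRATES_KBPS, UTILITY)):
        score = (v * (fov_probability * util + gamma) - buffer_level) / rate
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

# A nearly-empty buffer selects the safest (lowest) bitrate, while a full
# buffer allows the highest quality for a tile likely to be viewed:
low_buf = choose_bitrate(fov_probability=0.95, buffer_level=2.0)    # → 0
high_buf = choose_bitrate(fov_probability=0.95, buffer_level=10.0)  # → 3
```

The buffer term is what yields rebuffering protection without bandwidth prediction, which is the property the abstract emphasizes.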
Machine Learning for Multimedia Communications
Machine learning is revolutionizing the way multimedia information is processed and transmitted to users. After intensive and powerful training, some impressive efficiency/accuracy improvements have been made all over the transmission pipeline. For example, the high model capacity of learning-based architectures enables us to accurately model image and video behavior such that tremendous compression gains can be achieved. Similarly, error concealment, streaming strategy, and even user perception modeling have widely benefited from recent learning-oriented developments. However, learning-based algorithms often imply drastic changes to the way data are represented or consumed, meaning that the overall pipeline can be affected even when only a subpart of it is optimized. In this paper, we review the recent major advances that have been proposed all across the transmission chain, and we discuss their potential impact and the research challenges that they raise.
The influence of human factors on 360° mulsemedia QoE
Quality of Experience (QoE) is indelibly linked to the human side of the multimedia experience. Surprisingly, however, there is a paucity of research which explores the impact that human factors have in determining QoE. Whilst this is true of multimedia, it is even more starkly so as far as mulsemedia - applications that involve media engaging three or more of the human senses - is concerned. Hence, in the study reported in this paper, we focus on an exciting subset of mulsemedia applications - 360° mulsemedia - particularly important given that the upcoming 5G technology is foreseen to be a key enabler for the proliferation of immersive Virtual Reality (VR) applications. Accordingly, we study the impact that human factors such as gender, age, prior computing experience, and smell sensitivity have on 360° mulsemedia QoE. Results offered insight into the potential of 360° mulsemedia to inspire and to enrich experiences for Generation Z - a generation empowered by rapidly advancing technology. Patterns of prior media usage and smell sensitivity also play an important role in influencing the QoE evaluation - users who have a preference for dynamic videos enjoy 360° mulsemedia experiences and find them realistic.
Comparison between Famous Game Engines and Eminent Games
Nowadays, game engines are indispensable for building 3D applications and games, because they appreciably reduce the resources needed to implement obligatory but intricate utilities. This paper describes game engines and their foremost elements, surveys popular games developed with these engines, and compares a number of contemporary engine-built games in terms of their features, development process, and specifications.
A motion control method for a differential drive robot based on human walking for immersive telepresence
This thesis introduces an interface for controlling Differential Drive Robots (DDRs) for telepresence applications. Our goal is to enhance the immersive experience while reducing user discomfort when using Head Mounted Displays (HMDs) and body trackers. The robot is equipped with a 360° camera that captures the Robot Environment (RE). Users wear an HMD and use body trackers to navigate within a Local Environment (LE). Through a live video stream from the robot-mounted camera, users perceive the RE within a virtual sphere known as the Virtual Environment (VE). A proportional controller was employed to facilitate control of the robot, enabling it to replicate the movements of the user. The proposed method uses a chest tracker to control the telepresence robot and focuses on minimizing vection and rotations induced by the robot's motion by modifying the VE, such as rotating and translating it. Experimental results demonstrate the accuracy of the robot in reaching target positions when controlled through the body-tracker interface. They also reveal an optimal VE size that effectively reduces VR sickness and enhances the sense of presence.
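The thesis does not publish its controller gains or interface, but a proportional controller for a differential drive robot conventionally maps position error to forward velocity and heading error to angular velocity, then mixes the two into wheel speeds. The sketch below illustrates that scheme; the function name, gains, and wheel base are assumptions for illustration, not the thesis's values.

```python
import math

def p_controller(robot_pose, target, kp_lin=0.5, kp_ang=1.5, wheel_base=0.4):
    """Proportional control sketch for a differential drive robot.
    robot_pose = (x, y, theta); target = (x, y), e.g. derived from the
    user's chest tracker. Returns (left, right) wheel velocities."""
    x, y, theta = robot_pose
    tx, ty = target
    dx, dy = tx - x, ty - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - theta
    # Wrap the heading error into [-pi, pi] so the robot turns the short way.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    v = kp_lin * distance            # forward speed proportional to distance error
    omega = kp_ang * heading_error   # turn rate proportional to heading error
    left = v - omega * wheel_base / 2.0
    right = v + omega * wheel_base / 2.0
    return left, right

# Facing a target straight ahead drives both wheels equally; a target to
# the left speeds up the right wheel relative to the left one.
straight = p_controller((0.0, 0.0, 0.0), (1.0, 0.0))  # → (0.5, 0.5)
turning = p_controller((0.0, 0.0, 0.0), (0.0, 1.0))
```

Saturating `v` and `omega` to the robot's velocity limits would be a natural refinement, and is likely needed in practice to keep the camera motion comfortable for the HMD user.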