
    Spherical clustering of users navigating 360° content

    In Virtual Reality (VR) applications, understanding how users explore omnidirectional content is important to optimize content creation, to develop user-centric services, or even to detect disorders in medical applications. Clustering users based on their common navigation patterns is a first step towards understanding user behaviour. However, classical clustering techniques fail to identify these common paths, since they are usually focused on minimizing a simple distance metric. In this paper, we argue that minimizing such a distance metric does not necessarily identify users who experience similar navigation paths in the VR domain. Therefore, we propose a graph-based method to identify clusters of users who are attending to the same portion of the spherical content over time. The proposed solution takes into account the spherical geometry of the content and aims at clustering users based on the actual overlap of displayed content among users. Our method is tested on real VR user navigation patterns. Results show that our solution leads to clusters in which at least 85% of the content displayed by one user is shared with the other users belonging to the same cluster.
    Comment: 5 pages. Published in: ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
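
    As a rough illustration of the idea of clustering users by the spherical overlap of their viewports rather than by a plain distance metric, the sketch below approximates per-frame overlap from the angular distance between viewport centres on the unit sphere and groups users via connected components of an overlap graph. The field-of-view value, the overlap approximation and the 85% threshold reuse numbers from the abstract only loosely; this is not the paper's exact method.

```python
# Minimal sketch: cluster users by approximate spherical viewport overlap.
# The overlap score and clustering rule are illustrative assumptions.
import numpy as np

def to_unit_vector(yaw, pitch):
    """Map (yaw, pitch) in radians to a point on the unit sphere."""
    return np.array([np.cos(pitch) * np.cos(yaw),
                     np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch)])

def overlap_fraction(traj_a, traj_b, fov=np.radians(100)):
    """Rough per-frame overlap: 1 when viewport centres coincide, 0 beyond one FoV."""
    scores = []
    for (ya, pa), (yb, pb) in zip(traj_a, traj_b):
        cos_angle = np.clip(np.dot(to_unit_vector(ya, pa),
                                   to_unit_vector(yb, pb)), -1.0, 1.0)
        angle = np.arccos(cos_angle)
        scores.append(max(0.0, 1.0 - angle / fov))
    return float(np.mean(scores))

def cluster_users(trajectories, threshold=0.85):
    """Connect users whose average overlap exceeds the threshold,
    then return connected components of the resulting graph as clusters."""
    n = len(trajectories)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if overlap_fraction(trajectories[i], trajectories[j]) >= threshold:
                adj[i].add(j)
                adj[j].add(i)
    seen, clusters = set(), []
    for i in range(n):
        if i in seen:
            continue
        stack, comp = [i], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        clusters.append(sorted(comp))
    return clusters
```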

    Dynamic Adaptive Point Cloud Streaming

    High-quality point clouds have recently gained interest as an emerging form of representing immersive 3D graphics. Unfortunately, these 3D media are bulky and severely bandwidth intensive, which makes them difficult to stream to resource-limited and mobile devices. This has prompted researchers to propose efficient and adaptive approaches for streaming high-quality point clouds. In this paper, we run a pilot study on dynamic adaptive point cloud streaming and extend the concept of dynamic adaptive streaming over HTTP (DASH) towards DASH-PC, a dynamic adaptive bandwidth-efficient and view-aware point cloud streaming system. DASH-PC can tackle the huge bandwidth demands of dense point cloud streaming while at the same time semantically linking to human visual acuity to maintain high visual quality when needed. In order to describe the various quality representations, we propose multiple thinning approaches to spatially sub-sample point clouds in 3D space, and design a DASH Media Presentation Description manifest specific to point cloud streaming. Our initial evaluations show that we can achieve significant bandwidth and performance improvements on dense point cloud streaming with minor negative quality impact compared to the baseline scenario where no adaptation is applied.
    Comment: 6 pages, 23rd ACM Packet Video (PV'18) Workshop, June 12-15, 2018, Amsterdam, Netherlands.
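
    A minimal sketch of the view-aware adaptation idea follows: each thinned representation of the point cloud is described by its point count and bitrate, and the client picks the densest representation that fits the bandwidth budget while meeting a toy visual-acuity point budget. The representation fields, the acuity rule and the numbers are illustrative assumptions, not the DASH-PC manifest or selection logic.

```python
# Hypothetical view-aware representation selection in the spirit of DASH-PC.
from dataclasses import dataclass

@dataclass
class Representation:
    rep_id: str
    points: int          # number of points after thinning
    bitrate_mbps: float  # estimated streaming bitrate

def min_points_for_acuity(viewing_distance_m, full_points=1_000_000):
    """Toy visual-acuity rule: farther objects tolerate sparser clouds."""
    return int(full_points / max(1.0, viewing_distance_m) ** 2)

def select_representation(reps, bandwidth_mbps, viewing_distance_m):
    """Pick the sparsest representation that still meets the acuity-driven
    point budget within the bandwidth budget; otherwise degrade gracefully."""
    needed = min_points_for_acuity(viewing_distance_m)
    feasible = [r for r in reps if r.bitrate_mbps <= bandwidth_mbps]
    if not feasible:
        return min(reps, key=lambda r: r.bitrate_mbps)  # fall back to lowest rate
    good = [r for r in feasible if r.points >= needed]
    return (min(good, key=lambda r: r.points) if good
            else max(feasible, key=lambda r: r.points))

reps = [Representation("full", 1_000_000, 120.0),
        Representation("thin-2x", 500_000, 60.0),
        Representation("thin-4x", 250_000, 30.0)]
print(select_representation(reps, bandwidth_mbps=50.0, viewing_distance_m=2.0).rep_id)
# -> thin-4x
```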

    360° mulsemedia experience over next generation wireless networks - a reinforcement learning approach

    The next generation of wireless networks targets ambitious key performance indicators, such as very low latency, higher data rates and more capacity, paving the way for new generations of video streaming technologies, such as 360° or omnidirectional videos. One application that could revolutionize streaming technology is 360° MULtiple SEnsorial MEDIA (MULSEMEDIA), which enriches 360° video content with other media objects such as olfactory, haptic or even thermoceptic ones. However, the adoption of 360° Mulsemedia applications might be hindered by strict Quality of Service (QoS) requirements, such as very large bandwidth and low latency for fast responsiveness to user inputs, which could impact their Quality of Experience (QoE). To this end, this paper introduces the new concept of 360° Mulsemedia and proposes the use of Reinforcement Learning to enable QoS provisioning over next generation wireless networks, which in turn influences the QoE of the end users.
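
    A minimal sketch of how tabular Q-learning could drive such QoS provisioning is shown below; the state, action and reward definitions (cell-load buckets, bandwidth shares, a scalar QoE proxy) are illustrative assumptions and not the paper's model.

```python
# Hypothetical tabular Q-learning loop for QoS provisioning of a 360° mulsemedia flow.
import random
from collections import defaultdict

ACTIONS = ["low_bw", "medium_bw", "high_bw"]   # radio-resource shares for the flow
q = defaultdict(float)                          # Q[(state, action)]
alpha, gamma, epsilon = 0.1, 0.9, 0.1

def choose_action(state):
    if random.random() < epsilon:
        return random.choice(ACTIONS)                    # explore
    return max(ACTIONS, key=lambda a: q[(state, a)])     # exploit

def update(state, action, reward, next_state):
    """Standard Q-learning update towards reward plus discounted best next value."""
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])

# One illustrative interaction: the reward is a QoE proxy combining delivered
# video/haptic quality and a latency penalty (toy number).
state = "cell_load_high"
action = choose_action(state)
update(state, action, reward=0.7, next_state="cell_load_medium")
```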

    An Edge and Fog Computing Platform for Effective Deployment of 360 Video Applications

    This paper has been presented at the Seventh International Workshop on Cloud Technologies and Energy Efficiency in Mobile Communication Networks (CLEEN 2019): How cloudy and green will mobile network and services be?, 15 April 2019, Marrakech, Morocco. In press.
    Immersive video applications based on 360 video streaming require high-bandwidth, high-reliability and low-latency 5G connectivity, but also flexible, low-latency and cost-effective computing deployment. This paper proposes a novel solution for decomposing and distributing the end-to-end 360 video streaming service across three computing tiers, namely cloud, edge and constrained fog, in order of proximity to the end user client. The streaming service is aided by an adaptive viewport technique. The proposed solution is based on the H2020 5G-CORAL system architecture, using a micro-services-based design and unified orchestration and control across all three tiers based on Fog05. Performance evaluation of the proposed solution shows a noticeable reduction in bandwidth consumption, energy consumption and deployment costs compared to a solution where the streaming service is delivered entirely from one computing location, such as the cloud.
    This work has been partially funded by the H2020 collaborative Europe/Taiwan research project 5G-CORAL (grant no. 761586).
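
    To illustrate the three-tier decomposition, the sketch below assigns hypothetical service components to cloud, edge or constrained-fog hosts according to a latency budget; the component names, latency figures and placement rule are illustrative assumptions, not the 5G-CORAL orchestration logic.

```python
# Hypothetical placement of decomposed 360 streaming functions across tiers.
TIERS = {"fog": 5.0, "edge": 20.0, "cloud": 80.0}   # round-trip latency in ms

COMPONENTS = {
    "viewport_tracking": {"latency_budget_ms": 10.0},   # must stay near the user
    "adaptive_tiling":   {"latency_budget_ms": 30.0},
    "content_origin":    {"latency_budget_ms": 200.0},  # can stay in the cloud
}

def place(components, tiers):
    """Assign each component to the farthest (cheapest) tier that still
    satisfies its latency budget."""
    placement = {}
    for name, req in components.items():
        candidates = [t for t, lat in tiers.items()
                      if lat <= req["latency_budget_ms"]]
        placement[name] = max(candidates, key=lambda t: tiers[t])
    return placement

print(place(COMPONENTS, TIERS))
# {'viewport_tracking': 'fog', 'adaptive_tiling': 'edge', 'content_origin': 'cloud'}
```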