
    User-Adaptive Editing for 360 degree Video Streaming with Deep Reinforcement Learning

    The deployment of 360° video streaming is persistently hindered by the high bandwidth these videos require. Spatially adapting the quality of the sphere to the user's Field of View (FoV) lowers the data rate, but it requires keeping the playback buffer small, predicting the user's motion, or making replacements to keep the buffered qualities up to date with the moving FoV, all three of which are uncertain and risky. We have previously shown that opportunistically regaining control of the FoV with active attention-driving techniques provides additional levers to ease streaming and improve Quality of Experience (QoE). Deep neural networks have recently been shown to achieve the best performance for video streaming adaptation and head motion prediction. This demo presents a step forward in the investigation of deep neural network approaches to obtain user-adaptive and network-adaptive 360° video streaming systems. In this demo, we show how snap-changes, an attention-driving technique, can be automatically modulated by the user's motion to improve the streaming QoE. The snap-changes are controlled by a deep neural network trained on head motion traces with the A3C Deep Reinforcement Learning strategy.
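
    To make the described control loop concrete, below is a minimal sketch (not the authors' implementation) of an actor-critic network in the spirit of A3C that decides, from recent head-motion features, whether to trigger a snap-change. The feature choices, dimensions, and the toy rollout are illustrative assumptions only; the paper's actual state, action, and reward definitions are not reproduced here.

# Minimal actor-critic sketch for snap-change triggering (assumptions, not the demo's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SnapChangeActorCritic(nn.Module):
    """Maps head-motion features to (snap-change policy, state-value estimate)."""

    def __init__(self, feature_dim: int = 8, hidden_dim: int = 64):
        super().__init__()
        self.shared = nn.Sequential(
            nn.Linear(feature_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Actor head: 2 actions (0 = no snap-change, 1 = trigger a snap-change).
        self.policy = nn.Linear(hidden_dim, 2)
        # Critic head: scalar state-value used by the A3C advantage estimate.
        self.value = nn.Linear(hidden_dim, 1)

    def forward(self, features: torch.Tensor):
        h = self.shared(features)
        return F.softmax(self.policy(h), dim=-1), self.value(h)


if __name__ == "__main__":
    # Toy step: features could be recent angular speed, FoV/quality overlap,
    # buffer level, etc. (hypothetical choices, random values here).
    net = SnapChangeActorCritic()
    feats = torch.randn(1, 8)
    probs, value = net(feats)
    action = torch.multinomial(probs, 1).item()
    print(f"P(snap)={probs[0, 1].item():.2f}, action={action}, V={value.item():.2f}")

    In a full A3C setup, several such workers would run in parallel environments (here, simulated streaming sessions replaying head motion traces), accumulating policy-gradient and value losses against a QoE-based reward and asynchronously updating a shared model.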