4,197 research outputs found

    Streaming Content from a Vehicular Cloud

    Network densification via small cells is considered a key step to cope with the data tsunami. Caching data at small cells, or even on user devices, is also seen as a promising way to alleviate the backhaul congestion this densification might cause. However, the former suffers from high deployment and maintenance costs, and the latter from limited resources and privacy issues on user devices. We argue that an architecture with (public or private) vehicles acting as mobile caches and communication relays might be a promising middle ground. In this paper, we assume such a vehicular cloud is in place to provide video streaming to users, and that the operator can decide which content to store in the vehicle caches. Users then greedily fill their playout buffer with video pieces of the streamed content from encountered vehicles, and turn to the infrastructure immediately when the playout buffer is empty, to ensure uninterrupted streaming. Our main contribution is to model the playout buffer in the user device with a queuing approach, and to provide a mathematical formulation for the idle periods of this buffer, which relate to the bytes downloaded from the cellular infrastructure. We also solve the resulting content allocation problem, and perform trace-based simulations to show that up to 50% of the original traffic could be offloaded from the main infrastructure.
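
    The buffer dynamics described above can be sketched as a minimal discrete-time simulation. All parameters here (encounter rate, pieces delivered per encounter) are illustrative assumptions, not values from the paper; the paper's actual contribution is an analytical queuing model, which this toy loop only mimics numerically:

```python
import random

def simulate_playout(sim_time=10_000, encounter_rate=0.15,
                     pieces_per_encounter=10, seed=42):
    """Toy sketch: one video piece is consumed per time step;
    encountered vehicles greedily refill the playout buffer, and the
    cellular infrastructure covers the steps where the buffer is empty
    (the buffer's idle periods)."""
    rng = random.Random(seed)
    buffer = 0     # pieces queued in the playout buffer
    cellular = 0   # pieces fetched from the cellular infrastructure
    for _ in range(sim_time):
        if rng.random() < encounter_rate:   # a vehicle is encountered
            buffer += pieces_per_encounter  # greedy prefetch from its cache
        if buffer > 0:
            buffer -= 1                     # play from the buffer
        else:
            cellular += 1                   # idle period: fall back to cellular
    return cellular / sim_time              # fraction NOT offloaded

fraction_from_cellular = simulate_playout()
```

    Sweeping `encounter_rate` in such a sketch shows the intuition behind the paper's result: the denser the vehicular cloud, the shorter the buffer's idle periods and the larger the fraction of traffic offloaded.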

    Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. Besides that, we evaluate such a migration of video services. Finally, we present potential research challenges and trends.

    Applications of Fog Computing in Video Streaming

    The purpose of this paper is to show the viability of fog computing in the area of video streaming in vehicles. With the rise of autonomous vehicles, there needs to be a viable entertainment option for users. The cloud fails to address these needs due to latency problems experienced during high internet traffic. To improve video streaming speeds, fog computing seems to be the best option. Fog computing brings the cloud closer to the user through the use of intermediary devices known as fog nodes. It does not attempt to replace the cloud but to improve it by allowing faster upload and download of information. This paper explores two algorithms that would work well with vehicles and video streaming. These are simulated using a Java application, and the results are then graphically represented. The results showed that the simulation was an accurate model and that the best algorithm for request history maintenance was the variable model.
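
    The abstract does not specify how its two request-history algorithms work, so as a rough, hypothetical illustration of the general idea — a fog node maintaining a bounded request history and caching the currently most popular videos — one might sketch:

```python
from collections import deque, Counter

class FogNodeCache:
    """Hypothetical sketch (not the paper's algorithm): a fog node
    keeps the `capacity` most-requested videos according to a bounded
    sliding window of recent requests."""
    def __init__(self, capacity=3, window=100):
        self.capacity = capacity
        self.history = deque(maxlen=window)  # bounded request history
        self.cached = set()

    def request(self, video_id):
        """Record a request; return True on a cache hit."""
        hit = video_id in self.cached
        self.history.append(video_id)
        # Refresh the cache to the currently most popular videos.
        top = Counter(self.history).most_common(self.capacity)
        self.cached = {video for video, _ in top}
        return hit
```

    A larger `window` makes the cache track long-term popularity; a smaller one makes it react faster to shifting demand — the kind of trade-off a request-history maintenance policy has to balance.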

    The Price of Fog: a Data-Driven Study on Caching Architectures in Vehicular Networks

    Vehicular users are expected to consume large amounts of data, for both entertainment and navigation purposes. This will put a strain on cellular networks, which will be able to cope with such a load only if proper caching is in place. This, in turn, begs the question of which caching architecture is best suited to deal with vehicular content consumption. In this paper, we leverage a large-scale, crowd-collected trace to (i) characterize the vehicular traffic demand, in terms of overall magnitude and content breakup, (ii) assess how different caching approaches perform against such a real-world load, and (iii) study the effect of recommendation systems and local contents. We define a price-of-fog metric, expressing the additional caching capacity to deploy when moving from traditional, centralized caching architectures to a "fog computing" approach, where caches are closer to the network edge. We find that for location-specific contents, such as the ones that vehicular users are most likely to request, this price almost disappears. Vehicular networks thus make a strong case for the adoption of mobile-edge caching, as we are able to reap the benefits thereof -- including a reduction in the distance traveled by data within the core network -- with few or none of the associated disadvantages.
    Comment: ACM IoV-VoI 2016 MobiHoc Workshop, The 17th ACM International Symposium on Mobile Ad Hoc Networking and Computing: MobiHoc 2016-IoV-VoI Workshop, Paderborn, Germany
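
    The intuition behind the price-of-fog metric can be illustrated with a toy calculation (the exact definition in the paper may differ; here it is taken as the ratio of total edge capacity to the centralized capacity covering the same demand):

```python
def price_of_fog(contents_per_cell):
    """Toy illustration: contents_per_cell maps each edge cell to the
    set of items requested there. Edge caching must provision capacity
    per cell; a central cache only needs one copy of each distinct item."""
    edge_capacity = sum(len(cell) for cell in contents_per_cell)
    central_capacity = len(set().union(*contents_per_cell))
    return edge_capacity / central_capacity

# Globally popular content: every cell replicates the same catalog,
# so the price of fog grows with the number of cells.
global_demand = [set(range(100)) for _ in range(10)]

# Location-specific content: each cell stores distinct local items,
# so the price of fog collapses to 1 -- it "almost disappears".
local_demand = [set(range(i * 100, (i + 1) * 100)) for i in range(10)]
```

    This mirrors the paper's finding: for location-specific content, distributing caches toward the edge costs essentially no extra capacity.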