    Applications of Fog Computing in Video Streaming

    The purpose of this paper is to show the viability of fog computing for video streaming in vehicles. With the rise of autonomous vehicles, passengers need a viable in-vehicle entertainment option. The cloud struggles to meet this need because of the latency experienced during periods of high internet traffic. Fog computing is a promising way to improve video streaming speeds: it brings the cloud closer to the user through intermediary devices known as fog nodes. It does not attempt to replace the cloud but to improve it by enabling faster upload and download of information. This paper explores two algorithms suited to vehicles and video streaming, simulates them in a Java application, and presents the results graphically. The results showed that the simulation was an accurate model and that the best algorithm for request history maintenance was the variable model.
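
    The paper's Java simulation is not reproduced here; purely as an illustration of what request history maintenance on a fog node could look like, the hypothetical sketch below keeps a bounded, least-recently-used history of video-segment requests and serves repeated requests from the fog node instead of the cloud. The class, method names, and eviction policy are assumptions, not taken from the paper.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical fog node that remembers recently requested video segments and
    // answers repeat requests locally instead of going back to the cloud.
    public class FogNodeCache {
        private final int maxHistory;                   // upper bound on the request history
        private final Map<String, byte[]> history;      // segmentId -> cached segment bytes

        public FogNodeCache(int maxHistory) {
            this.maxHistory = maxHistory;
            // Access-ordered map: the least recently requested segment is evicted first.
            this.history = new LinkedHashMap<String, byte[]>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<String, byte[]> eldest) {
                    return size() > FogNodeCache.this.maxHistory;
                }
            };
        }

        public byte[] requestSegment(String segmentId) {
            byte[] cached = history.get(segmentId);
            if (cached != null) {
                return cached;                          // served by the fog node (low latency)
            }
            byte[] fetched = fetchFromCloud(segmentId); // fall back to the cloud (higher latency)
            history.put(segmentId, fetched);
            return fetched;
        }

        private byte[] fetchFromCloud(String segmentId) {
            return new byte[0];                         // placeholder for the cloud round trip
        }
    }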

    ENORM: A Framework For Edge NOde Resource Management

    Current computing techniques that use the cloud as a centralised server will become untenable as billions of devices are connected to the Internet. This raises the need for fog computing, which leverages computing at the edge of the network on nodes such as routers, base stations and switches, alongside the cloud. However, to realise fog computing, the challenge of managing edge nodes must be addressed. This paper addresses the resource management challenge. We develop the first framework to manage edge nodes, namely the Edge NOde Resource Management (ENORM) framework, and propose mechanisms for provisioning and auto-scaling edge node resources. The feasibility of the framework is demonstrated on a Pokémon Go-like online game use case. ENORM reduces application latency by 20%-80% and reduces data transfer and communication frequency between the edge node and the cloud by up to 95%. These results highlight the potential of fog computing for improving quality of service and experience. Comment: 14 pages; accepted to IEEE Transactions on Services Computing on 12 September 201
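
    ENORM's own provisioning and auto-scaling mechanisms are described in the paper; the hypothetical sketch below only illustrates the general idea of auto-scaling an edge node's resources by comparing observed application latency against an assumed target and growing or shrinking the allocation accordingly. All names and thresholds are invented for illustration.

    // Hypothetical edge-node auto-scaler: grow or shrink the resources given to an
    // edge container based on how far observed latency is from an assumed target.
    public class EdgeAutoScaler {
        private static final double TARGET_LATENCY_MS = 50.0;  // assumed service-level target
        private static final double HEADROOM = 0.2;            // 20% tolerance band around the target

        private int allocatedCpuShares = 2;                     // current allocation on the edge node

        public void evaluate(double observedLatencyMs) {
            if (observedLatencyMs > TARGET_LATENCY_MS * (1 + HEADROOM)) {
                allocatedCpuShares++;                           // scale up: latency above target
            } else if (observedLatencyMs < TARGET_LATENCY_MS * (1 - HEADROOM) && allocatedCpuShares > 1) {
                allocatedCpuShares--;                           // scale down: resources under-used
            }
            applyAllocation(allocatedCpuShares);
        }

        private void applyAllocation(int cpuShares) {
            // A real deployment would reconfigure the container or VM on the edge node here.
            System.out.println("Edge container allocated " + cpuShares + " CPU shares");
        }
    }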

    Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing into such networks does not provide an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided by fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing constraints. To the best of our knowledge, existing work on fog computing focuses on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support, and we evaluate such migration for video services. Finally, we present potential research challenges and trends.
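
    The article describes migration as being driven by request patterns and timing; as a minimal, hypothetical sketch of such a trigger (not taken from the article), the code below migrates a video service from the cloud to a fog node only after regional demand has stayed above a threshold for a sustained period. The threshold, window, and names are assumptions.

    // Hypothetical migration trigger: migrate the video service from the cloud to a
    // fog node once the regional request rate has exceeded a threshold long enough.
    public class MigrationTrigger {
        private static final double REQUEST_RATE_THRESHOLD = 100.0; // requests per second (assumed)
        private static final long SUSTAINED_MILLIS = 30_000;        // demand must persist for 30 s

        private long aboveThresholdSince = -1;

        public boolean shouldMigrate(double currentRequestRate, long nowMillis) {
            if (currentRequestRate < REQUEST_RATE_THRESHOLD) {
                aboveThresholdSince = -1;                            // demand dropped; reset the timer
                return false;
            }
            if (aboveThresholdSince < 0) {
                aboveThresholdSince = nowMillis;                     // start timing the demand spike
            }
            return nowMillis - aboveThresholdSince >= SUSTAINED_MILLIS;
        }
    }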

    Wearable Communications in 5G: Challenges and Enabling Technologies

    As wearable devices become more ingrained in our daily lives, traditional communication networks designed primarily for human-oriented applications face tremendous challenges. The upcoming 5G wireless system aims to support unprecedentedly high capacity, low latency, and massive connectivity. In this article, we evaluate the key challenges in wearable communications. A cloud/edge communication architecture is presented that integrates the cloud radio access network, software-defined networking, device-to-device communications, and cloud/edge technologies. Computation offloading enabled by this multi-layer architecture can move computation-intensive and latency-stringent applications to nearby devices through device-to-device communications, or to nearby edge nodes through cellular or other wireless technologies. Critical issues faced by wearable communications, such as short battery life, limited computing capability, and stringent latency, can be greatly alleviated by this cloud/edge architecture. Together with the presented architecture, current transmission and networking technologies, including non-orthogonal multiple access, mobile edge computing, and energy harvesting, can greatly enhance the performance of wearable communications in terms of spectral efficiency, energy efficiency, latency, and connectivity. Comment: This work has been accepted by IEEE Vehicular Technology Magazine
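
    To make the offloading idea concrete, the following hypothetical sketch (not from the article) chooses where a wearable's task should run: locally if the deadline can be met on the device, otherwise over the lower-latency of the two offloading paths, a nearby device via device-to-device communications or an edge node. All parameters are illustrative assumptions.

    // Hypothetical offloading decision for a wearable: run locally when feasible,
    // otherwise offload over the lower-latency path (device-to-device or edge node).
    public class OffloadDecision {
        enum Target { LOCAL, NEARBY_DEVICE, EDGE_NODE }

        public Target choose(double taskCycles, double deadlineMs, double localCyclesPerMs,
                             double d2dLatencyMs, double edgeLatencyMs) {
            double localTimeMs = taskCycles / localCyclesPerMs;
            if (localTimeMs <= deadlineMs) {
                return Target.LOCAL;                     // the wearable meets the deadline on its own
            }
            // Otherwise offload over the lower-latency path: a nearby device via
            // device-to-device communications, or an edge node over cellular/Wi-Fi.
            return d2dLatencyMs <= edgeLatencyMs ? Target.NEARBY_DEVICE : Target.EDGE_NODE;
        }
    }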