    Business Case and Technology Analysis for 5G Low Latency Applications

    A large number of new consumer and industrial applications are likely to change the classic operator's business models and open a wide range of new markets. This article analyses the most relevant 5G use cases that require ultra-low latency, from both technical and business perspectives. Low latency services pose challenging requirements on the network, and to fulfil them operators need to invest in costly network upgrades. It is therefore not clear whether such investments will be amortised by these new business models. In light of this, specific applications and requirements are described and the potential market benefits for operators are analysed. Conclusions show that operators have clear opportunities to add value and position themselves strongly with the increasing number of services to be provided by 5G. Comment: 18 pages, 5 figures.

    Single-Board-Computer Clusters for Cloudlet Computing in Internet of Things

    The number of connected sensors and devices is expected to grow to billions in the near future. However, centralised cloud-computing data centres face various challenges in meeting the requirements inherent to Internet of Things (IoT) workloads, such as low latency, high throughput and bandwidth constraints. Edge computing is becoming the standard computing paradigm for latency-sensitive, real-time IoT workloads, since it addresses the aforementioned limitations of centralised cloud-computing models. Such a paradigm relies on bringing computation close to the source of data, which presents serious operational challenges for large-scale cloud-computing providers. In this work, we present an architecture composed of low-cost Single-Board-Computer clusters close to the data sources, combined with centralised cloud-computing data centres. The proposed cost-efficient model may be employed as an alternative to fog computing to meet real-time IoT workload requirements while preserving scalability. We include an extensive empirical analysis to assess the suitability of Single-Board-Computer clusters as cost-effective edge-computing micro data centres. Additionally, we compare the proposed architecture with traditional cloudlet and cloud architectures and evaluate them through extensive simulation. We finally show that acquisition costs can be drastically reduced while maintaining performance levels in data-intensive IoT use cases. Funding: Ministerio de Economía y Competitividad TIN2017-82113-C2-1-R; Ministerio de Economía y Competitividad RTI2018-098062-A-I00; European Union's Horizon 2020 No. 754489; Science Foundation Ireland grant 13/RC/209.
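
    As a rough illustration of the trade-off this work evaluates, the Python sketch below compares a Single-Board-Computer cloudlet with a centralised cloud on response time and acquisition cost. All figures (costs, latencies, cluster size) are hypothetical assumptions made for the sketch, not values from the paper.

        # Hypothetical back-of-the-envelope model comparing a Single-Board-Computer
        # (SBC) cluster cloudlet against a centralised cloud for an IoT workload.
        # All numbers below are illustrative assumptions, not results from the paper.

        SBC_NODE_COST = 80        # EUR per SBC board (assumed)
        CLOUD_SERVER_COST = 4000  # EUR per rack server (assumed)

        def round_trip_ms(network_ms, service_ms):
            """Total response time: network round trip plus on-node service time."""
            return 2 * network_ms + service_ms

        # Edge cloudlet: close to the sensors, modest per-node compute.
        edge = round_trip_ms(network_ms=2, service_ms=40)
        # Centralised cloud: far from the sensors, faster per-request service time.
        cloud = round_trip_ms(network_ms=45, service_ms=10)

        print(f"SBC cloudlet response:  {edge} ms, 8-node cost ~{8 * SBC_NODE_COST} EUR")
        print(f"Central cloud response: {cloud} ms, 1-server cost ~{CLOUD_SERVER_COST} EUR")

    Under these assumed numbers the cloudlet wins on end-to-end latency and acquisition cost while the cloud wins on per-request service time, which mirrors the kind of comparison the paper's empirical analysis and simulations address in far more detail.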

    Foggy clouds and cloudy fogs: a real need for coordinated management of fog-to-cloud computing systems

    The recent advances in cloud services technology are fueling a plethora of information technology innovation, including networking, storage, and computing. Today, various flavors of IoT, cloud computing, and so-called fog computing have evolved, the latter referring to the capabilities of edge devices and users' clients to compute, store, and exchange data among each other and with the cloud. Although the rapid pace of this evolution was not easily foreseeable, today each piece of it facilitates and enables the deployment of what we commonly refer to as smart scenarios, including smart cities, smart transportation, and smart homes. As most current cloud, fog, and network services run simultaneously in each scenario, we observe that we are at the dawn of what may be the next big step in the cloud computing and networking evolution, whereby services may be executed at the network edge, both in parallel and in a coordinated fashion, supported by the unstoppable technology evolution. As edge devices become richer in functionality and smarter, embedding capacities such as storage or processing, as well as new functionalities such as decision making, data collection, forwarding, and sharing, a real need is emerging for coordinated management of fog-to-cloud (F2C) computing systems. This article introduces a layered F2C architecture, its benefits and strengths, and the open research challenges it raises, making the case for the real need for coordinated management. Our architecture, the illustrative use case presented, and a comparative performance analysis, albeit conceptual, all clearly show the way forward toward a new IoT scenario with a set of existing and unforeseen services provided on highly distributed and dynamic compute, storage, and networking resources, bringing together heterogeneous and commodity edge devices, emerging fogs, and conventional clouds.
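
    To make the idea of coordinated layer selection concrete, here is a minimal Python sketch of an F2C-style placement decision that picks the lowest layer able to satisfy a service's latency budget and resource demand. The layer names, latencies and capacities are illustrative assumptions, not the article's actual control architecture.

        # Minimal sketch of a coordinated fog-to-cloud (F2C) placement decision,
        # in the spirit of the layered architecture described above. Layer names,
        # capacities and thresholds are illustrative assumptions.

        LAYERS = [
            # (name, typical latency in ms, available CPU cores)
            ("edge-fog", 5, 4),
            ("aggregation-fog", 15, 32),
            ("cloud", 80, 1024),
        ]

        def place(latency_budget_ms, cpu_cores_needed):
            """Pick the lowest layer that satisfies both latency and capacity."""
            for name, latency, cores in LAYERS:
                if latency <= latency_budget_ms and cores >= cpu_cores_needed:
                    return name
            return None  # no layer can host the service

        print(place(10, 2))     # latency-critical, small service -> "edge-fog"
        print(place(100, 200))  # latency-tolerant, heavy service -> "cloud"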

    Do we all really know what a fog node is? Current trends towards an open definition

    Fog computing has emerged as a promising technology that can bring cloud applications closer to the physical IoT devices at the network edge. While it is widely known what cloud computing is, how data centers can build the cloud infrastructure and how applications can make use of this infrastructure, there is no common picture of what fog computing and, particularly, a fog node, as its main building block, really is. One of the first attempts to define a fog node was made by Cisco, qualifying a fog computing system as a “mini-cloud” located at the edge of the network and implemented through a variety of edge devices, interconnected by a variety of, mostly wireless, communication technologies. Thus, a fog node would be the infrastructure implementing this mini-cloud. Other proposals have their own definition of what a fog node is, usually tied to a specific edge device, a specific use case or an application. In this paper, we first survey the state of the art in technologies for fog computing nodes, paying special attention to the contributions that analyze the role edge devices play in the fog node definition. We summarize and compare the concepts and the lessons learned from their implementation, and show how a conceptual framework is emerging towards a unifying fog node definition. We focus on the core functionalities of a fog node, as well as on the accompanying opportunities and challenges towards their practical realization in the near future.
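
    The core functionalities discussed in the survey can be sketched as a small data model. The Python below is one hypothetical way to encode a fog node's compute, storage, connectivity and emerging roles; it is an illustration, not a standardised or agreed definition.

        # Illustrative data model for the core functionalities attributed to a fog
        # node in the survey above (compute, storage, networking, plus emerging
        # roles such as data collection, forwarding and sharing). Field names and
        # values are assumptions for the sketch.

        from dataclasses import dataclass, field

        @dataclass
        class FogNode:
            node_id: str
            cpu_cores: int
            storage_gb: int
            links: list = field(default_factory=list)   # reachable neighbours / cloud
            roles: set = field(default_factory=set)     # e.g. {"collect", "forward", "share"}

            def can_host(self, cpu_needed: int, storage_needed: int) -> bool:
                """Basic admission check a 'mini-cloud' node might perform."""
                return self.cpu_cores >= cpu_needed and self.storage_gb >= storage_needed

        gateway = FogNode("gw-01", cpu_cores=4, storage_gb=64,
                          links=["cloud"], roles={"collect", "forward"})
        print(gateway.can_host(cpu_needed=2, storage_needed=16))  # True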

    Dense Moving Fog for Intelligent IoT: Key Challenges and Opportunities

    As the ratification of the 5G New Radio technology is being completed, the enabling network architectures are expected to undertake a matching effort. Conventional cloud and edge computing paradigms may thus become insufficient in supporting the increasingly stringent operating requirements of intelligent Internet-of-Things (IoT) devices that can move unpredictably and at high speeds. Complementing these, the concept of fog emerges to deploy cooperative cloud-like functions in the immediate vicinity of various moving devices, such as connected and autonomous vehicles, on the road and in the air. Envisioning a gradual evolution of these infrastructures toward an increasingly dense geographical distribution of fog functionality, in this work we put forward the vision of dense moving fog for intelligent IoT applications. To this aim, we review the recent powerful enablers, outline the main challenges and opportunities, and corroborate the performance benefits of collaborative dense fog operation in a characteristic use case featuring a connected fleet of autonomous vehicles. Comment: 7 pages, 5 figures, 1 table. The work has been accepted for publication in IEEE Communications Magazine, 2019. Copyright may be transferred without notice, after which this version may no longer be accessible.
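
    As a toy illustration of the collaborative use case, the Python sketch below has an IoT device offload to the nearest vehicle currently acting as a moving fog node. The vehicle positions and the nearest-node selection rule are assumptions made for the sketch; the paper's fleet scenario and evaluation are considerably more involved.

        # Toy illustration of offloading to a dense moving fog: a device picks the
        # nearest vehicle currently acting as a fog node. All positions and the
        # selection rule are assumed for the sketch.

        import math

        vehicles = {                     # vehicle id -> (x, y) position in metres
            "car-1": (120.0, 40.0),
            "car-2": (30.0, 10.0),
            "drone-1": (200.0, 150.0),
        }

        def nearest_fog(device_xy, fleet):
            """Return the closest moving fog node to the requesting device."""
            return min(fleet, key=lambda v: math.dist(device_xy, fleet[v]))

        print(nearest_fog((25.0, 5.0), vehicles))  # "car-2"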