
    Simulating fog and edge computing scenarios: an overview and research challenges

    The fourth industrial revolution heralds a paradigm shift in how people, processes, things, data and networks communicate and connect with each other. Conventional computing infrastructures are struggling to satisfy dramatic growth in demand from a deluge of connected heterogeneous endpoints located at the edge of networks while, at the same time, meeting quality of service levels. The complexity of computing at the edge makes it increasingly difficult for infrastructure providers to plan for and provision resources to meet this demand. While simulation frameworks are used extensively in the modelling of cloud computing environments in order to test and validate technical solutions, they are at a nascent stage of development and adoption for fog and edge computing. This paper provides an overview of challenges posed by fog and edge computing in relation to simulation.

    Addressing the Challenges in Federating Edge Resources

    This book chapter considers how Edge deployments can be brought to bear in a global context by federating them across multiple geographic regions to create a global Edge-based fabric that decentralizes data center computation. This is currently impractical, not only because of technical challenges, but is also shrouded by social, legal and geopolitical issues. In this chapter, we discuss two key challenges - networking and management - in federating Edge deployments. Additionally, we consider resource and modeling challenges that will need to be addressed for a federated Edge. Comment: Book chapter accepted to the Fog and Edge Computing: Principles and Paradigms; Editors Buyya, Sriram

    Integrating Blockchain with Fog and Edge Computing for Micropayment Systems

    Fog computing eliminates the need for a centralized cloud data center, as it allows computation to be performed by nodes closer to the edge. This improves scalability, latency, and throughput compared to traditional cloud environments. Furthermore, with the massive increase of IoT devices expected in the near future, current solutions based on centralized cloud computing may not be suitable. Over the last decade, blockchain has developed into a powerful technology enabling countless applications and opportunities. As both blockchain and fog computing operate on a decentralized framework, their integration can help drive many technologies forward and provide a tremendous advantage in terms of security and cost. Recently, micropayments have been adopted in a large number of applications. However, processing micropayments individually results in higher transaction fees; in some cases the fee can exceed the payment value itself. For this reason, traditional cryptocurrency blockchains such as Bitcoin are inappropriate for micropayment transactions. Using fog computing for micropayments can therefore improve latency and scalability. On the other hand, the increased speed and connection density offered by 5G technology will enable real-time data processing as well as automated transaction processing between connected devices. 5G will allow smart devices to make micropayments by processing data more efficiently, which will have a far-reaching impact on business financial management. 6G networks will exhibit more heterogeneity than 5G, enabling different types of devices to communicate efficiently. This will enhance micropayment networks, where different types of IoT devices will be able to connect and hence process payments and transactions more securely.
    Integrating this intelligence with big data in blockchain and fog computing will change traditional business models and support the creation of efficient and fast micropayment systems. This chapter explains the benefits of integrating modern technologies (fog computing, blockchain, 6G, and IoT) to solve the problem of micropayment systems. This is achieved by utilizing the capabilities of each technology (e.g., edge computing, blockchain, the evolution of 5G to 6G) to bring intelligence from centralized computing facilities to edge/fog devices, enabling envisioned applications such as micropayment systems in which reliable, cheap, high-speed, secure, and low-latency transaction processing can be achieved. The chapter also highlights the relationships among these technologies and surveys the most relevant work in order to analyze how these disruptive technologies could improve micropayment system functionality. Furthermore, various forms of integration of these technologies and associated applications are discussed, and solutions and challenges are outlined. The chapter also briefly discusses a generic solution to the problem of micropayments that integrates fog computing capabilities, blockchain, and edge computing to provide a practical payment setup allowing customers to issue micropayments in a convenient, fast, and secure manner
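    The core economic argument of this abstract - that individually settled micropayments are dominated by transaction fees, while a fog node can aggregate them off-chain and settle once - can be illustrated with a minimal sketch. This is a hypothetical toy, not the chapter's actual design; the class name, the settlement threshold, and the in-memory "settle" step are all assumptions standing in for a real blockchain submission.

    ```python
    from collections import defaultdict

    SETTLE_THRESHOLD = 100  # hypothetical: settle on-chain once 100 units accrue


    class FogMicropaymentAggregator:
        """Toy fog-node aggregator: accumulate micropayments off-chain and
        settle a single aggregate transaction, amortising the per-transaction
        fee across many small payments."""

        def __init__(self):
            self.pending = defaultdict(int)   # (payer, payee) -> accrued amount
            self.settled = []                 # stand-in for on-chain transactions

        def pay(self, payer, payee, amount):
            """Record one micropayment; settle when the threshold is reached."""
            self.pending[(payer, payee)] += amount
            if self.pending[(payer, payee)] >= SETTLE_THRESHOLD:
                self.settle(payer, payee)

        def settle(self, payer, payee):
            """In a real system this would submit one blockchain transaction."""
            amount = self.pending.pop((payer, payee))
            self.settled.append((payer, payee, amount))
    ```

    A hundred one-unit payments would trigger a single settlement of 100 units, i.e. one fee instead of one hundred - the fee-amortisation effect the chapter argues for.
    
    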

    Cooperative scheduling and load balancing techniques in fog and edge computing

    Fog and Edge Computing are two models that reached maturity in the last decade. Today they are solid concepts, and a large body of literature has worked to develop them. Supported by the development of technologies such as 5G, they can now be considered de facto standards for building low- and ultra-low-latency applications, privacy-oriented solutions, Industry 4.0, and smart-city infrastructures. The common trait of Fog and Edge computing environments is their inherently distributed and heterogeneous nature, in which the multiple (Fog or Edge) nodes interact with each other with the essential purpose of pre-processing the data gathered by the countless sensors to which they are connected, even by running significant ML models and relying upon specific processors (TPUs). However, nodes are often placed in a geographic domain, such as a smart city, and the dynamics of traffic during the day may cause some nodes to be overwhelmed by requests while others become completely idle. To achieve optimal usage of the system, and to guarantee the best possible QoS for all users connected to the Fog or Edge nodes, load balancing and scheduling algorithms must be designed. In particular, a reasonable solution is to enable nodes to cooperate. This capability represents the main objective of this thesis: the design of fully distributed algorithms and solutions that balance the load across all the nodes, also following, where possible, QoS requirements in terms of latency or imposing constraints on power consumption when the nodes are powered by green energy sources. Unfortunately, when a central orchestrator is missing, a crucial element that makes the design of such algorithms difficult is that nodes need to know the state of the others in order to make the best possible scheduling decision.
    However, the state cannot be retrieved without introducing further latency while serving the request, and the retrieved state information is always stale; consequently, decisions always rely on imprecise data. In this thesis, the problem is circumvented in two main ways. The first considers randomised algorithms that avoid probing all of the neighbour nodes in favour of at most two nodes picked at random; this is proven to bring an exponential improvement in performance over probing a single node. The second approach considers Reinforcement Learning as a technique for inferring the state of the other nodes from the reward received by the agents when requests are forwarded. Moreover, the thesis also focuses on the energy aspect of Edge devices: it analyses a Green Edge Computing scenario, where devices are powered only by photovoltaic panels, and a mobile-offloading scenario targeting ML image-inference applications. Lastly, a final glance is given to a series of infrastructural studies, which lay the foundations for implementing the proposed algorithms on real devices, in particular Single Board Computers (SBCs). A structural scheme of a testbed of Raspberry Pi boards is presented, along with a fully-fledged framework called ``P2PFaaS'' which allows the implementation of load balancing and scheduling algorithms based on the Function-as-a-Service (FaaS) paradigm
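    The randomised strategy the abstract describes - probing two random nodes instead of all of them - is the classic "power-of-d-choices" load-balancing scheme. The toy simulation below illustrates the idea only; it is not the thesis's implementation, and the node/task counts are arbitrary assumptions.

    ```python
    import random


    def schedule(num_nodes, num_tasks, d, seed=0):
        """Toy power-of-d-choices simulation: each task probes d nodes chosen
        uniformly at random and is sent to the least loaded of them.
        Returns the final per-node load vector."""
        rng = random.Random(seed)
        loads = [0] * num_nodes
        for _ in range(num_tasks):
            candidates = rng.sample(range(num_nodes), d)
            target = min(candidates, key=lambda i: loads[i])
            loads[target] += 1
        return loads


    # d=1 is pure random placement; d=2 probes just one extra node per task,
    # yet is known to reduce the maximum load exponentially in theory.
    random_max = max(schedule(100, 1000, d=1))
    two_choice_max = max(schedule(100, 1000, d=2))
    ```

    Comparing `random_max` and `two_choice_max` over a few seeds shows the imbalance shrinking sharply once a second probe is allowed, which is the "exponential improvement" result the thesis builds on.
    
    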

    Edge computing and IoT analytics for agile optimization in intelligent transportation systems

    With the emergence of fog and edge computing, new possibilities arise regarding the data-driven management of citizens' mobility in smart cities. Internet of Things (IoT) analytics refers to the use of these technologies, data, and analytical models to describe the current status of city traffic, to predict its evolution over the coming hours, and to make decisions that increase the efficiency of the transportation system. It involves many challenges, such as how to manage huge amounts of real data, and how to improve security, privacy, scalability, reliability, and quality of service in the cloud and vehicular networks. In this paper, we review the state of the art of IoT in intelligent transportation systems (ITS), identify challenges posed by cloud, fog, and edge computing in ITS, and develop a methodology based on agile optimization algorithms for solving a dynamic ride-sharing problem (DRSP) in the context of edge/fog computing. These algorithms allow us to process, in real time, the data gathered from IoT systems in order to optimize automatic decisions in the city transportation system, including: optimizing vehicle routing, recommending customized transportation modes to citizens, generating efficient ride-sharing and car-sharing strategies, and creating optimal charging stations for electric vehicles and different services within urban and interurban areas. A numerical example considering a DRSP is provided, in which the potential of employing edge/fog computing, open data, and agile algorithms is illustrated. This work was partially supported by the Spanish Ministry of Science (PID2019111100RB-C21/AEI/10.13039/501100011033, RED2018-102642-T), and the Erasmus+ program (2019I-ES01-KA103-062602). Peyman, M.; Copado, PJ.; Tordecilla, RD.; Do C. Martins, L.; Xhafa, F.; Juan-Pérez, ÁA. (2021). Edge computing and IoT analytics for agile optimization in intelligent transportation systems. Energies. 14(19):1-26.
    https://doi.org/10.3390/en14196309
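    The real-time decisions this paper targets (e.g. matching incoming ride requests to vehicles) are typically driven by fast heuristics. The sketch below is only an illustrative greedy nearest-free-driver heuristic in that spirit - it is not the paper's agile optimization algorithm, and the coordinates and IDs are invented for the example.

    ```python
    import math


    def greedy_assign(drivers, requests):
        """Toy heuristic for a dynamic ride-sharing setting: assign each
        incoming request to the nearest currently free driver (Euclidean
        distance). drivers/requests are (id, (x, y)) pairs."""
        free = dict(drivers)          # driver id -> position
        plan = {}                     # request id -> driver id
        for req_id, pos in requests:
            if not free:
                break                 # no free drivers left
            best = min(free, key=lambda d: math.dist(free[d], pos))
            plan[req_id] = best
            free.pop(best)            # driver is now busy
        return plan


    plan = greedy_assign(
        drivers=[("d1", (0, 0)), ("d2", (10, 10))],
        requests=[("r1", (1, 1)), ("r2", (9, 9))],
    )
    ```

    Such a greedy pass runs in milliseconds at the edge; the paper's contribution is precisely to embed smarter (agile, biased-randomised) variants of this decision loop into the edge/fog pipeline.
    
    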

    Big Data Pipelines on the Computing Continuum: Ecosystem and Use Cases Overview

    Organisations possess and continuously generate huge amounts of static and stream data, especially with the proliferation of Internet of Things technologies. Collected but unused data, i.e., Dark Data, means a loss in value-creation potential. In this respect, the concept of the Computing Continuum extends the traditional, more centralised Cloud Computing paradigm with Fog and Edge Computing in order to ensure low-latency pre-processing and filtering close to the data sources. However, major challenges remain to be addressed, in particular related to the management of the various phases of Big Data processing on the Computing Continuum. In this paper, we set forth an ecosystem for Big Data pipelines on the Computing Continuum and introduce five relevant real-life example use cases in the context of the proposed ecosystem.