
    Edge Offloading in Smart Grid

    The energy transition supports the shift towards more sustainable energy alternatives, paving the way towards decentralized smart grids, where energy is generated closer to the point of use. Decentralized smart grids foresee novel data-driven, low-latency applications for improving resilience and responsiveness, such as peer-to-peer energy trading, microgrid control, fault detection, or demand response. However, traditional cloud-based smart grid architectures are unable to meet the low-latency and high-reliability requirements of these emerging applications; thus, alternative architectures such as edge, fog, or hybrid models need to be adopted. Moreover, edge offloading can play a pivotal role for next-generation smart grid AI applications because it enables efficient utilization of computing resources and addresses the challenge of the increasing data generated by IoT devices, optimizing response time, energy consumption, and network performance. However, a comprehensive overview of the current state of research is needed to support sound decisions regarding the offloading of energy-related applications from cloud to fog or edge, focusing on open smart grid challenges and potential impacts. In this paper, we delve into smart grid and computational distribution architectures, including edge-fog-cloud models, orchestration architecture, and serverless computing, and analyze the decision-making variables and optimization algorithms used to assess the efficiency of edge offloading. Finally, the work contributes to a comprehensive understanding of edge offloading in the smart grid, providing a SWOT analysis to support decision making.
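
    As a rough illustration of the decision-making variables such a survey analyzes, the sketch below weighs local execution against offloading to an edge node using a weighted latency/energy cost. All task sizes, node speeds, power figures, and weights are made-up assumptions for the example, not values from the paper.

```python
# Hypothetical sketch of an edge-offloading decision for a smart-grid task:
# compare estimated local execution against offloading to an edge node using
# common decision variables (latency, energy, uplink bandwidth).

from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required
    input_bits: float    # data to upload if offloaded

@dataclass
class Node:
    cpu_hz: float            # processing speed
    joules_per_cycle: float  # energy cost of local computation

def should_offload(task: Task, device: Node, edge: Node,
                   uplink_bps: float, tx_watts: float,
                   w_latency: float = 0.5, w_energy: float = 0.5) -> bool:
    """Return True if the weighted latency/energy cost is lower at the edge."""
    # Local execution: latency and energy on the IoT device itself.
    local_latency = task.cycles / device.cpu_hz
    local_energy = task.cycles * device.joules_per_cycle

    # Offloaded execution: upload time/energy plus remote compute time
    # (device-side idle energy while the edge computes is ignored for brevity).
    tx_time = task.input_bits / uplink_bps
    edge_latency = tx_time + task.cycles / edge.cpu_hz
    edge_energy = tx_time * tx_watts

    local_cost = w_latency * local_latency + w_energy * local_energy
    edge_cost = w_latency * edge_latency + w_energy * edge_energy
    return edge_cost < local_cost

# Example: a fault-detection task on a constrained device vs. a nearby edge node.
task = Task(cycles=2e9, input_bits=8e6)
device = Node(cpu_hz=1e9, joules_per_cycle=1e-9)
edge = Node(cpu_hz=8e9, joules_per_cycle=5e-10)
print(should_offload(task, device, edge, uplink_bps=20e6, tx_watts=0.5))
```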

    Optimizing task allocation for edge compute micro-clusters

    There are over 30 billion devices at the network edge. This is largely driven by the unprecedented growth of the Internet-of-Things (IoT) and 5G technologies. These devices are being used in various applications and technologies, including but not limited to smart city systems, innovative agriculture management systems, and intelligent home systems. Deployment issues like networking and privacy problems dictate that computing should occur close to the data source, at or near the network edge. Edge and fog computing are recent decentralised computing paradigms proposed to augment cloud services by extending computing and storage capabilities to the network's edge, enabling computational workloads to be executed locally. These benefits help to reduce the strain on the networking backhaul, improve network latency and enhance application responsiveness. Many edge and fog computing deployment solutions and infrastructures are being employed to deliver cloud resources and services at the edge of the network, for example cloudless and mobile edge computing. This thesis focuses on edge micro-cluster platforms for edge computing. Edge computing micro-cluster platforms are small, compact, and decentralised groups of interconnected computing resources located close to the edge of a network. These micro-clusters typically comprise a variety of heterogeneous but resource-constrained computing resources, such as small compute nodes like Single Board Computers (SBCs), storage devices, and networking equipment, deployed in local area networks for settings such as smart home management. The goal of edge computing micro-clusters is to bring computation and data storage closer to IoT devices and sensors to improve the performance and reliability of distributed systems. Resource management and workload allocation represent a substantial challenge for such resource-limited and heterogeneous micro-clusters because of the diversity in system architecture. Therefore, task allocation and workload management are complex problems in such micro-clusters. This thesis investigates the feasibility of edge micro-cluster platforms for edge computation. Specifically, the thesis examines the performance of micro-clusters in executing IoT applications. Furthermore, the thesis evaluates various optimisation techniques for task allocation and workload management in edge compute micro-cluster platforms, applying simple heuristics, mathematical optimisation and metaheuristic techniques to task allocation problems in reconfigurable edge computing micro-clusters. The implementation and performance evaluations take place in a realistic configured edge environment, using a constructed micro-cluster system comprising a group of heterogeneous computing nodes and a set of edge-relevant benchmark applications. The research overall characterises and demonstrates a feasible use case for micro-cluster platforms in edge computing environments and provides insight into the performance of various task allocation optimisation techniques for such micro-cluster systems.
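
    For illustration, here is a minimal sketch of one simple heuristic from the family such a thesis evaluates: greedily placing each task on the heterogeneous node with the earliest estimated finish time. Node names, speeds, and task sizes are invented for the example and are not taken from the thesis.

```python
# Hypothetical greedy task-allocation heuristic for a heterogeneous micro-cluster:
# assign each task to the node that would finish it earliest, longest tasks first.

def allocate_earliest_finish(tasks, nodes):
    """tasks: {name: work units}; nodes: {name: speed in units/s}.
    Returns a mapping task -> node and the resulting makespan."""
    finish_time = {n: 0.0 for n in nodes}      # when each node becomes free
    placement = {}
    # Placing the longest tasks first tends to balance heterogeneous nodes better.
    for task, work in sorted(tasks.items(), key=lambda kv: -kv[1]):
        best = min(nodes, key=lambda n: finish_time[n] + work / nodes[n])
        finish_time[best] += work / nodes[best]
        placement[task] = best
    return placement, max(finish_time.values())

# Example micro-cluster of SBC-class nodes with different (made-up) speeds.
nodes = {"rpi4": 4.0, "rpi3": 2.0, "odroid": 6.0}
tasks = {"t1": 12, "t2": 8, "t3": 8, "t4": 4, "t5": 2}
placement, makespan = allocate_earliest_finish(tasks, nodes)
print(placement, makespan)
```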

    ROUTER:Fog Enabled Cloud based Intelligent Resource Management Approach for Smart Home IoT Devices

    There is a growing requirement for Internet of Things (IoT) infrastructure to ensure low response time to provision latency-sensitive real-time applications such as health monitoring, disaster management, and smart homes. Fog computing offers a means to provide such requirements via a virtualized intermediate layer that provides data, computation, storage, and networking services between Cloud datacenters and end users. A key element within such Fog computing environments is resource management. While resource managers for Fog computing exist, they focus only on a subset of the parameters important to Fog resource management, encompassing system response time, network bandwidth, energy consumption and latency. To date, no existing Fog resource manager considers these parameters simultaneously for decision making, which in the context of smart homes will become increasingly key. In this paper, we propose a novel resource management technique (ROUTER) for fog-enabled Cloud computing environments, which leverages Particle Swarm Optimization to optimize these parameters simultaneously. The approach is validated within an IoT-based smart home automation scenario and evaluated within the iFogSim toolkit, driven by empirical models from a small-scale smart home experiment. Results demonstrate that our approach achieves a reduction of 12% in network bandwidth, 10% in response time, 14% in latency and 12.35% in energy consumption.
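
    To make the optimization step concrete, the following is a minimal, generic Particle Swarm Optimization sketch with a weighted-sum fitness over the four parameters ROUTER targets. The cost model, weights, and coefficients are placeholders for illustration, not the paper's actual formulation or its iFogSim integration.

```python
# Generic PSO sketch: particles encode a candidate allocation vector and the
# fitness is a weighted sum of stand-in response-time, bandwidth, latency and
# energy terms (the real objective would come from the fog simulation).

import random

def fitness(x):
    # Placeholder cost: pretend each dimension drives one metric and penalise
    # deviation from an arbitrary "ideal" operating point.
    response, bandwidth, latency, energy = (abs(v - t)
                                            for v, t in zip(x, (0.2, 0.5, 0.1, 0.3)))
    return 0.25 * response + 0.25 * bandwidth + 0.25 * latency + 0.25 * energy

def pso(dim=4, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.random() for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [fitness(p) for p in pos]
    gbest = min(pbest, key=fitness)[:]          # global best position
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < fitness(gbest):
                    gbest = pos[i][:]
    return gbest, fitness(gbest)

print(pso())
```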

    A Survey and Future Directions on Clustering: From WSNs to IoT and Modern Networking Paradigms

    Many Internet of Things (IoT) networks are created as an overlay over traditional ad-hoc networks such as Zigbee. Moreover, IoT networks can resemble ad-hoc networks over networks that support device-to-device (D2D) communication, e.g., D2D-enabled cellular networks and WiFi-Direct. In these ad-hoc types of IoT networks, efficient topology management is a crucial requirement, in particular in massive-scale deployments. Traditionally, clustering has been recognized as a common approach for topology management in ad-hoc networks, e.g., in Wireless Sensor Networks (WSNs). Topology management in WSNs and ad-hoc IoT networks has many design commonalities, as both need to transfer data to the destination hop by hop. Thus, WSN clustering techniques can presumably be applied for topology management in ad-hoc IoT networks. This requires a comprehensive study of WSN clustering techniques and an investigation of their applicability to ad-hoc IoT networks. In this article, we conduct a survey of this field based on the objectives for clustering, such as reducing energy consumption and load balancing, as well as the network properties relevant for efficient clustering in IoT, such as network heterogeneity and mobility. Beyond that, we investigate the advantages and challenges of clustering when IoT is integrated with modern computing and communication technologies such as Blockchain, Fog/Edge computing, and 5G. This survey provides useful insights into research on IoT clustering, allows a broader understanding of its design challenges for IoT networks, and sheds light on its future applications in modern technologies integrated with IoT.
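
    As a small illustration of the kind of clustering such surveys review, the sketch below selects the highest-residual-energy nodes as cluster heads and attaches the remaining nodes to their nearest head. Coordinates, energies, and the selection rule are illustrative assumptions, not a specific protocol from the survey.

```python
# Hypothetical energy-aware clustering sketch for a small WSN/IoT topology:
# pick cluster heads by residual energy, then assign members by distance.

import math

def cluster(nodes, num_heads):
    """nodes: {id: (x, y, residual_energy)} -> (heads, {member: head})."""
    heads = sorted(nodes, key=lambda n: nodes[n][2], reverse=True)[:num_heads]
    assignment = {}
    for n, (x, y, _) in nodes.items():
        if n in heads:
            continue
        # Attach each remaining node to the geometrically nearest cluster head.
        assignment[n] = min(
            heads, key=lambda h: math.hypot(x - nodes[h][0], y - nodes[h][1]))
    return heads, assignment

nodes = {
    "a": (0, 0, 0.9), "b": (1, 1, 0.4), "c": (5, 5, 0.8),
    "d": (6, 5, 0.3), "e": (0, 1, 0.5),
}
print(cluster(nodes, num_heads=2))
```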

    On the use of intelligent models towards meeting the challenges of the edge mesh

    Nowadays, we are witnessing the advent of the Internet of Things (IoT), with numerous devices performing interactions between them or with their environment. The huge number of devices leads to huge volumes of data that demand appropriate processing. The "legacy" approach is to rely on the Cloud, where increased computational resources can realize any desired processing. However, the need to support real-time applications requires a reduced latency in the provision of outcomes. Edge Computing (EC) comes as the "solver" of the latency problem. Various processing activities can be performed at EC nodes having a direct connection with IoT devices. A number of challenges should be met before we can realize a fully automated ecosystem where nodes can cooperate or understand their status to efficiently serve applications. In this article, we perform a survey of the relevant research activities towards the vision of Edge Mesh (EM), i.e., a "cover" of intelligence upon EC. We present the necessary hardware and discuss research outcomes in every aspect of EC/EM node functioning. We present technologies and theories adopted for data, task, and resource management while discussing how machine learning and optimization can be adopted in the domain.

    A Survey on Scheduling the Task in Fog Computing Environment

    With the rapid increase in the Internet of Things (IoT), the amount of data produced and processed has also increased. Cloud Computing facilitates the storage, processing, and analysis of data as needed. However, cloud computing resources are located far away from the IoT devices. Fog computing has emerged as a small-scale cloud computing paradigm that sits near the edge devices and handles tasks efficiently. Fog nodes have smaller storage capacity than cloud nodes, but they are designed and deployed near edge devices so that requests can be served efficiently and executed on time. In this survey paper, we have investigated and analysed the main challenges and issues raised in scheduling tasks in the fog computing environment. To the best of our knowledge, there is no comprehensive survey paper on the challenges of task scheduling in the fog computing paradigm. This survey covers research conducted from 2018 to 2021, with most of the selected papers drawn from 2020-2021. Moreover, this survey paper organizes the task scheduling approaches and systematically maps the identified challenges and issues. Based on the identified issues, we have highlighted future work directions in the field of task scheduling in the fog computing environment.
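
    To ground the scheduling problem the survey discusses, here is a hypothetical earliest-deadline-first sketch that places each task on a fog node able to meet its deadline and otherwise falls back to the cloud. Node capacities, deadlines, task names, and the policy itself are invented for the example rather than drawn from the surveyed papers.

```python
# Hypothetical deadline-aware fog scheduling sketch: earliest-deadline-first
# placement on fog nodes, with a cloud fallback when no fog node can keep up.

def schedule(tasks, fog_nodes):
    """tasks: list of (name, work, deadline); fog_nodes: {name: speed}.
    Returns {task: fog node or 'cloud'}."""
    busy_until = {n: 0.0 for n in fog_nodes}   # when each fog node frees up
    plan = {}
    for name, work, deadline in sorted(tasks, key=lambda t: t[2]):
        # Fog nodes that could finish this task before its deadline.
        feasible = {n: busy_until[n] + work / speed
                    for n, speed in fog_nodes.items()
                    if busy_until[n] + work / speed <= deadline}
        if feasible:
            node = min(feasible, key=feasible.get)
            busy_until[node] = feasible[node]
            plan[name] = node
        else:
            plan[name] = "cloud"   # no fog node can meet the deadline
    return plan

tasks = [("sense", 2, 1.0), ("infer", 8, 3.0), ("archive", 40, 5.0)]
print(schedule(tasks, fog_nodes={"fog1": 4.0, "fog2": 2.0}))
```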

    VOICE: Value-of-Information for Compute Continuum Ecosystems

    The increasing ubiquity of Internet-of-Things (IoT) devices has led to a massive flow of data, often processed in distant and costly Cloud facilities. Emerging methodologies, such as Fog and Edge Computing, aim to process data closer to its origin, thereby reducing the need for extensive use of Cloud solutions. The "Compute Continuum" (CC), which encompasses these paradigms, emerges as an opportunity for the efficient deployment and adoption of innovative services ranging from eHealth to virtual reality. However, the decentralized nature of the CC presents challenges, especially for orchestration tools conventionally designed for centralized scenarios. Tackling these challenges, we introduce VOICE, a cutting-edge platform designed for service management in CC ecosystems. Thanks to a Value of Information (VoI)-based technique that emphasizes the significance of data filtering, VOICE supports dynamic allocation of service components throughout the continuum. To evaluate the choice of VoI as the enabling technology for VOICE, we leverage Computational Intelligence (CI) approaches and compare it with more popular strategies (e.g., maximization of the Processing Ratio). Final results show how VoI can be a valuable approach to optimizing the usage of the scarce resources that characterize CC environments.
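
    As an illustration of the Value-of-Information idea, the sketch below scores each reading with a simple novelty-times-freshness proxy and forwards only values above a threshold, so scarce continuum resources are spent on informative data. The scoring function, half-life, and threshold are assumptions made for the example, not VOICE's actual model.

```python
# Hypothetical VoI-style filtering sketch: forward a reading only if it is
# sufficiently novel relative to what downstream already has, and still fresh.

import math

def voi(sample, last_forwarded, age_seconds, half_life=30.0):
    """Higher when the sample differs from the last forwarded value,
    lower as it gets stale (exponential decay with the given half-life)."""
    novelty = abs(sample - last_forwarded)
    freshness = math.exp(-age_seconds * math.log(2) / half_life)
    return novelty * freshness

def filter_stream(readings, threshold=0.5):
    """readings: list of (value, age_seconds). Returns the values to forward."""
    forwarded, last = [], 0.0
    for value, age in readings:
        if voi(value, last, age) >= threshold:
            forwarded.append(value)
            last = value
    return forwarded

# Example: an eHealth-style sensor stream; only informative updates leave the edge.
print(filter_stream([(0.1, 1), (0.2, 2), (1.5, 1), (1.6, 40), (3.0, 3)]))
```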