
    Leveraging context-awareness to better support the IoT cloud-edge continuum

    Novel Internet of Things (IoT) requirements, derived from a broader interconnection of heterogeneous devices, have pushed the horizons of Cloud computing and are giving rise to a wider decentralisation of applications and data centers. An answer to the underlying network concerns, such as the need to lower the latency induced by heavy computation, or safety aspects, was Edge/Fog computing, where IoT functionality can also be supported closer to data sources. While it is feasible today to perform some IoT functionality at the Edge, orchestrating operations between Edge and Cloud requires automated support, where context-awareness plays a key role in helping the network decide when and where to store data and to perform computation. This work focuses on applying context-awareness to support smoother Edge-to-Cloud operation, aiming at lowering latency, in particular when real-time or close-to-real-time data exchange is present.
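
    As a rough illustration of the kind of context-aware placement decision this abstract describes, the sketch below chooses between edge and cloud from observed latency, load, and payload size. All field names, thresholds, and values are assumptions for demonstration and are not taken from the paper.

```python
# Minimal sketch of a context-aware placement decision: given an application's
# latency budget and the observed network context, decide whether to process
# data at the edge or forward it to the cloud. Thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Context:
    rtt_to_cloud_ms: float   # measured round-trip time to the cloud
    edge_cpu_load: float     # 0.0 - 1.0 utilisation of the edge node
    payload_kb: float        # size of the data to process

def place_computation(ctx: Context, latency_budget_ms: float) -> str:
    """Return 'edge' or 'cloud' based on simple context-aware rules."""
    # If the cloud round trip alone exceeds the budget, stay at the edge
    # (assuming the edge node still has spare capacity).
    if ctx.rtt_to_cloud_ms >= latency_budget_ms and ctx.edge_cpu_load < 0.8:
        return "edge"
    # Large payloads are cheaper to process near the source than to ship.
    if ctx.payload_kb > 512 and ctx.edge_cpu_load < 0.9:
        return "edge"
    return "cloud"

ctx = Context(rtt_to_cloud_ms=60, edge_cpu_load=0.4, payload_kb=128)
print(place_computation(ctx, latency_budget_ms=50))   # -> edge
```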

    A reference architecture for cloud-edge meta-operating systems enabling cross-domain, data-intensive, ML-assisted applications: architectural overview and key concepts

    Future data-intensive intelligent applications are required to traverse the cloud-to-edge-to-IoT continuum, where cloud and edge resources coordinate elegantly alongside sensor networks and data. However, current technical solutions can only partially handle the data outburst associated with the IoT proliferation of recent years, mainly due to their hierarchical architectures. In this context, this paper presents a reference architecture of a meta-operating system (RAMOS), targeted to enable a dynamic, distributed and trusted continuum capable of facilitating the next generation of smart applications at the edge. RAMOS is domain-agnostic and supports heterogeneous devices in various network environments. Furthermore, the proposed architecture is able to place data at its origin in a secure and trusted manner. Based on a layered structure, the building blocks of RAMOS are thoroughly described, and the interconnection and coordination between them is fully presented. In addition, five practical scenarios illustrate how the proposed reference architecture and its characteristics could fit key industrial and societal applications that will require more power at the edge, focusing on the distributed intelligence and privacy preservation principles promoted by RAMOS, as well as the concept of environmental footprint minimization. Finally, the business potential of an open edge ecosystem and the societal impact of climate net neutrality are also illustrated. For UPC authors: this research was funded by the Spanish Ministry of Science, Innovation and Universities and FEDER, grant number PID2021-124463OB-100.
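
    To make the idea of a layered meta-operating system with coordinated building blocks more concrete, the following hypothetical sketch registers components per layer and looks up capability providers across layers. The layer names and registry API are invented for illustration; they are not RAMOS's actual interfaces.

```python
# Hypothetical sketch of a layered component registry for a cloud-edge
# meta-operating system. Layer names and the registration API are assumptions.

from collections import defaultdict

LAYERS = ("infrastructure", "orchestration", "data", "application")

class MetaOSRegistry:
    def __init__(self):
        self._components = defaultdict(list)   # layer -> list of components

    def register(self, layer: str, name: str, provides: list):
        if layer not in LAYERS:
            raise ValueError(f"unknown layer: {layer}")
        self._components[layer].append({"name": name, "provides": provides})

    def find_provider(self, capability: str):
        # Cross-layer coordination: locate any component offering a capability.
        for layer in LAYERS:
            for comp in self._components[layer]:
                if capability in comp["provides"]:
                    return layer, comp["name"]
        return None

registry = MetaOSRegistry()
registry.register("orchestration", "scheduler", ["placement"])
registry.register("data", "broker", ["pub-sub"])
print(registry.find_provider("pub-sub"))   # ('data', 'broker')
```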

    Edge Offloading in Smart Grid

    The energy transition supports the shift towards more sustainable energy alternatives, paving the way for decentralized smart grids in which energy is generated closer to the point of use. Decentralized smart grids foresee novel data-driven, low-latency applications for improving resilience and responsiveness, such as peer-to-peer energy trading, microgrid control, fault detection, or demand response. However, traditional cloud-based smart grid architectures cannot meet the requirements of these emerging applications, such as low latency and high reliability, so alternative architectures such as edge, fog, or hybrid ones need to be adopted. Moreover, edge offloading can play a pivotal role for next-generation smart grid AI applications because it enables efficient utilization of computing resources and addresses the challenge of the increasing data generated by IoT devices, optimizing response time, energy consumption, and network performance. However, a comprehensive overview of the current state of research is needed to support sound decisions on offloading energy-related applications from cloud to fog or edge, focusing on smart grid open challenges and potential impacts. In this paper, we delve into smart grid and computational distribution architectures, including edge-fog-cloud models, orchestration architecture, and serverless computing, and analyze the decision-making variables and optimization algorithms used to assess the efficiency of edge offloading. Finally, the work contributes to a comprehensive understanding of edge offloading in the smart grid, providing a SWOT analysis to support decision making.
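
    As a sketch of how the decision-making variables mentioned here might feed an offloading choice, the snippet below scores candidate tiers with a weighted cost over latency, energy, and reliability. The weights and candidate figures are invented for demonstration, not results from the survey.

```python
# Illustrative scoring of offloading targets for a smart-grid task using
# typical decision variables: latency, energy, reliability. Lower cost wins.

def offload_cost(latency_ms, energy_j, reliability, w_lat=0.5, w_en=0.3, w_rel=0.2):
    # Unreliability (1 - reliability) is penalised on a comparable scale.
    return w_lat * latency_ms + w_en * energy_j + w_rel * (1.0 - reliability) * 100

candidates = {
    "edge":  {"latency_ms": 15,  "energy_j": 4.0, "reliability": 0.97},
    "fog":   {"latency_ms": 40,  "energy_j": 3.0, "reliability": 0.99},
    "cloud": {"latency_ms": 120, "energy_j": 2.5, "reliability": 0.999},
}

best = min(candidates, key=lambda tier: offload_cost(**candidates[tier]))
print(best)   # with these weights, the edge tier wins
```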

    Edge-to-cloud sensing and actuation semantics in the industrial Internet of Things

    There are billions of devices deployed worldwide, connected and communicating with other systems, including sensors and actuators that can be stationary or mobile. These Edge devices are considered part of the Internet of Things (IoT) and can be regarded as one tier of the Computing Continuum paradigm. Two main concerns are at stake for the success of this ecosystem. The first is interoperability between devices and systems, mainly because most of them communicate in unique and mutually different ways, leading to heterogeneous data. The second is the lack of decision-making capacity to conduct actuations, such as communicating through different computing tiers based on latency constraints arising from a measured factor. In this article, we propose an ontology to improve device interoperability in the IoT. In addition, we explain how to ease data communication between Computing Continuum devices, providing tools to enhance data management and decision-making. A use case from the automotive industry is also presented, where quick maneuver determination is key to avoiding accidents. It is exemplified using two Raspberry Pi devices, connected over different networks and choosing the appropriate one depending on context-aware conditions. This work is partially funded by: Industrial Doctorates (2019 DI 001) from Generalitat de Catalunya; the SUDOQU project (PID2021-127181OB-I00) from MCIN/AEI, FEDER “Una manera de hacer Europa”; and project 2017-SGR-1749 from Generalitat de Catalunya, with the support of inLab FIB at UPC and Worldsensing.
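
    The network-selection behaviour described in the use case could look roughly like the sketch below: a device with several interfaces probes each one and picks the interface that meets a latency constraint. Interface names, the probing stub, and thresholds are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical context-aware network selection: pick the interface that
# satisfies a latency budget, falling back to the least bad option.

import random

def measure_latency_ms(interface: str) -> float:
    # Stand-in for a real probe (e.g. a ping over the given interface).
    return {"wifi": random.uniform(5, 30),
            "lte": random.uniform(30, 90)}.get(interface, float("inf"))

def select_network(interfaces, latency_budget_ms):
    probes = {iface: measure_latency_ms(iface) for iface in interfaces}
    viable = {i: lat for i, lat in probes.items() if lat <= latency_budget_ms}
    pool = viable or probes          # fall back if nothing meets the budget
    return min(pool, key=pool.get)   # lowest-latency choice

print(select_network(["wifi", "lte"], latency_budget_ms=25))
```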

    Creating Intelligent Computational Edge through Semantic Mediation

    This research proposes semantic mediation based on reasoning and first-order logic for mediating the best possible configuration of the Computational Edge, relevant for software applications that may benefit from running computations in proximity to their data sources. The mediation considers the context in which these applications exist and exploits the semantics of that context for deciding where computational elements should reside and which data they should use. The application of semantic mediation could address the initiative to accommodate algorithms from predictive and learning technologies, push AI towards computational edges, and potentially contribute towards creating a computing continuum.
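
    A minimal sketch, in the spirit of the first-order-logic mediation described here, is given below: Horn-clause-style rules over context facts decide where a computational element should reside. The predicates, facts, and rules are illustrative assumptions, not the paper's actual knowledge base.

```python
# Rule-based placement mediation over context facts (illustrative only).

facts = {
    ("near_data_source", "anomaly_detector"),
    ("latency_sensitive", "anomaly_detector"),
    ("gpu_required", "model_trainer"),
    ("gpu_available", "cloud"),
}

def holds(pred, subj):
    return (pred, subj) in facts

def place(component):
    # Rule 1: latency_sensitive(C) and near_data_source(C) -> place(C, edge)
    if holds("latency_sensitive", component) and holds("near_data_source", component):
        return "edge"
    # Rule 2: gpu_required(C) and gpu_available(cloud) -> place(C, cloud)
    if holds("gpu_required", component) and holds("gpu_available", "cloud"):
        return "cloud"
    return "edge"   # default: keep computation close to its data

print(place("anomaly_detector"))   # edge
print(place("model_trainer"))      # cloud
```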

    Microservices and serverless functions – lifecycle, performance, and resource utilisation of edge based real-time IoT analytics

    Edge Computing harnesses resources close to the data sources to reduce end-to-end latency and allow real-time process automation for verticals such as Smart City, Healthcare and Industry 4.0. Edge resources are limited compared to traditional Cloud data centres; hence the choice of proper resource management strategies in this context becomes paramount. Microservice and Function-as-a-Service architectures support modular and agile patterns, compared to a monolithic design, through lightweight containerisation, continuous integration/deployment and scaling. The advantages brought about by these technologies may initially seem obvious, but we argue that their usage at the Edge deserves a more in-depth evaluation. By analysing both the software development and deployment lifecycle, along with performance and resource utilisation, this paper explores microservices and two alternative types of serverless functions to build edge-based real-time IoT analytics. In the experiments comparing these technologies, microservices generally exhibit slightly better end-to-end processing latency and resource utilisation than serverless functions. One of the serverless functions and the microservices excel at handling larger data streams with auto-scaling; whilst serverless functions offer this feature natively, the choice of container orchestration framework may determine its availability for microservices. The other serverless function, while supporting a simpler lifecycle, is more suitable for low-invocation scenarios and faces challenges with parallel requests and inherent overhead, making it less suitable for real-time processing in demanding IoT settings.
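
    The overhead trade-off discussed here can be illustrated with a toy comparison between a long-lived microservice-style handler, which initialises its state once, and a serverless-style function that pays an initialisation cost on a cold start. The timings and the "model loading" delay below are invented for demonstration and are not the paper's benchmark.

```python
# Toy contrast: warm microservice handler vs. cold-start serverless function.

import time

def load_model():
    time.sleep(0.05)                 # stand-in for loading an analytics model
    return lambda reading: reading * 2

# Microservice: state survives across requests.
class AnalyticsService:
    def __init__(self):
        self.model = load_model()    # initialised once at startup
    def handle(self, reading):
        return self.model(reading)

# Serverless (cold start): state rebuilt on each invocation.
def analytics_function(reading):
    model = load_model()
    return model(reading)

service = AnalyticsService()
t0 = time.perf_counter(); service.handle(42); warm = time.perf_counter() - t0
t0 = time.perf_counter(); analytics_function(42); cold = time.perf_counter() - t0
print(f"warm microservice call: {warm*1000:.1f} ms, cold function call: {cold*1000:.1f} ms")
```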