
    Distributed microservices evaluation in edge computing

    Abstract. Current Internet of Things applications rely on centralized cloud computing to process data. Future applications, such as smart cities, homes, and vehicles, however, generate so much data that cloud computing cannot provide the required Quality of Service. Edge computing, which pulls data and the related computation from distant data centers to the network edge, is therefore seen as the way forward in the evolution of the Internet of Things. Traditional cloud applications, implemented as centralized server-side monoliths, may prove unfavorable for edge systems due to the distributed nature of the network edge. The recent development practices of containerization and microservices, on the other hand, are an attractive choice for edge application development: containerization gives edge computing lightweight virtualized resources, while microservices modularize an application at the functional level into small, independent packages. This thesis studies the impact of containers and distributed microservices on edge computing in terms of service execution latency and energy consumption. The evaluation is carried out by developing a monolithic and a distributed microservice version of a user mobility analysis service. Both services are containerized with Docker and deployed on resource-constrained edge devices so that measurements can be conducted in a real-world setting. The results show that the centralized monolith provides lower latencies for small amounts of data, while the distributed microservices are faster for large amounts of data. Partitioning services onto multiple edge devices is shown to increase energy consumption significantly.
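    The measurement setup described above can be pictured as a small timing harness; the sketch below is a minimal version, assuming hypothetical endpoint URLs, payload format and data sizes rather than the thesis' actual mobility-analysis API.

```python
# Minimal latency-measurement sketch in the spirit of the thesis evaluation.
# Endpoints, payload format and data sizes are illustrative assumptions.
import json
import time
import requests

MONOLITH_URL = "http://edge-node-1:8080/analyze"      # hypothetical monolith endpoint
MICROSERVICE_URL = "http://edge-node-1:8081/ingest"   # hypothetical entry point of the microservice chain

def measure_latency(url: str, payload: dict, repeats: int = 20) -> float:
    """Return the mean end-to-end request latency in milliseconds."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        requests.post(url, json=payload, timeout=30)
        samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    for n_points in (100, 10_000, 1_000_000):          # small vs large data volumes
        payload = {"locations": [[65.01, 25.47]] * n_points}
        print(n_points,
              round(measure_latency(MONOLITH_URL, payload), 1),
              round(measure_latency(MICROSERVICE_URL, payload), 1))
```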

    Microservices and serverless functions – lifecycle, performance, and resource utilisation of edge based real-time IoT analytics

    Edge Computing harnesses resources close to the data sources to reduce end-to-end latency and allow real-time process automation for verticals such as Smart City, Healthcare and Industry 4.0. Edge resources are limited when compared to traditional Cloud data centres; hence the choice of proper resource management strategies in this context becomes paramount. Microservice and Function-as-a-Service architectures support modular and agile patterns, compared to a monolithic design, through lightweight containerisation, continuous integration/deployment and scaling. The advantages brought by these technologies may initially seem obvious, but we argue that their usage at the Edge deserves a more in-depth evaluation. By analysing the software development and deployment lifecycle along with performance and resource utilisation, this paper explores microservices and two alternative types of serverless functions for building edge real-time IoT analytics. In the experiments comparing these technologies, microservices generally exhibit slightly better end-to-end processing latency and resource utilisation than serverless functions. One of the serverless functions and the microservices excel at handling larger data streams with auto-scaling: whilst serverless functions natively offer this feature, the choice of container orchestration framework may determine its availability for microservices. The other serverless function, while supporting a simpler lifecycle, is more suitable for low-invocation scenarios; it faces challenges with parallel requests and inherent overhead, making it less suitable for real-time processing in demanding IoT settings.
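    End-to-end processing latency of the kind measured here is typically instrumented by stamping each event at the source and taking the difference at the consumer, regardless of whether the consumer is a microservice or a serverless function. The sketch below shows one minimal way to do this; the field names and the placeholder analytics step are assumptions, and producer and consumer clocks are assumed to be NTP-synchronised.

```python
# Sketch of end-to-end latency instrumentation for a stream of IoT events.
import json
import time

def make_event(sensor_id: str, value: float) -> bytes:
    """Producer side: stamp the event when it leaves the data source."""
    return json.dumps({
        "sensor_id": sensor_id,
        "value": value,
        "created_at": time.time(),      # wall-clock stamp at the edge gateway
    }).encode()

def handle_event(raw: bytes) -> float:
    """Consumer side (microservice or function body): process and report latency."""
    event = json.loads(raw)
    _result = event["value"] * 2        # placeholder for the real analytics step
    latency_ms = (time.time() - event["created_at"]) * 1000
    return latency_ms

if __name__ == "__main__":
    msg = make_event("temp-01", 21.5)
    print(f"end-to-end latency: {handle_event(msg):.2f} ms")
```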

    DESIGNING DISTRIBUTED CONTROLLING TESTBED SYSTEM FOR SUPPLY CHAIN AND LOGISTICS IN AUTOMOTIVE INDUSTRY

    The arrival of the era of autonomous robots is indisputable. In this paper, innovations in distributed control systems realized by autonomous guided vehicles in the automotive industry are presented as a proof of concept. The main goal of the considered distributed control system design is to bring dependent and independent VDA 5050-compliant robots into a single all-in-one system that is easily configurable and manageable through a responsive, business-critical, web-based user interface. Special attention is paid to applying a platform that manages all autonomous IoT-based robots in one seamless system. In addition, a "single point of truth", one of the main issues of modern distributed control systems, is considered.
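    As a rough illustration of the VDA 5050 interface such a testbed relies on, the sketch below builds an order message and its MQTT topic. The topic layout and field names follow the public VDA 5050 specification, but the manufacturer, serial number and node identifiers are hypothetical, and the message would still need to be published with an MQTT client.

```python
# Minimal sketch of a VDA 5050-style order message sent by the control
# platform to one AGV. Concrete values are illustrative assumptions.
import json
from datetime import datetime, timezone

def build_order_topic(manufacturer: str, serial_number: str,
                      interface: str = "uagv", major_version: str = "v2") -> str:
    # VDA 5050 topics: <interfaceName>/<majorVersion>/<manufacturer>/<serialNumber>/<subtopic>
    return f"{interface}/{major_version}/{manufacturer}/{serial_number}/order"

def build_order(order_id: str, node_ids: list) -> dict:
    return {
        "headerId": 1,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "version": "2.0.0",
        "manufacturer": "ExampleRobots",        # hypothetical vendor
        "serialNumber": "AGV-001",              # hypothetical vehicle
        "orderId": order_id,
        "orderUpdateId": 0,
        # node sequenceIds are even, edge sequenceIds odd, per the spec
        "nodes": [{"nodeId": n, "sequenceId": 2 * i, "released": True, "actions": []}
                  for i, n in enumerate(node_ids)],
        "edges": [],
    }

if __name__ == "__main__":
    topic = build_order_topic("ExampleRobots", "AGV-001")
    payload = json.dumps(build_order("order-42", ["warehouse-in", "assembly-line-3"]))
    print(topic)
    print(payload)   # in production: mqtt_client.publish(topic, payload)
```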

    Consuming data sources to generate actionable items

    A platform that consumes IoT sensors and alert systems in order to generate response actions related to those alert systems. To demonstrate the possible use cases, it will incorporate functions required by European Projects, commercial solutions, and standards-compliant solutions.
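    The alert-to-action mapping such a platform performs can be pictured as a small rule table; the sketch below is purely illustrative, and the rules, alert fields and action names are hypothetical.

```python
# Hypothetical sketch of mapping incoming alerts to actionable response items.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str       # e.g. "flood-sensor-12"
    kind: str         # e.g. "water_level_high"
    severity: int     # 1 (info) .. 5 (critical)

RULES = {
    # (alert kind, minimum severity) -> actionable item
    ("water_level_high", 3): "notify_civil_protection",
    ("air_quality_low", 2): "publish_public_advisory",
}

def actions_for(alert: Alert) -> list:
    """Return the response actions triggered by an incoming alert."""
    return [action for (kind, min_sev), action in RULES.items()
            if alert.kind == kind and alert.severity >= min_sev]

if __name__ == "__main__":
    print(actions_for(Alert("flood-sensor-12", "water_level_high", 4)))
    # -> ['notify_civil_protection']
```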

    Performance Evaluation Metrics for Cloud, Fog and Edge Computing: A Review, Taxonomy, Benchmarks and Standards for Future Research

    Optimization is an inseparable part of Cloud computing, particularly with the emergence of the Fog and Edge paradigms. Not only do these emerging paradigms demand re-evaluating cloud-native optimizations and exploring Fog- and Edge-based solutions, but the objectives also require a significant shift from considering only latency to energy, security, reliability and cost. It is therefore apparent that optimization objectives have become diverse, and Internet of Things (IoT)-specific objectives must now come into play. This is critical, as an incorrect selection of metrics can mislead developers about the real performance. For instance, a latency-aware auto-scaler must be evaluated through latency-related metrics such as response time or tail latency; otherwise the resource manager is not properly evaluated, even if it reduces cost. Given such challenges, researchers and developers are struggling to explore and utilize the right metrics to evaluate the performance of optimization techniques such as task scheduling, resource provisioning, resource allocation, resource scheduling and resource execution. This is challenging due to (1) the novel and multi-layered computing paradigms, e.g., Cloud, Fog and Edge, (2) IoT applications with different requirements, e.g., latency or privacy, and (3) the lack of a benchmark and standard for the evaluation metrics. In this paper, by exploring the literature, (1) we present a taxonomy of the various real-world metrics used to evaluate the performance of cloud, fog, and edge computing; (2) we survey the literature to recognize common metrics and their applications; and (3) we outline open issues for future research. This comprehensive benchmark study can significantly assist developers and researchers in evaluating performance under realistic metrics and standards to ensure their objectives will be achieved in production environments.
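    The latency-related metrics the survey singles out, mean response time and tail latency, are easy to conflate in practice. The sketch below computes both from raw response-time samples; the sample values are illustrative.

```python
# Sketch of mean response time vs tail latency (p95/p99) from raw samples.
import math

def percentile(samples: list, p: float) -> float:
    """Nearest-rank percentile for p in (0, 100]."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

def latency_report(response_times_ms: list) -> dict:
    return {
        "mean_ms": sum(response_times_ms) / len(response_times_ms),
        "p95_ms": percentile(response_times_ms, 95),
        "p99_ms": percentile(response_times_ms, 99),   # tail latency
    }

if __name__ == "__main__":
    samples = [12.1, 13.4, 11.8, 12.6, 250.0, 12.9, 13.1, 12.2, 14.0, 12.5]
    print(latency_report(samples))
    # The mean (~36 ms) understates the worst-case experience that p99 (250 ms)
    # exposes, which is why a latency-aware auto-scaler should be judged on
    # tail latency rather than on cost alone.
```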

    Performance and efficiency optimization of multi-layer IoT edge architecture

    Abstract. The Internet of Things (IoT) has become a backbone technology that connects devices with diverse capabilities and enables ubiquitously available digital services for end-users. IoT applications for mission-critical scenarios have strict performance requirements such as latency, scalability, security and privacy. To fulfil these requirements, IoT also needs support from enabling technologies such as cloud, edge, virtualization and fifth-generation mobile communication (5G) technologies. Although traditional clouds provide very high computational and storage capacity for current IoT systems, the long routes between distant cloud servers and end-devices (sensors/actuators) are not feasible for latency-critical applications and services. The multi-access edge computing (MEC) model can be used to overcome this challenge by bringing cloud computing capacity within or next to the access network base stations. However, the capacity to perform the most critical processing at the local network layer is often still necessary to cope with access network issues. Therefore, this thesis compares two existing IoT models, a traditional cloud-IoT model and a MEC-based edge-cloud-IoT model, with a proposed local edge-cloud-IoT model with respect to performance and efficiency, using the iFogSim simulator. The results consolidate our research team's previous findings that a three-tier edge-IoT architecture, capable of optimally utilizing the computational capacity of each of the three tiers, is an effective way to reduce energy consumption, improve end-to-end latency and minimize operational costs in latency-critical IoT applications.
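    As a back-of-envelope illustration of the three deployment options being compared, the sketch below models end-to-end latency as network round-trip plus processing time, and energy as power multiplied by processing time. All parameter values are illustrative assumptions, not results from the iFogSim simulations.

```python
# Back-of-envelope comparison of cloud, MEC and local-edge tiers.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    rtt_ms: float          # round-trip time from sensor/actuator to this tier
    proc_rate_mips: float  # available processing rate
    power_w: float         # average power draw while processing

def end_to_end(tier: Tier, task_mi: float):
    """Return (latency in ms, energy in joules) for one task of size task_mi MI."""
    proc_s = task_mi / tier.proc_rate_mips
    latency_ms = tier.rtt_ms + proc_s * 1000
    energy_j = tier.power_w * proc_s
    return latency_ms, energy_j

if __name__ == "__main__":
    tiers = [
        Tier("traditional cloud", rtt_ms=120.0, proc_rate_mips=40_000, power_w=250.0),
        Tier("MEC edge cloud",    rtt_ms=15.0,  proc_rate_mips=8_000,  power_w=90.0),
        Tier("local edge",        rtt_ms=2.0,   proc_rate_mips=1_000,  power_w=10.0),
    ]
    for t in tiers:
        lat, en = end_to_end(t, task_mi=500)
        print(f"{t.name:17s}  latency {lat:7.1f} ms   energy {en:6.2f} J")
```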