5,142 research outputs found

    Secure Cloud-Edge Deployments, with Trust

    Assessing the security level of IoT applications to be deployed to heterogeneous Cloud-Edge infrastructures operated by different providers is a non-trivial task. In this article, we present a methodology that allows expressing security requirements for IoT applications, as well as infrastructure security capabilities, in a simple and declarative manner, and automatically obtaining an explainable assessment of the security level of the possible application deployments. The methodology also considers the impact of trust relations among the different stakeholders using or managing Cloud-Edge infrastructures. A lifelike example is used to showcase the prototyped implementation of the methodology.
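    The declarative matching at the core of such an approach can be sketched as follows. This is a minimal illustration, not the article's actual implementation: the capability names and the `assess_deployment` function are hypothetical, but they show how a set-based comparison yields an explainable verdict rather than a bare yes/no.

    ```python
    # Sketch of declarative security assessment: an application component
    # declares required security capabilities, each candidate Cloud-Edge node
    # declares what it offers, and the assessment lists exactly which
    # requirements a node fails to meet (the "explainable" part).

    def assess_deployment(requirements, node_capabilities):
        """Return (ok, explanation) for deploying a component on a node."""
        missing = sorted(requirements - node_capabilities)
        ok = not missing
        explanation = ("all requirements satisfied" if ok
                       else "missing capabilities: " + ", ".join(missing))
        return ok, explanation

    # Example: an IoT component needing encryption and isolation guarantees.
    reqs = {"encrypted_storage", "access_control", "network_isolation"}
    edge_node = {"encrypted_storage", "access_control"}
    cloud_node = {"encrypted_storage", "access_control", "network_isolation"}

    print(assess_deployment(reqs, edge_node))
    print(assess_deployment(reqs, cloud_node))
    ```

    The same idea extends to trust: a requirement satisfied only via a third party could be flagged in the explanation rather than silently accepted.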

    Human-driven application management at the Edge

    The design and management of Edge systems will proactively involve human intelligence at the Edge, according to a human-driven approach that increases productivity and improves usability. Due to its ubiquity and heterogeneity, the Edge will give application administrators a more decisional role in application deployment and resource management. Final decisions on where to distribute application components should be taken by them, in an informed manner, during the entire application lifecycle, accounting for compliance with QoS requirements. As a first step, this requires devising new tools that suitably abstract the heterogeneity of Edge systems, permit simulating different runtime scenarios, and ease human-driven management of such systems by providing meaningful evaluation metrics. In this article, we discuss how human decision-making can be supported to solve QoS-aware, management-related challenges in Edge computing.
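    One way to read "meaningful evaluation metrics" is a compliance score: simulate a candidate deployment over a set of runtime scenarios and report the fraction in which latency stays within the QoS bound. The function name and all numbers below are illustrative, not taken from the article.

    ```python
    # Hypothetical sketch of a QoS metric an administrator could compare
    # across candidate deployments: the share of simulated scenarios in
    # which end-to-end latency meets the agreed bound.

    def qos_compliance(simulated_latencies_ms, bound_ms):
        """Fraction of simulated scenarios meeting the latency bound."""
        met = sum(1 for lat in simulated_latencies_ms if lat <= bound_ms)
        return met / len(simulated_latencies_ms)

    # Latencies of one candidate deployment across five simulated scenarios.
    scenarios = [12.0, 18.5, 25.0, 9.7, 14.2]
    score = qos_compliance(scenarios, bound_ms=20.0)
    print(f"{score:.0%}")  # 80%
    ```

    A single, interpretable percentage like this is easier for a human decision-maker to act on than raw per-scenario traces.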

    On the Deployment of IoT Systems: An Industrial Survey

    Internet of Things (IoT) systems are complex and multifaceted, and the design of their architectures needs to consider many aspects at a time. Design decisions concern, for instance, the modeling of software components and their interconnections, as well as where to deploy the components within the available hardware infrastructure in the Edge-Cloud continuum. A relevant and challenging task, in this context, is to identify optimal deployment models, due to all the different aspects involved, such as extra-functional requirements of the system, heterogeneity of the hardware resources concerning their processing and storage capabilities, and constraints like legal issues and operational cost limits. To gain insights into the deployment decisions concerning IoT systems in practice, and the factors that influence those decisions, we report on an industrial survey we conducted with 66 IoT architects from 18 countries across the world. Each participant filled in a questionnaire comprising 15 questions. By analyzing the collected data, we obtained two main findings: (i) architects rely on the Cloud more than the Edge for deploying the software components of IoT systems in the majority of IoT application domains; and (ii) four main factors drive deployment decisions: reliability, performance, security, and cost.

    How to Place Your Apps in the Fog -- State of the Art and Open Challenges

    Fog computing aims at extending the Cloud towards the IoT so as to achieve improved QoS and to empower latency-sensitive and bandwidth-hungry applications. The Fog calls for novel models and algorithms to distribute multi-service applications in such a way that data processing occurs wherever it is best placed, based on both functional and non-functional requirements. This survey reviews the existing methodologies to solve the application placement problem in the Fog, while pursuing three main objectives. First, it offers a comprehensive overview of the currently employed algorithms, the availability of open-source prototypes, and the size of test use cases. Second, it classifies the literature based on the application and Fog infrastructure characteristics that are captured by available models, with a focus on the considered constraints and the optimised metrics. Finally, it identifies some open challenges in application placement in the Fog.
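    The placement problem the survey covers can be illustrated with a toy eligibility check: each service declares functional needs (CPU, RAM) and a non-functional bound (maximum latency to its users), and a mapping of services to nodes is eligible only if every node has enough capacity and meets the latency bound. All data and names below are hypothetical; real formulations in the surveyed literature add many more constraints and an optimisation objective.

    ```python
    # Toy instance of Fog application placement: is a given mapping of
    # services onto nodes eligible under capacity and latency constraints?

    def eligible(placement, services, nodes):
        used = {n: {"cpu": 0, "ram": 0} for n in nodes}
        for svc, node in placement.items():
            used[node]["cpu"] += services[svc]["cpu"]
            used[node]["ram"] += services[svc]["ram"]
            # Non-functional constraint: node must be close enough to users.
            if nodes[node]["latency_ms"] > services[svc]["max_latency_ms"]:
                return False
        # Functional constraint: aggregate demand fits each node's capacity.
        return all(used[n]["cpu"] <= nodes[n]["cpu"] and
                   used[n]["ram"] <= nodes[n]["ram"] for n in nodes)

    services = {"ingest": {"cpu": 1, "ram": 2, "max_latency_ms": 20},
                "analytics": {"cpu": 4, "ram": 8, "max_latency_ms": 200}}
    nodes = {"edge": {"cpu": 2, "ram": 4, "latency_ms": 5},
             "cloud": {"cpu": 16, "ram": 64, "latency_ms": 80}}

    print(eligible({"ingest": "edge", "analytics": "cloud"}, services, nodes))
    print(eligible({"ingest": "cloud", "analytics": "cloud"}, services, nodes))
    ```

    Finding the *best* eligible placement (e.g. minimising cost or energy over all such mappings) is where the algorithms compared by the survey differ.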

    EDGE-CoT: next generation cloud computing and its impact on business

    Purpose – The main objective of this paper is to analyze the potential impact of future cloud computing trends on business, from the perspective of specialists in the area. Design/methodology/approach – Qualitative approach that includes a literature review and nine semi-structured interviews with proclaimed influencers and global thought leaders in cloud computing, including Jeff Barr, Vice President of Amazon Web Services. Findings – 5G networks will enable the emergence of the Edge-CoT architecture, which will consequently drive increased application of Artificial Intelligence/Machine Learning (AI/ML) and Robotics. The combination of Edge-CoT, Robotics and AI/ML triggers the development of Smart Cities and Industry 4.0. Simultaneously, the Cloud alone will benefit from increased connectivity and will remain the preferred business architecture compared with Edge-CoT. New industries and businesses will result from Edge-CoT, and existing companies will benefit mainly from an improved customer experience. Major business challenges triggered by Edge-CoT include workforce re-skilling, promotion of the agile approach and a cultural shift towards risk-taking. Research limitations/implications – The study was limited to the analysis of a selected set of cloud computing trends. Moreover, the data collection process was limited to nine cloud experts, hindering possible generalization. Originality/value – This study uses a qualitative approach to listen to market experts and cross-check their views with the theoretical findings to date, thereby bringing theory and practice closer together.

    epcAware: a game-based, energy, performance and cost efficient resource management technique for multi-access edge computing

    The Internet of Things (IoT) is producing an extraordinary volume of data daily, and the data may become useless while on its way to the cloud for analysis, due to long distances and delays. Fog/edge computing is a new model for analyzing and acting on time-sensitive data (real-time applications) at the network edge, adjacent to where it is produced; the model sends only selected data to the cloud for analysis and long-term storage. Furthermore, cloud services provided by large companies, such as Google, can also be localized to minimize response time and increase service agility. This can be accomplished by deploying small-scale datacenters (referred to as cloudlets) where essential, closer to customers (IoT devices) and connected to a centralised cloud through networks, which together form a multi-access edge cloud (MEC). The MEC setup involves three different parties, i.e. service providers (IaaS), application providers (SaaS) and network providers (NaaS), which might have different goals, making resource management a difficult job. In the literature, various resource management techniques have been suggested concerning which services providers should host and how the available resources should be allocated to customers' applications, particularly if mobility is involved. However, the existing literature considers the resource management problem with respect to a single party. In this paper, we consider resource management with respect to all three parties, i.e. IaaS, SaaS and NaaS, and suggest a game-theoretic resource management technique that minimises infrastructure energy consumption and costs while ensuring application performance. Our empirical evaluation, using real workload traces from Google's cluster, suggests that our approach can reduce energy consumption by up to 11.95% and user costs by approximately 17.86%, with negligible loss in performance. Moreover, IaaS providers can reduce their energy bills by up to 20.27% and NaaS providers can increase their cost savings by up to 18.52%, compared with other methods.
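    The game-theoretic flavour of such a technique can be sketched with best-response dynamics between two of the parties. The payoff numbers and strategy names below are entirely hypothetical, not the paper's model: they only show how players iterating best responses to each other can settle into a stable (Nash) outcome that no party wants to deviate from.

    ```python
    # Toy best-response sketch: IaaS picks a consolidation level, NaaS picks
    # a bandwidth plan; each repeatedly best-responds to the other's choice.
    # payoff[(ia, na)] = (IaaS saving, NaaS saving); higher is better.
    payoff = {
        ("low", "cheap"): (5, 4), ("low", "fast"): (3, 6),
        ("high", "cheap"): (8, 2), ("high", "fast"): (6, 7),
    }

    def best_response(player, other_choice):
        if player == 0:  # IaaS maximises its own saving given NaaS's plan
            return max(["low", "high"], key=lambda o: payoff[(o, other_choice)][0])
        return max(["cheap", "fast"], key=lambda o: payoff[(other_choice, o)][1])

    ia, na = "low", "cheap"
    for _ in range(10):  # iterate until the profile stops changing
        ia, na = best_response(0, na), best_response(1, ia)
    print(ia, na)  # a mutual best response: neither party wants to deviate
    ```

    In the paper's setting the strategy spaces are resource allocations rather than two labels, but the equilibrium-seeking structure is the same.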

    Do we all really know what a fog node is? Current trends towards an open definition

    Fog computing has emerged as a promising technology that can bring cloud applications closer to the physical IoT devices at the network edge. While it is widely known what cloud computing is, how data centers can build the cloud infrastructure and how applications can make use of this infrastructure, there is no common picture of what fog computing, and particularly a fog node as its main building block, really is. One of the first attempts to define a fog node was made by Cisco, qualifying a fog computing system as a "mini-cloud" located at the edge of the network and implemented through a variety of edge devices, interconnected by a variety of, mostly wireless, communication technologies. Thus, a fog node would be the infrastructure implementing the said mini-cloud. Other proposals have their own definition of what a fog node is, usually in relation to a specific edge device, a specific use case or an application. In this paper, we first survey the state of the art in technologies for fog computing nodes, paying special attention to the contributions that analyze the role edge devices play in the fog node definition. We summarize and compare the concepts and the lessons learned from their implementation, and show how a conceptual framework is emerging towards a unifying fog node definition. We focus on the core functionalities of a fog node, as well as on the accompanying opportunities and challenges towards their practical realization in the near future.

    Exploring the Challenges of a Flexible, Feature Rich IoT Testbed

    IoT is a field of technology of ever-growing importance in our daily lives. From smart cities, health devices, climate observations, appliances, and so much more, IoT surrounds us now more than ever. The types of devices being added to IoT networks are ever growing, and as this variety of hardware and software increases, so does the difficulty of working with them. Ensuring inter-compatibility between devices, testing new communication protocols, and writing software for emerging technologies become a complex challenge. IoT testbeds help solve this challenge: they let developers, researchers, and many other groups explore and test their IoT solutions in the context of real IoT devices. Such testbeds exist today but, as far as we know, no jack-of-all-trades testbed exists that supports all the features one might want from a testbed. This thesis introduces a first draft of a new testbed, presenting a system design, architecture, and implementation that theoretically and practically realize all these features, highlighting issues with this design and ways to tackle them, and ultimately contributing a foundation on which a powerful system could be built. The challenge the thesis aims to tackle is, in short: what are the features that make up a good testbed, and how can we incorporate them into a simple, flexible, unified system?

    Performance and efficiency optimization of multi-layer IoT edge architecture

    Abstract. The Internet of Things (IoT) has become a backbone technology that connects together various devices with diverse capabilities, enabling ubiquitously available digital services for end-users. IoT applications for mission-critical scenarios require strict performance indicators such as latency, scalability, security and privacy. To fulfil these requirements, IoT also needs support from relevant enabling technologies, such as cloud, edge, virtualization and fifth-generation mobile communication (5G) technologies. For latency-critical applications and services, the long routes between traditional cloud servers and end-devices (sensors/actuators) make computing at these distant data centres infeasible, even though traditional clouds provide very high computational and storage capacity for current IoT systems. The multi-access edge computing (MEC) model can overcome this challenge by bringing cloud computing capacity within, or next to, the access-network base stations. However, the capacity to perform the most critical processing at the local network layer is often necessary to cope with access-network issues. Therefore, this thesis compares the two existing IoT models, the traditional cloud-IoT model and a MEC-based edge-cloud-IoT model, with a proposed local edge-cloud-IoT model with respect to performance and efficiency, using the iFogSim simulator. The results consolidate our research team's previous findings that a three-tier edge-IoT architecture, capable of optimally utilizing the computational capacity of each of the three tiers, is an effective measure to reduce energy consumption, improve end-to-end latency and minimize operational costs in latency-critical IoT applications.
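    The intuition behind the three-tier comparison can be shown with a back-of-the-envelope latency model. All numbers below are hypothetical (they are not the thesis's simulation results): each request is served at the lowest tier that can handle it, so the round-trip cost grows only for the fraction of traffic escalated upwards.

    ```python
    # Hypothetical one-way network latencies to each tier, in milliseconds.
    TIERS = {"local_edge": 1, "mec": 10, "cloud": 60}

    def mean_latency(share_served):
        """Mean round-trip latency, given the fraction of requests
        resolved at each tier (fractions must sum to 1)."""
        assert abs(sum(share_served.values()) - 1.0) < 1e-9
        return sum(share * 2 * TIERS[tier]  # round trip = 2 x one-way
                   for tier, share in share_served.items())

    cloud_only = mean_latency({"local_edge": 0, "mec": 0, "cloud": 1.0})
    mec_edge   = mean_latency({"local_edge": 0, "mec": 0.8, "cloud": 0.2})
    three_tier = mean_latency({"local_edge": 0.6, "mec": 0.3, "cloud": 0.1})
    print(cloud_only, mec_edge, three_tier)  # three-tier is lowest
    ```

    Even with these made-up numbers, the ordering matches the thesis's qualitative finding: pushing more of the load to lower tiers reduces mean end-to-end latency, provided each tier has the capacity to serve its share.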