
    Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of Cloud Computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such migration of video services. Finally, we present potential research challenges and trends.
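
    The migration trigger described above, driven by request patterns and timing, can be pictured with a small sketch. The Python snippet below is illustrative only and not from the article; the REQUEST_THRESHOLD, the window counts, and the migrate() placeholder are assumptions. It marks a video service for migration from the cloud to a fog tier once demand stays above a threshold for several consecutive monitoring windows.

    # Minimal sketch of a request-pattern-driven migration trigger (assumed values).
    from collections import deque
    import time

    REQUEST_THRESHOLD = 200   # requests per monitoring window (assumed)
    SUSTAINED_WINDOWS = 3     # consecutive windows that must exceed the threshold

    class MigrationTrigger:
        def __init__(self):
            self.history = deque(maxlen=SUSTAINED_WINDOWS)

        def observe(self, requests_in_window: int) -> bool:
            """Record one monitoring window; return True when migration should start."""
            self.history.append(requests_in_window)
            return (len(self.history) == SUSTAINED_WINDOWS
                    and all(r > REQUEST_THRESHOLD for r in self.history))

    def migrate(service: str, target_tier: str) -> None:
        # Placeholder for the actual transfer of the virtualized service to a fog node.
        print(f"migrating {service} to {target_tier} at {time.time():.0f}")

    trigger = MigrationTrigger()
    for window in [150, 220, 230, 250]:   # simulated per-window request counts
        if trigger.observe(window):
            migrate("video-distribution", "fog-tier-1")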

    Pre-study on Multi-access Edge Computing at Communication Technology lab: simulator/emulator

    Multi-access Edge Computing (MEC) has been on the rise since the evolution of 5G, which still faces challenges such as latency and the growing volume of data being transferred. This paper presents the background of Multi-access Edge Computing, the state of the art of MEC research, and the implementation of a simulator to demonstrate MEC functionalities.
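
    As a rough illustration of what such a simulator or emulator can demonstrate, the sketch below compares the mean round-trip time of offloading a task to a nearby MEC host versus a distant cloud data centre. It is not the pre-study's tool; the propagation and service times are assumed values.

    # Toy edge-vs-cloud offloading latency comparison (assumed figures, in milliseconds).
    import random

    def round_trip_ms(propagation_ms: float, service_time_ms: float) -> float:
        """One offloading request: network delay both ways plus processing time and jitter."""
        return 2 * propagation_ms + service_time_ms + random.uniform(0.0, 2.0)

    def mean_rtt(propagation_ms: float, service_time_ms: float, runs: int = 1000) -> float:
        return sum(round_trip_ms(propagation_ms, service_time_ms) for _ in range(runs)) / runs

    print("edge :", mean_rtt(propagation_ms=2.0, service_time_ms=8.0))    # nearby MEC host
    print("cloud:", mean_rtt(propagation_ms=40.0, service_time_ms=5.0))   # distant data centre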

    MEC vs MCC: performance analysis of real-time applications

    Numerous applications, such as Augmented Reality (AR), Virtual Reality (VR), and real-time online gaming, are resource-intensive and consequently push the computational requirements and energy demands of mobile devices beyond their capabilities. Although the mobile cloud architecture offers practical and functional platforms, these emerging applications present several challenges regarding latency, energy consumption, context awareness, and privacy. Mobile Edge Computing (MEC) is a new, resourceful intermediary technology that addresses the performance hurdles faced by Mobile Cloud Computing (MCC) by bringing computing and storage closer to the network edge. This work introduces the MEC architecture and the main types of solutions for its implementation. It presents the reference architecture of the cloudlet technology and compares it with the architecture model still under development and standardization by ETSI. One of the purposes of MEC is to offload intensive application tasks from devices in order to improve the computation, responsiveness, and battery life of mobile devices. The objective of this work is to study, compare, and evaluate the performance of the MEC and MCC architectures for provisioning offloaded tasks from compute-intensive applications. Test scenarios were set up with such applications on both MEC and MCC implementations. The test results show that MEC performs better than MCC with respect to latency and user quality of experience, and they make it possible to quantify the effective benefit of the MEC approach.
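
    The offloading trade-off evaluated in this work can be summarised with a back-of-the-envelope model: a task is worth offloading when remote execution plus data transfer beats local execution. The sketch below is illustrative only; the task sizes, CPU speeds, bandwidths, and round-trip times are hypothetical, not the thesis's measured test scenarios.

    # Classic offloading model with hypothetical parameters.
    def local_time_s(cycles: float, local_cpu_hz: float) -> float:
        return cycles / local_cpu_hz

    def offload_time_s(cycles: float, data_bits: float, bandwidth_bps: float,
                       rtt_s: float, server_cpu_hz: float) -> float:
        return rtt_s + data_bits / bandwidth_bps + cycles / server_cpu_hz

    task_cycles, task_bits = 5e9, 8e6   # a compute-intensive AR/VR-style workload unit
    local = local_time_s(task_cycles, 1.5e9)
    mec = offload_time_s(task_cycles, task_bits, 100e6, 0.005, 8e9)   # nearby edge server
    mcc = offload_time_s(task_cycles, task_bits, 20e6, 0.080, 16e9)   # remote cloud
    print(f"local {local:.3f}s  MEC {mec:.3f}s  MCC {mcc:.3f}s")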

    Continuous QoS-compliant Orchestration in the Cloud-Edge Continuum

    The problem of managing multi-service applications on top of Cloud-Edge networks in a QoS-aware manner has been thoroughly studied in recent years from a decision-making perspective. However, only a few studies have addressed the problem of actively enforcing such decisions while orchestrating multi-service applications and considering infrastructure and application variations. In this article, we propose a next-gen orchestrator prototype based on Docker to achieve the continuous and QoS-compliant management of multi-service applications on top of geographically distributed Cloud-Edge resources, in continuity with CI/CD pipelines and infrastructure monitoring tools. Finally, we assess our proposal over a geographically distributed testbed across Italy.
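
    A continuous, QoS-compliant control loop of the kind described can be sketched on top of the plain Docker CLI. The snippet below is an assumption-laden illustration rather than the article's prototype: the node endpoints, image name, latency SLO, and measure_latency_ms probe are all hypothetical.

    # Sketch of a QoS-driven re-deployment loop using the Docker CLI over remote endpoints.
    import subprocess, time

    IMAGE = "registry.example.org/demo-service:latest"             # hypothetical image
    NODES = ["tcp://edge-node-1:2375", "tcp://cloud-node-1:2375"]  # hypothetical Docker endpoints
    LATENCY_SLO_MS = 50.0

    def deploy(node: str) -> None:
        subprocess.run(["docker", "-H", node, "run", "-d", "--name", "demo-service", IMAGE], check=True)

    def remove(node: str) -> None:
        subprocess.run(["docker", "-H", node, "rm", "-f", "demo-service"], check=False)

    def measure_latency_ms(node: str) -> float:
        # Placeholder probe; a real orchestrator would query its monitoring tool here.
        return 80.0 if "cloud" in node else 20.0

    current = NODES[1]
    deploy(current)
    while True:                                   # runs until stopped
        if measure_latency_ms(current) > LATENCY_SLO_MS:
            target = NODES[0] if current != NODES[0] else NODES[1]
            deploy(target)                        # start the replica before tearing down the old one
            remove(current)
            current = target
        time.sleep(30)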

    Implementing and evaluating an ICON orchestrator

    Over the last 20 years, the cloud computing paradigm has risen to the task of bringing powerful computational services to the masses. Centralizing computer hardware in a few large data centers has brought large monetary savings, but at the cost of a greater geographical distance between the server and the client. As a new generation of thin clients has emerged, e.g. smartphones and IoT devices, the larger latencies induced by these greater distances can limit the applications that could benefit from the vast resources available in cloud computing. Not long after the explosive growth of cloud computing, a new paradigm, edge computing, has risen. Edge computing aims to bring the resources generally found in cloud computing closer to the edge, where many of the end-users, clients, and data producers reside. In this thesis, I will present the edge computing concept as well as the technologies enabling it. Furthermore, I will show a few edge computing concepts and architectures, including multi-access edge computing (MEC), fog computing, and intelligent containers (ICON). Finally, I will also present a new edge orchestrator, the ICON Python Orchestrator (IPO), that enables intelligent containers to migrate closer to the users. The ICON Python Orchestrator tests the feasibility of the ICON concept and provides performance measurements that can be compared to other contemporary edge computing implementations. I will present the IPO architecture design, including challenges encountered during the implementation phase and solutions to specific problems, and I will also show the testing and validation setup. Using the artificial testing and validation network, client migration speeds were measured in three cases: redirection, cache-hot ICON migration, and cache-cold ICON migration. While there is room for improvement, the measured migration speeds are on par with other edge computing implementations.
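
    The three measured cases imply very different client switch-over times. The toy harness below is not IPO code; its sleep() calls merely stand in for the real redirection, image pull, and container start steps, with assumed durations.

    # Toy timing harness for redirection vs. cache-hot vs. cache-cold migration.
    import time

    def redirection():
        time.sleep(0.05)                                    # update the client's endpoint only

    def cache_hot_migration():
        time.sleep(0.05); time.sleep(0.3)                   # redirect + start container from a local image

    def cache_cold_migration():
        time.sleep(0.05); time.sleep(2.0); time.sleep(0.3)  # redirect + pull image + start

    for case in (redirection, cache_hot_migration, cache_cold_migration):
        start = time.perf_counter()
        case()
        print(f"{case.__name__}: {time.perf_counter() - start:.2f}s client switch-over")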

    Orchestration from the cloud to the edge

    The effective management of complex and heterogeneous computing environments is one of the biggest challenges that service and infrastructure providers face in the Cloud-to-Thing continuum era. Advanced orchestration systems are required to support the resource management of large-scale cloud data centres integrated with the big-data-generating IoT devices. The orchestration system should be aware of all available resources and their current status in order to perform dynamic allocations and enable rapid deployment of applications. This chapter will review the state of the art with regard to orchestration along the Cloud-to-Thing continuum, with a specific emphasis on container-based orchestration (e.g. Docker Swarm and Kubernetes) and fog-specific orchestration architectures (e.g. SORTS, SOAFI, ETSI ISG MEC, and CONCERT).
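
    The resource awareness described above can be illustrated with a toy first-fit placement over nodes of the continuum. The sketch is a deliberate simplification and not the placement logic of Docker Swarm, Kubernetes, or any of the fog architectures listed; node names and capacities are invented.

    # Toy resource-aware placement across Cloud-to-Thing continuum nodes.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Node:
        name: str
        free_cpu: float    # cores currently available
        free_mem_mb: int

    def place(nodes: List[Node], cpu: float, mem_mb: int) -> Optional[str]:
        """First-fit placement; a real orchestrator would also weigh locality and latency."""
        for node in nodes:
            if node.free_cpu >= cpu and node.free_mem_mb >= mem_mb:
                node.free_cpu -= cpu
                node.free_mem_mb -= mem_mb
                return node.name
        return None

    cluster = [Node("edge-gw-1", 0.5, 256), Node("fog-1", 2.0, 2048), Node("cloud-dc", 64.0, 262144)]
    print(place(cluster, cpu=1.0, mem_mb=1024))   # -> fog-1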

    Edge Computing for Extreme Reliability and Scalability

    The massive number of Internet of Things (IoT) devices and their continuous data collection will lead to a rapid increase in the scale of collected data. Processing all these collected data at the central cloud server is inefficient, and may even be infeasible or unnecessary. Hence, the task of processing the data is pushed to the network edges, introducing the concept of Edge Computing. Processing the information closer to the source of data (e.g., on gateways and edge micro-servers) not only reduces the huge workload of the central cloud but also decreases the latency for real-time applications by avoiding the unreliable and unpredictable network delays of communicating with the central cloud.
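
    The workload-reduction argument can be made concrete with a small example: an edge gateway aggregates raw IoT readings locally and forwards only a summary, so the central cloud receives a fraction of the collected data. The readings and the summarise() helper below are assumptions for illustration.

    # Edge-side aggregation: send one summary instead of every raw sample.
    raw_readings = [20.1, 20.3, 19.8, 20.0, 20.2] * 200   # 1000 samples collected at the edge

    def summarise(window):
        return {"count": len(window), "min": min(window), "max": max(window),
                "mean": sum(window) / len(window)}

    summary = summarise(raw_readings)
    print(f"sent to cloud: 1 summary instead of {len(raw_readings)} raw samples -> {summary}")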