844 research outputs found

    Towards delay-aware container-based Service Function Chaining in Fog Computing

    The fifth-generation mobile network (5G) has recently attracted significant attention. Empowered by Network Function Virtualization (NFV), 5G networks aim to support diverse services from different business verticals (e.g. Smart Cities, Automotive). To fully leverage NFV, services must be connected in a specific order, forming a Service Function Chain (SFC). SFCs allow mobile operators to benefit from the high flexibility and low operational costs introduced by network softwarization. Additionally, Cloud computing is evolving towards a distributed paradigm called Fog Computing, which aims to provide a distributed cloud infrastructure by placing computational resources close to end-users. However, most SFC research focuses only on Multi-access Edge Computing (MEC) use cases in which mobile operators deploy services close to end-users. Bi-directional communication between edges and the cloud is not considered in MEC, yet it is highly important in a Fog environment, for instance in distributed anomaly detection services. Therefore, in this paper, we propose an SFC controller to optimize the placement of service chains in Fog environments, specifically tailored to Smart City use cases. Our approach has been validated on Kubernetes, an open-source orchestrator for the automatic deployment of micro-services. Our SFC controller is implemented as an extension to the scheduling features available in Kubernetes, enabling the efficient provisioning of container-based SFCs while optimizing resource allocation and reducing end-to-end (E2E) latency. Results show that the proposed approach can lower network latency by up to 18% for the studied use case while conserving bandwidth compared to the default scheduling mechanism.
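
    The abstract does not detail how the scheduling extension is wired in; one common way to extend Kubernetes scheduling is a scheduler-extender webhook that re-scores candidate nodes. The sketch below is a minimal, hypothetical "prioritize" endpoint that favours low-latency nodes; the label name, port and scoring rule are invented, and this is not the authors' SFC controller.

```python
# Hypothetical sketch of a latency-aware "prioritize" webhook for a Kubernetes
# scheduler extender. The node label name, port and scoring rule are invented;
# this is NOT the paper's SFC controller. Field casing follows the extender/v1
# JSON tags (lowercase); adjust if an older extender API version is in use.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

RTT_LABEL = "example.com/rtt-ms"  # assumed node label carrying RTT to end-users


class PrioritizeHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        args = json.loads(self.rfile.read(length))              # ExtenderArgs
        nodes = (args.get("nodes") or args.get("Nodes") or {}).get("items", [])
        rtts = {
            n["metadata"]["name"]:
                float(n["metadata"].get("labels", {}).get(RTT_LABEL, "100"))
            for n in nodes
        }
        worst = max(rtts.values(), default=1.0) or 1.0
        # HostPriorityList: lower round-trip time => higher score (0..10).
        result = [{"host": name, "score": int(round(10 * (1 - rtt / worst)))}
                  for name, rtt in rtts.items()]
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(result).encode())


if __name__ == "__main__":
    HTTPServer(("", 8888), PrioritizeHandler).serve_forever()
```

    Such an extender is referenced from the kube-scheduler configuration, and its scores are combined with those of the default plugins; the paper's controller may equally well be a custom scheduler, which the abstract does not specify.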

    Performance and efficiency optimization of multi-layer IoT edge architecture

    Abstract. The Internet of Things (IoT) has become a backbone technology that connects devices with diverse capabilities and enables ubiquitously available digital services for end-users. IoT applications for mission-critical scenarios have strict requirements on performance indicators such as latency, scalability, security and privacy. To fulfil these requirements, IoT needs support from enabling technologies such as cloud, edge, virtualization and fifth-generation mobile communication (5G) technologies. For latency-critical applications and services, the long routes between traditional cloud servers and end-devices (sensors/actuators) make computing in those data centres infeasible, even though traditional clouds provide very high computational and storage capacity for current IoT systems. The Multi-access Edge Computing (MEC) model can be used to overcome this challenge by bringing cloud computational capacity within, or next to, the access network base stations. However, the capacity to perform the most critical processing at the local network layer is often necessary to cope with access network issues. Therefore, this thesis compares two existing IoT models, the traditional cloud-IoT model and a MEC-based edge-cloud-IoT model, with a proposed local edge-cloud-IoT model with respect to performance and efficiency, using the iFogSim simulator. The results consolidate our research team's previous findings that a three-tier edge-IoT architecture, capable of optimally utilizing the computational capacity of each of the three tiers, is an effective measure to reduce energy consumption, improve end-to-end latency and minimize operational costs in latency-critical IoT applications.
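
    iFogSim, the simulator used in the thesis, is Java-based; the short Python model below is only a back-of-the-envelope illustration, with invented hop latencies and processing times, of why placing the most latency-critical processing at a local edge tier shortens the end-to-end path compared with MEC-only or cloud-only placement.

```python
# Illustrative (not iFogSim) model of end-to-end latency for the three
# architectures compared in the thesis. All numbers are invented placeholders.

LINK_RTT_MS = {                       # round-trip time per hop (assumed values)
    ("device", "local_edge"): 2,
    ("local_edge", "mec"): 10,
    ("mec", "cloud"): 80,
}
PROC_MS = {"local_edge": 8, "mec": 5, "cloud": 2}   # assumed processing times

def e2e_latency(path, compute_tier):
    """Sum the hop RTTs along `path` plus processing time at `compute_tier`."""
    hops = zip(path, path[1:])
    return sum(LINK_RTT_MS[h] for h in hops) + PROC_MS[compute_tier]

architectures = {
    "cloud-IoT":            (["device", "local_edge", "mec", "cloud"], "cloud"),
    "MEC edge-cloud-IoT":   (["device", "local_edge", "mec"], "mec"),
    "local edge-cloud-IoT": (["device", "local_edge"], "local_edge"),
}
for name, (path, tier) in architectures.items():
    print(f"{name:22s} ~{e2e_latency(path, tier)} ms")
```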

    System Abstractions for Scalable Application Development at the Edge

    Recent years have witnessed an explosive growth of Internet of Things (IoT) devices, which collect or generate huge amounts of data. Given diverse device capabilities and application requirements, data processing takes place across a range of settings, from on-device to a nearby edge server/cloud and the remote cloud. Consequently, edge-cloud coordination has been studied extensively from the perspectives of job placement, scheduling and joint optimization. Typical approaches focus on performance optimization for individual applications; this not only requires domain knowledge of the applications but also leads to application-specific solutions. Application development and deployment over diverse scenarios thus incur repetitive manual effort. There are two overarching challenges to providing system-level support for application development at the edge. First, there is inherent heterogeneity at the device hardware level. Execution settings range from a small cluster serving as an edge cloud to on-device inference on embedded devices, differing in hardware capability and programming environment. Further, application performance requirements vary significantly, making it even more difficult to map different applications onto already heterogeneous hardware. Second, there are trends towards incorporating both edge and cloud, as well as multi-modal data. Together, these add further dimensions to the design space and increase the complexity significantly. In this thesis, we propose a novel framework to simplify application development and deployment over the edge-to-cloud continuum. Our framework provides key connections between different dimensions of design considerations, corresponding to the application abstraction, data abstraction and resource management abstraction, respectively. First, our framework masks hardware heterogeneity with abstract resource types through containerization, and abstracts application processing pipelines into generic flow graphs. It further supports a notion of degradable computing for application scenarios at the edge that are driven by multi-modal sensory input. Next, as video analytics is the killer app of edge computing, we include a generic data management service between video query systems and a video store to organize video data at the edge. We propose a video data unit abstraction based on a notion of distance between objects in the video, quantifying the semantic similarity among video data. Last, considering concurrent application execution, our framework supports multi-application offloading with device-centric control, via a userspace scheduler service that wraps around the operating system scheduler.
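
    To make the first two abstractions concrete, the sketch below shows one possible encoding (invented, not the thesis's actual API): pipeline stages declare an abstract resource type rather than a concrete device, and a generic flow graph is greedily mapped onto whatever sites advertise that type.

```python
# Minimal sketch of the abstractions described above: processing stages declare
# an *abstract* resource type, and a generic flow graph is mapped onto whatever
# concrete sites are available. All names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    resource_type: str            # e.g. "cpu-small", "gpu": abstract, not a device
    downstream: list = field(default_factory=list)

@dataclass
class Site:
    name: str
    provides: set                 # abstract resource types this site can host

def place(stages, sites):
    """Greedily map each stage to the first site offering its resource type."""
    mapping = {}
    for stage in stages:
        site = next((s for s in sites if stage.resource_type in s.provides), None)
        if site is None:
            raise RuntimeError(f"no site offers {stage.resource_type!r}")
        mapping[stage.name] = site.name
    return mapping

# Example flow graph: camera decode -> DNN inference -> aggregation.
decode  = Stage("decode", "cpu-small")
infer   = Stage("infer", "gpu")
combine = Stage("combine", "cpu-small")
decode.downstream, infer.downstream = [infer], [combine]

sites = [Site("edge-node", {"cpu-small"}), Site("gpu-cluster", {"gpu", "cpu-small"})]
print(place([decode, infer, combine], sites))
# {'decode': 'edge-node', 'infer': 'gpu-cluster', 'combine': 'edge-node'}
```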

    Enabling Artificial Intelligence Analytics on The Edge

    This thesis introduces a novel distributed model for handling edge-based video analytics in real time. The novelty of the model lies in decoupling and distributing the services into several decomposed functions, creating virtual function chains (the VFC model). The model considers both computational and communication constraints. Theoretical, simulation and experimental results have shown that the VFC model can enable the support of heavy-load services in an edge environment while improving the footprint of the service compared to state-of-the-art frameworks. In detail, results on the VFC model have shown that it can reduce the total edge cost compared with a monolithic model and a simple frame-distribution model. For experimentation on a real-case scenario, a testbed edge environment has been developed, where the aforementioned models, as well as a general distribution framework (Apache Spark), have been deployed. A cloud service has also been considered. Experiments have shown that VFC can outperform all alternative approaches by reducing operational cost and improving QoS. Finally, a migration model, a caching model and a QoS monitoring service based on Long Short-Term Memory (LSTM) models are introduced.
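
    As a rough illustration of the chaining idea, the toy model below compares a monolithic deployment with the same service decomposed into a chain of functions, charging each option for compute and for data moved between consecutive functions. The cost model and every number are invented; this is not the thesis's evaluation.

```python
# Toy cost comparison in the spirit of the VFC idea: a monolithic service on a
# single edge node versus the same service decomposed into a chain of functions
# spread over several nodes. All parameters are invented for illustration.

def chain_cost(functions, transfer_mb, compute_cost_per_unit, transfer_cost_per_mb):
    """Total cost = compute of every function + transfers between adjacent ones."""
    compute = sum(load * compute_cost_per_unit for _, load in functions)
    comm = sum(mb * transfer_cost_per_mb for mb in transfer_mb)
    return compute + comm

# Decomposed chain: decode -> detect -> track -> annotate (load in arbitrary units)
chain = [("decode", 2), ("detect", 8), ("track", 3), ("annotate", 1)]
inter_transfers = [1.5, 0.2, 0.1]     # MB moved between consecutive functions

monolithic = chain_cost([("all-in-one", 14)], [], compute_cost_per_unit=1.5,
                        transfer_cost_per_mb=0.0)   # one overloaded node
distributed = chain_cost(chain, inter_transfers, compute_cost_per_unit=1.0,
                         transfer_cost_per_mb=0.5)  # cheaper compute, paid in comm

print(f"monolithic ~{monolithic:.1f}, VFC-style chain ~{distributed:.1f}")
```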

    EdgeAP: Enabling Edge Computing on Wireless Access Points

    With the rise of the Internet of Things (IoT) leading to an explosion in the number of internet-connected devices, the current cloud computing paradigm is approaching its limits. Moving data back and forth between its origin and a far-away data center leads to issues regarding privacy, latency, and energy consumption. Edge computing, which instead processes data as close to its origin as possible, offers a promising solution to the pitfalls of cloud computing. Our proof-of-concept edge computing platform, EdgeAP, is a programmable platform for the delivery of applications on wireless access points. Use cases of the platform will be demonstrated via an example application. Additionally, the viability of edge computing on wireless access points will be thoroughly evaluated.