
    A survey of multi-access edge computing in 5G and beyond : fundamentals, technology integration, and state-of-the-art

    Driven by the emergence of new compute-intensive applications and the vision of the Internet of Things (IoT), it is foreseen that the emerging 5G network will face an unprecedented increase in traffic volume and computation demands. However, end users mostly have limited storage capacity and finite processing capabilities, so how to run compute-intensive applications on resource-constrained devices has recently become a natural concern. Mobile edge computing (MEC), a key technology in the emerging fifth generation (5G) network, can optimize mobile resources by hosting compute-intensive applications, process large volumes of data before sending them to the cloud, provide cloud-computing capabilities within the radio access network (RAN) in close proximity to mobile users, and offer context-aware services with the help of RAN information. Therefore, MEC enables a wide variety of applications where real-time response is strictly required, e.g., driverless vehicles, augmented reality, robotics, and immersive media. Indeed, the paradigm shift from 4G to 5G could become a reality with the advent of new technological concepts. The successful realization of MEC in the 5G network is still in its infancy and demands constant effort from both the academic and industry communities. In this survey, we first provide a holistic overview of MEC technology and its potential use cases and applications. Then, we outline up-to-date research on the integration of MEC with the new technologies that will be deployed in 5G and beyond. We also summarize testbeds, experimental evaluations, and open-source activities for edge computing. We further summarize lessons learned from state-of-the-art research works and discuss challenges and potential future directions for MEC research.
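    The central trade-off described above, running a compute-intensive task on a resource-constrained device versus offloading it to a MEC server in the RAN, can be illustrated with a simple latency and energy comparison. The sketch below is purely illustrative: the linear execution and transmission models and all parameter values are assumptions, not taken from the survey.

```python
# Illustrative offloading decision: run a task locally or on a MEC server.
# The simple linear models and all constants below are assumptions for
# illustration; they are not taken from the surveyed papers.

def local_cost(cycles, cpu_hz, energy_per_cycle):
    """Latency (s) and energy (J) of executing the task on the device."""
    latency = cycles / cpu_hz
    energy = cycles * energy_per_cycle
    return latency, energy

def offload_cost(input_bits, cycles, uplink_bps, server_hz, tx_power_w):
    """Latency (s) and device energy (J) of offloading to the edge server."""
    tx_time = input_bits / uplink_bps      # upload the task input
    exec_time = cycles / server_hz         # remote execution on the MEC server
    latency = tx_time + exec_time          # downlink of the result assumed negligible
    energy = tx_power_w * tx_time          # device only spends energy transmitting
    return latency, energy

def should_offload(task, deadline_s):
    l_lat, l_en = local_cost(task["cycles"], task["cpu_hz"], task["e_per_cycle"])
    o_lat, o_en = offload_cost(task["bits"], task["cycles"],
                               task["uplink_bps"], task["server_hz"], task["tx_power_w"])
    # Offload if local execution cannot meet the deadline but offloading can,
    # or if offloading meets the deadline with less device energy.
    if l_lat > deadline_s and o_lat <= deadline_s:
        return True
    return o_lat <= deadline_s and o_en < l_en

task = {"cycles": 2e9, "bits": 4e6, "cpu_hz": 1e9, "e_per_cycle": 1e-9,
        "uplink_bps": 20e6, "server_hz": 10e9, "tx_power_w": 0.5}
print(should_offload(task, deadline_s=0.5))
```

    Under these assumed numbers, offloading meets a 0.5 s deadline that local execution cannot, while also spending less device energy.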

    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces several challenges. IoT end devices are designed primarily for uplink communication of small observations toward the network; hence, opportunistically using end devices as relays requires a redesign of the medium access control (MAC) layer protocol of such end devices and possibly the addition of new communication interfaces. Additionally, the wake-up times of IoT end devices need to be synchronized with those of the relays. For cellular-based IoT, the possibility of using infrastructure relays exists, and noncellular IoT networks can leverage the presence of mobile devices for relaying, for example, in remote healthcare. However, the latter presents the problems of incentivizing relay participation and managing the mobility of relays. Furthermore, although relays can increase the lifetime of IoT networks, deploying relays implies the need for additional batteries to power them, which can erode the energy-efficiency gain that relays offer. Designing relay-assisted IoT networks that provide acceptable trade-offs is therefore key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as demonstrated in the available literature. Works that consider these issues are surveyed in this paper to provide insight into the state of the art, offer design insights for network designers, and motivate future research directions.
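    The energy argument above, that relaying can reduce what is spent on single-hop long-range transmission, can be sketched with the classic first-order radio model, in which amplifier energy grows super-linearly with distance. The constants and path-loss exponent below are illustrative assumptions, not values from the surveyed works.

```python
# First-order radio model sketch: compare direct long-range transmission
# with two-hop relaying. Constants are illustrative assumptions only.

E_ELEC = 50e-9        # J/bit spent in transmit/receive electronics
EPS_AMP = 100e-12     # J/bit/m^ALPHA spent in the transmit amplifier
ALPHA = 2.7           # path-loss exponent (greater than 2 outdoors)

def tx_energy(bits, distance_m):
    """Energy to transmit `bits` over `distance_m` metres."""
    return bits * (E_ELEC + EPS_AMP * distance_m ** ALPHA)

def rx_energy(bits):
    """Energy to receive `bits`."""
    return bits * E_ELEC

def direct(bits, d):
    # End device transmits straight to the gateway over the full distance.
    return tx_energy(bits, d)

def relayed(bits, d):
    # End device -> relay at d/2; relay receives and forwards to the gateway.
    return tx_energy(bits, d / 2) + rx_energy(bits) + tx_energy(bits, d / 2)

bits, d = 8 * 50, 1000.0   # a 50-byte IoT observation over 1 km
print(f"direct : {direct(bits, d):.2e} J")
print(f"relayed: {relayed(bits, d):.2e} J (total, device + relay)")
```

    With these assumed constants, two hops of 500 m cost noticeably less total energy than one 1 km hop, which is the basic case for relaying, although, as noted above, powering the relay itself can erode part of this gain.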

    Leveraging Resources on Anonymous Mobile Edge Nodes

    Smart devices have become an essential component of everyday life. The rapid rise of smartphones, IoT devices, and wearables has enabled applications that were not possible a few years ago, e.g., health monitoring and online banking, while smart sensing has laid the infrastructure for smart homes and smart cities. The pervasive nature of smart devices grants access to huge amounts of raw data, and researchers have seized the moment with complex algorithms and data models that process these data over the cloud to extract as much information as possible. However, the pace and volume of data generation, together with the networking protocols that transmit data to cloud servers, fall short: no more than 20% of what is generated at the edge of the network is ever touched. On the other hand, smart devices carry a large set of resources, e.g., CPU, memory, and camera, that sit idle most of the time. Studies show that these resources are either idle for long stretches, e.g., while users sleep or eat, or underutilized, e.g., inertial sensors during phone calls. These findings articulate a problem: large data sets go unprocessed while idle resources sit in close proximity. In this dissertation, we propose harvesting underutilized edge resources and using them to process the huge amount of data that is generated, and currently wasted, by applications running at the edge of the network. We propose flipping the concept of cloud computing: instead of sending massive amounts of data to the cloud for processing, we distribute lightweight applications that process data on users' smart devices. We envision this approach enhancing network bandwidth, granting access to larger datasets, providing low-latency responses, and, more importantly, involving up-to-date user context in the processing. However, such benefits come with a set of challenges: How to locate suitable resources? How to match resources with data providers? How to inform resources what to do, and when? How to orchestrate applications' execution on multiple devices? And how to communicate between devices at the edge? Communication between edge devices has different parameters in terms of device mobility, topology, and data rate. Standard protocols, e.g., Wi-Fi or Bluetooth, were not designed for edge computing and hence do not offer a perfect match. Edge computing requires a lightweight protocol that provides quick device discovery, a decent data rate, and multicasting to devices in proximity. Bluetooth enjoys wide acceptance within the IoT community; however, its low data rate and unicast communication limit its use at the edge. Moreover, although Bluetooth is otherwise the most suitable communication protocol for edge computing, its closed-source stack keeps the lower layers off limits to research, enhancement, and customization. Hence, we offer an open-source version of Bluetooth and then customize it for edge-computing applications. In this dissertation, we propose Leveraging Resources on Anonymous Mobile Edge Nodes (LAMEN), a three-tier framework in which edge devices are clustered by proximity. Given an application to execute, LAMEN clusters discover and allocate resources, share the application's executable with those resources, and estimate incentives for each participating resource. In each cluster, a single head node, i.e., the mediator, is responsible for resource discovery and allocation. Mediators orchestrate cluster resources and present them as a single, virtually large, homogeneous resource.
    For example, two devices each offering either a camera or a speaker are presented outside the cluster as a single device with both a camera and a speaker; this extends to any combination of resources. The mediator then handles the distribution of applications within its cluster as needed. We also provide a communication protocol that is customizable to the edge environment and the application's needs. Pushing lightweight applications that end devices can execute over their locally generated data has the following benefits: first, it avoids sharing user data with cloud servers, which is a privacy concern for many users; second, it introduces mediators as local cloud controllers closer to the edge; third, it hides the user's identity behind the mediators; and finally, it improves bandwidth utilization by keeping raw data at the edge and transmitting only processed information. Our evaluation shows optimized resource-lookup and application-assignment schemes, as well as scalability in handling networks with a large number of devices. To overcome the communication challenges, we provide an open-source communication protocol that we customize for edge-computing applications but that can also be used beyond the scope of LAMEN. Finally, we present three applications to show how LAMEN enables various application domains at the edge of the network. In summary, we propose a framework that orchestrates underutilized resources at the edge of the network to process data generated in their proximity. Using the approaches explained later in the dissertation, we show how LAMEN enhances the performance of existing applications and enables a new set of applications that were previously not feasible.
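    A core step in LAMEN is the mediator matching an application's resource requirements against the resources offered by devices in its cluster and exposing them as one virtual device. The sketch below shows one simple greedy way such matching could work; the data structures and function names are hypothetical and are not LAMEN's actual interfaces.

```python
# Hypothetical sketch of a mediator matching application requirements to
# cluster devices. Names and structures are illustrative, not LAMEN's API.

def match_resources(required, devices):
    """Greedily assign each required resource to an available device that offers it.

    required: list of resource names, e.g. ["camera", "speaker"]
    devices:  dict mapping device_id -> set of offered resources
    Returns a dict resource -> device_id, or None if any requirement is unmet.
    """
    assignment = {}
    busy = set()
    for resource in required:
        candidates = [dev for dev, offered in devices.items()
                      if resource in offered and dev not in busy]
        if not candidates:
            return None            # the cluster cannot serve this application
        chosen = candidates[0]     # a real mediator could rank by battery, load, incentives, ...
        assignment[resource] = chosen
        busy.add(chosen)           # spread load: one resource per device in this sketch
    return assignment

cluster = {"phone-A": {"camera", "gps"}, "phone-B": {"speaker"}, "tablet-C": {"camera"}}
print(match_resources(["camera", "speaker"], cluster))
# e.g. {'camera': 'phone-A', 'speaker': 'phone-B'} -- the cluster appears from
# outside as one virtual device exposing both a camera and a speaker.
```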

    An event-aware cluster-head rotation algorithm for extending lifetime of wireless sensor Network with smart nodes

    Smart sensor nodes can process data collected from sensors, make decisions, and recognize relevant events based on the sensed information before sharing it with other nodes. In wireless sensor networks, smart sensor nodes are usually grouped in clusters for effective cooperation, and one sensor node in each cluster must act as the cluster head. The cluster head depletes its energy resources faster than the other nodes; thus, the cluster-head role must be periodically reassigned (rotated) among the sensor nodes to achieve a long network lifetime. This paper introduces a method for extending the lifetime of wireless sensor networks with smart nodes. The proposed method combines a new algorithm for rotating the cluster-head role among sensor nodes with suppression of unnecessary data transmissions. It enables effective control of the cluster-head rotation based on the expected energy consumption of the sensor nodes, which is estimated using a lightweight model that takes transmission probabilities into account. The method was implemented in a prototype wireless sensor network, and during the experimental evaluation, detailed measurements of lifetime and energy consumption were conducted on a real deployment. Results of these realistic experiments reveal that the proposed method extends the network lifetime compared with state-of-the-art cluster-head rotation algorithms.
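    The rotation decision described above is driven by the expected energy consumption of each node, estimated with a lightweight model that accounts for transmission probabilities. The sketch below illustrates one plausible form of such a decision; the energy constants and the selection rule are assumptions, not the algorithm from the paper.

```python
# Illustrative event-aware cluster-head (CH) rotation. The expected-energy
# model and the selection rule are assumptions, not the paper's algorithm.

E_RX = 0.3e-3      # J: CH cost to receive one member report
E_FWD = 1.0e-3     # J: CH cost to aggregate and forward to the sink per round

def expected_head_cost(candidate, nodes):
    """Expected energy the candidate would spend per round as cluster head.

    Each other node transmits only with probability p_tx (reports of
    irrelevant events are suppressed), so the receive cost is probabilistic.
    """
    expected_reports = sum(n["p_tx"] for n in nodes if n["id"] != candidate["id"])
    return expected_reports * E_RX + E_FWD

def pick_cluster_head(nodes):
    """Rotate the CH role to the node expected to survive the most rounds as head."""
    def rounds_as_head(n):
        return n["energy"] / expected_head_cost(n, nodes)
    return max(nodes, key=rounds_as_head)["id"]

nodes = [
    {"id": "n1", "energy": 0.9, "p_tx": 0.8},
    {"id": "n2", "energy": 1.4, "p_tx": 0.1},
    {"id": "n3", "energy": 1.1, "p_tx": 0.5},
]
print(pick_cluster_head(nodes))   # picks the node with the best energy-to-cost ratio
```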

    A Survey on Energy-Efficient Strategies in Static Wireless Sensor Networks

    This article conducts a comprehensive analysis of energy-efficient strategies in static Wireless Sensor Networks (WSNs) that are not equipped with energy-harvesting modules. First, a novel generic mathematical definition of Energy Efficiency (EE) is proposed, which simultaneously takes the acquisition rate of valid data, the total energy consumption, and the network lifetime of WSNs into consideration. To the best of our knowledge, this is the first time that the EE of WSNs has been mathematically defined. The energy-consumption characteristics of each individual sensor node and of the whole network are expounded at length. Accordingly, the concepts concerning EE, namely the Energy-Efficient Means, the Energy-Efficient Tier, and the Energy-Efficient Perspective, are proposed. Subsequently, the relevant energy-efficient strategies proposed from 2002 to 2019 are tracked, reviewed, and classified into five categories: the Energy-Efficient Media Access Control protocol, the Mobile Node Assistance Scheme, the Energy-Efficient Clustering Scheme, the Energy-Efficient Routing Scheme, and the Compressive Sensing-based Scheme. Both the basic principles and the evolution of these strategies are elaborated in detail. Finally, the categories are analyzed further and conclusions are drawn; specifically, the interdependence among the categories, and their relationships to the Energy-Efficient Means, the Energy-Efficient Tier, and the Energy-Efficient Perspective, are analyzed in detail. In addition, the specific scenarios applicable to each category are detailed, and a statistical analysis is provided, with the proportion and number of citations for each category illustrated in statistical charts. The article concludes by pointing out the existing opportunities and challenges facing WSNs in the context of new computing paradigms, as well as feasible future directions concerning EE.
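    The abstract states that the proposed EE definition jointly captures the acquisition rate of valid data, the total energy consumption, and the network lifetime, but it does not reproduce the formula. One plausible way to combine the three quantities, given here only for intuition and explicitly not the authors' actual definition, is:

```latex
% Illustrative only: one plausible combination of the three factors named in
% the abstract, NOT the definition proposed in the surveyed article.
\mathrm{EE} \;=\; \frac{R_{\mathrm{valid}} \cdot T_{\mathrm{lifetime}}}{E_{\mathrm{total}}}
```

    Here R_valid is the valid-data acquisition rate (bits/s), T_lifetime the network lifetime (s), and E_total the total energy consumed (J), so the ratio rewards networks that deliver more valid data for longer while spending less energy.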

    A Prospective Look: Key Enabling Technologies, Applications and Open Research Topics in 6G Networks

    The fifth generation (5G) mobile networks are envisaged to enable a plethora of breakthrough advancements in wireless technologies, providing support for a diverse set of services over a single platform. While the deployment of 5G systems is scaling up globally, it is time to look ahead to beyond-5G systems. This is driven by emerging societal trends calling for fully automated systems and intelligent services supported by extended reality and haptic communications. To accommodate the stringent requirements of their prospective applications, which are data-driven and defined by extremely low-latency, ultra-reliable, fast and seamless wireless connectivity, research initiatives are currently focusing on a progressive roadmap towards the sixth generation (6G) networks. In this article, we shed light on some of the major enabling technologies for 6G, which are expected to revolutionize the fundamental architectures of cellular networks and provide multiple homogeneous artificial intelligence-empowered services, including distributed communications, control, computing, sensing, and energy, from its core to its end nodes. In particular, this paper aims to answer several 6G framework-related questions: What are the driving forces for the development of 6G? How will the enabling technologies of 6G differ from those in 5G? What kind of applications and interactions will they support that would not be supported by 5G? We address these questions by presenting a profound study of the 6G vision and outlining five of its disruptive technologies, i.e., terahertz communications, programmable metasurfaces, drone-based communications, backscatter communications, and the tactile internet, as well as their potential applications. Then, by leveraging the state-of-the-art literature surveyed for each technology, we discuss their requirements, key challenges, and open research problems.

    A prospective look: key enabling technologies, applications and open research topics in 6G networks

    The fifth generation (5G) mobile networks are envisaged to enable a plethora of breakthrough advancements in wireless technologies, providing support for a diverse set of services over a single platform. While the deployment of 5G systems is scaling up globally, it is time to look ahead to beyond-5G systems. This is mainly driven by emerging societal trends calling for fully automated systems and intelligent services supported by extended reality and haptic communications. To accommodate the stringent requirements of their prospective applications, which are data-driven and defined by extremely low-latency, ultra-reliable, fast and seamless wireless connectivity, research initiatives are currently focusing on a progressive roadmap towards the sixth generation (6G) networks, which are expected to bring transformative changes to this premise. In this article, we shed light on some of the major enabling technologies for 6G, which are expected to revolutionize the fundamental architectures of cellular networks and provide multiple homogeneous artificial intelligence-empowered services, including distributed communications, control, computing, sensing, and energy, from its core to its end nodes. In particular, the present paper aims to answer several 6G framework-related questions: What are the driving forces for the development of 6G? How will the enabling technologies of 6G differ from those in 5G? What kind of applications and interactions will they support that would not be supported by 5G? We address these questions by presenting a comprehensive study of the 6G vision and outlining seven of its disruptive technologies, i.e., mmWave communications, terahertz communications, optical wireless communications, programmable metasurfaces, drone-based communications, backscatter communications, and the tactile internet, as well as their potential applications. Then, by leveraging the state-of-the-art literature surveyed for each technology, we discuss the associated requirements, key challenges, and open research problems. These discussions are thereafter used to open up the horizon for future research directions.