
    Latency Minimization in Wireless IoT Using Prioritized Channel Access and Data Aggregation

    Future Internet of Things (IoT) networks are expected to support a massive number of heterogeneous devices/sensors in diverse applications ranging from eHealthcare to industrial control systems. In highly dense deployment scenarios such as industrial IoT systems, providing reliable, low-latency communication links becomes challenging due to the overall system delay, including data acquisition and processing latencies at the edge side of IoT networks. In this regard, this paper proposes a priority-based channel access and data aggregation scheme at the Cluster Head (CH) to reduce channel access and queuing delays in a clustered industrial IoT network. First, a prioritized channel access mechanism is developed by assigning different Medium Access Control (MAC) layer attributes to the packets coming from two types of IoT nodes, namely, high-priority and low-priority nodes, based on application-specific information provided by the cloud center. Subsequently, a preemptive M/G/1 queuing model is employed with separate low-priority and high-priority queues before aggregated data are sent to the Cloud. Our results show that the proposed priority-based method significantly improves system latency and reliability compared to the non-prioritized scheme.
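
    A minimal sketch of the preemptive priority queuing idea described above, using the standard preemptive-resume M/G/1 formulas for two classes; the arrival rates and service-time moments below are illustrative placeholders, not values from the paper.

```python
# Sketch: mean response times in a two-class preemptive-resume M/G/1 priority
# queue, one way to model separate high-/low-priority queues at a cluster head.
def mg1_preemptive_response_times(lam, es, es2):
    """lam[i]: arrival rate, es[i]: mean service time, es2[i]: second moment
    of the service time for class i (index 0 = highest priority)."""
    rho = [l * s for l, s in zip(lam, es)]
    times = []
    for k in range(len(lam)):
        rho_hi = sum(rho[:k])            # load from strictly higher-priority classes
        rho_all = sum(rho[:k + 1])       # load from class k and all higher classes
        resid = sum(l * m2 / 2.0 for l, m2 in zip(lam[:k + 1], es2[:k + 1]))
        times.append(es[k] / (1 - rho_hi)
                     + resid / ((1 - rho_hi) * (1 - rho_all)))
    return times

# Illustrative numbers: high-priority control packets vs. low-priority sensor data
# (exponential service assumed, so E[S^2] = 2 * E[S]^2).
lam = [0.2, 0.5]                         # packets per millisecond
es = [0.5, 0.8]                          # mean service times in milliseconds
es2 = [2 * s * s for s in es]
t_high, t_low = mg1_preemptive_response_times(lam, es, es2)
print(f"high-priority delay ~{t_high:.2f} ms, low-priority delay ~{t_low:.2f} ms")
```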

    Review of Energy Efficient Techniques of IoT

    The Internet of Things (IoT) is a network in which information is sensed by sensor devices and then forwarded to the sink. Even though such systems are deployed in several applications, they face certain issues due to their dynamic nature. The IoT is derived from wireless sensor networks. The sensor nodes deployed to sense environmental conditions are very small in size and are often placed in remote locations, which makes energy consumption the major issue of the IoT. This research work relates to reducing the energy consumption of the network so that its lifetime can be improved. In the existing system, the approach of multilevel clustering is used for data aggregation towards the base station: the whole network is divided into clusters and a cluster head is selected in each cluster. The energy-efficient techniques of the IoT are reviewed and analyzed in terms of certain parameters.
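
    For context, the multilevel-clustering approach mentioned above typically builds on probabilistic cluster-head election. The sketch below shows a generic LEACH-style election round; it illustrates that family of techniques rather than any specific scheme reviewed in the paper, and all parameters are illustrative.

```python
import random

# Generic LEACH-style cluster-head election (illustrative parameters only).
P = 0.1                      # desired fraction of cluster heads per round
EPOCH = int(1 / P)           # rounds after which all nodes become eligible again
nodes = [{"id": i, "was_ch": False} for i in range(50)]

for r in range(20):
    if r % EPOCH == 0:                        # new epoch: reset eligibility
        for n in nodes:
            n["was_ch"] = False
    threshold = P / (1 - P * (r % EPOCH))     # election threshold for this round
    heads = [n["id"] for n in nodes
             if not n["was_ch"] and random.random() < threshold]
    for n in nodes:
        if n["id"] in heads:
            n["was_ch"] = True
    # Non-head nodes would join the nearest head, which aggregates their data
    # before forwarding it towards the base station.
    print(f"round {r}: cluster heads {heads}")
```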

    Context-Aware Wireless Connectivity and Processing Unit Optimization for IoT Networks

    A novel approach is presented in this work for context-aware connectivity and processing optimization of Internet of Things (IoT) networks. Different from the state-of-the-art approaches, the proposed approach simultaneously selects the best connectivity and processing unit (e.g., device, fog, and cloud) along with the percentage of data to be offloaded by jointly optimizing energy consumption, response time, security, and monetary cost. The proposed scheme employs a reinforcement learning algorithm and achieves significant gains compared to deterministic solutions. In particular, the requirements of IoT devices in terms of response time and security are taken as inputs along with the remaining battery level of the devices, and the developed algorithm returns an optimized policy. The results obtained show that only our method is able to meet the holistic multi-objective optimization criteria, whereas the benchmark approaches may achieve better results on a particular metric at the cost of failing to reach the other targets. Thus, the proposed approach is a device-centric and context-aware solution that accounts for monetary and battery constraints.
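
    The paper's exact algorithm is not reproduced here; the following is a generic tabular Q-learning sketch of how a device-centric agent could pick a (connectivity, processing unit, offload fraction) action under a multi-objective reward. The state encoding, action set and reward weights are hypothetical.

```python
import random
from collections import defaultdict

# Hypothetical action space: (connectivity, processing unit, offload fraction).
ACTIONS = [(link, unit, frac)
           for link in ("wifi", "cellular")
           for unit in ("device", "fog", "cloud")
           for frac in (0.0, 0.5, 1.0)]
Q = defaultdict(float)                     # tabular Q-values keyed by (state, action)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2

def multi_objective_reward(energy, delay, cost, security_penalty):
    # Hypothetical weighted score; all four objectives are to be minimised.
    return -(0.4 * energy + 0.3 * delay + 0.2 * cost + 0.1 * security_penalty)

def choose_action(state):
    if random.random() < EPS:                                # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])         # exploit

def update(state, action, reward, next_state):
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One illustrative step: state = (battery band, latency requirement class).
s = ("battery_low", "latency_strict")
a = choose_action(s)
r = multi_objective_reward(energy=0.7, delay=0.2, cost=0.4, security_penalty=0.0)
update(s, a, r, s)
```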

    A critical analysis of research potential, challenges and future directives in industrial wireless sensor networks

    In recent years, Industrial Wireless Sensor Networks (IWSNs) have emerged as an important research theme with applications spanning a wide range of industries including automation, monitoring, process control, feedback systems and automotive. The wide scope of IWSN applications, ranging from small production units and large oil and gas industries to nuclear fission control, enables fast-paced research in this field. Though IWSNs offer the advantages of low cost, flexibility, scalability, self-healing, easy deployment and reformation, they pose certain limitations on the available potential and introduce challenges on multiple fronts due to their susceptibility to highly complex and uncertain industrial environments. In this paper, a detailed discussion of design objectives, challenges and solutions for IWSNs is presented. A careful evaluation of industrial systems, deadlines and possible hazards in industrial environments is provided. The paper also presents a thorough review of the existing standards and industrial protocols, gives a critical evaluation of the potential of these standards and protocols, and discusses in detail the available hardware platforms, specific industrial energy-harvesting techniques and their capabilities. The paper lists the main service providers of IWSN solutions and gives insight into future trends and research gaps in the field of IWSNs.

    Wireless for Machine Learning

    As data generation increasingly takes place on devices without a wired connection, Machine Learning over wireless networks becomes critical. Many studies have shown that traditional wireless protocols are highly inefficient or unsustainable for supporting Distributed Machine Learning, creating the need for new wireless communication methods. In this survey, we give an exhaustive review of the state-of-the-art wireless methods that are specifically designed to support Machine Learning services, namely, over-the-air computation and radio resource allocation optimized for Machine Learning. In the over-the-air approach, multiple devices communicate simultaneously over the same time slot and frequency band to exploit the superposition property of wireless channels for gradient averaging over the air. In radio resource allocation optimized for Machine Learning, Active Learning metrics allow for data evaluation to greatly optimize the assignment of radio resources. This paper gives a comprehensive introduction to these methods, reviews the most important works, and highlights crucial open problems.
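
    A toy numerical sketch of the over-the-air computation idea: if devices transmit their analog-scaled gradients simultaneously, the receiver observes roughly the sum of the gradients plus noise and can recover their average by scaling. Power control and channel inversion are omitted for brevity, and the values are illustrative.

```python
import numpy as np

# Toy over-the-air gradient averaging: simultaneous analog transmissions
# superpose on the channel, so the receiver sees (roughly) the sum of the
# local gradients plus noise and recovers the average by scaling.
rng = np.random.default_rng(0)
num_devices, dim, noise_std = 10, 5, 0.01

local_grads = rng.normal(size=(num_devices, dim))   # one local gradient per device
received = local_grads.sum(axis=0) + rng.normal(scale=noise_std, size=dim)
ota_average = received / num_devices                 # over-the-air estimate of the mean

true_average = local_grads.mean(axis=0)
print("estimation error:", np.linalg.norm(ota_average - true_average))
```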

    Resource Allocation in Multi-access Edge Computing (MEC) Systems: Optimization and Machine Learning Algorithms

    With the rapid proliferation of diverse wireless applications, the next generation of wireless networks is required to meet diverse quality of service (QoS) requirements across various applications. The existing one-size-fits-all resource allocation algorithms will not be able to sustain the sheer need of supporting diverse QoS requirements. In this context, radio access network (RAN) slicing has recently emerged as a promising approach to virtualize network resources and create multiple logical network slices on a common physical infrastructure. Each slice can then be tailored to a specific application with a distinct QoS requirement. This would considerably reduce the cost for infrastructure providers. However, efficient virtualized network slicing is only feasible if network resources are efficiently monitored and allocated. In the first part of this thesis, leveraging tools from fractional programming and the augmented Lagrangian method, I propose an efficient algorithm to jointly optimize users' offloading decisions and communication and computing resource allocation in a sliced multi-cell multi-access edge computing (MEC) network in the presence of interference. The objective is to minimize the weighted sum of the delay deviation observed at each slice from its corresponding delay requirement. The considered problem enables slice prioritization, cooperation among MEC servers, and partial offloading to multiple MEC servers. On another note, due to their high computation and time complexity, traditional centralized optimization solutions are often rendered impractical and non-scalable for real-time resource allocation purposes. Thus, the need for machine learning algorithms has become more vital than ever before. To address this issue, in the second part of this thesis, exploiting the power of federated learning (FDL) and optimization theory, I develop a federated deep reinforcement learning framework for joint offloading decisions and resource allocation in order to minimize the joint delay and energy consumption in a MEC-enabled Internet of Things (IoT) network with QoS constraints. The proposed algorithm is applied to an IoT network, since IoT devices suffer significantly from limited computation and battery capacity. The proposed algorithm is distributed in nature, exploits cooperation among devices, preserves privacy, and is executable on resource-limited cellular or IoT devices.
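
    As a rough illustration of the federated part of the second contribution, the sketch below shows a generic federated-averaging step in which local model (or policy) parameters are combined, weighted by local sample counts; it is a standard FedAvg-style aggregation, not the thesis's exact federated deep reinforcement learning algorithm, and the numbers are illustrative.

```python
import numpy as np

# Generic federated-averaging step: local parameters are combined, weighted by
# the amount of local data/experience each device contributed.
def fed_avg(local_params, sample_counts):
    total = sum(sample_counts)
    return sum((n / total) * p for p, n in zip(local_params, sample_counts))

# Three IoT devices with small local "policy" parameter vectors (illustrative).
local_params = [np.array([0.1, 0.4]), np.array([0.3, 0.2]), np.array([0.2, 0.5])]
sample_counts = [100, 50, 150]
global_params = fed_avg(local_params, sample_counts)
print(global_params)   # the aggregate broadcast back to the devices for the next round
```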

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely, LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in the mMTC scenario, along with recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
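
    A simplified sketch of the low-complexity Q-learning idea discussed above, applied to random-access slot selection: each device keeps one Q-value per slot and is rewarded for collision-free transmissions. The frame size, device count and reward values are illustrative, not taken from the paper.

```python
import random

# Simplified stateless Q-learning for random-access slot selection: each device
# keeps one Q-value per slot and is rewarded for collision-free transmissions.
NUM_DEVICES, NUM_SLOTS, FRAMES, ALPHA = 8, 10, 200, 0.1
Q = [[0.0] * NUM_SLOTS for _ in range(NUM_DEVICES)]

for _ in range(FRAMES):
    # Each device greedily picks its best slot (ties broken at random).
    choices = [max(range(NUM_SLOTS), key=lambda s: (Q[d][s], random.random()))
               for d in range(NUM_DEVICES)]
    for d, slot in enumerate(choices):
        collided = choices.count(slot) > 1
        reward = -1.0 if collided else 1.0
        Q[d][slot] += ALPHA * (reward - Q[d][slot])

print("slot choices after learning:", choices)   # tends towards collision-free slots
```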

    Enhanced priority-based adaptive energy-aware mechanisms for wireless sensor networks

    Wireless Sensor Networks (WSNs) continue to find use in our lives. However, research has shown that they have barely attained optimal performance, particularly in the aspects of data heterogeneity, data prioritization, data routing, and energy efficiency, all of which affect their operational lifetime. The IEEE 802.15.4 protocol standard, which manages data forwarding across the Data Link Layer (DLL), does not address the impact of heterogeneous data and node Battery Level (BL), an indicator of node battery life. Likewise, mechanisms proposed in the literature, namely TCP-CSMA/CA, QWL-RPL and SSRA, have not provided an optimal solution, as they incur excessive computational overhead, which results in a shortened operational lifetime. These problems are inherited by the Network Layer (NL), where data routing is implemented. To mitigate these challenges, this research presents Enhanced Priority-based Adaptive Energy-Aware Mechanisms (EPAEAM) for Wireless Sensor Networks. The first mechanism is the Optimized Backoff Mechanism for Prioritized Data (OBMPD) in Wireless Sensor Networks, which proposes the Class of Service Traffic Priority-based Medium Access Control (CSTP-MAC), implemented on the DLL. In this mechanism, unique backoff period expressions compute backoff periods according to the class and priority of the heterogeneous data. This approach improves network performance and enhances network lifetime. The second mechanism is the Shortest Path Priority-Based Objective Function (SPPB-OF) for Wireless Sensor Networks, implemented across the NL. SPPB-OF implements a unique shortest-path computation algorithm to generate an energy-efficient shortest path between the source and destination nodes. The third mechanism is the Cross-Layer Energy-Efficient Priority-based Data Path (CL-EEPDP) for Wireless Sensor Networks, implemented across the DLL and NL with consideration for node battery level. A unique mathematical expression, the Node Battery-Level Estimator (NBLE), is used to estimate the BL of neighbouring nodes. The knowledge of the BL, together with the priority of the data, is used to decide on an energy-efficient next-hop node. Benchmarking the EPAEAM against the related mechanisms (TCP-CSMA/CA, QWL-RPL and SSRA), results show that the EPAEAM achieved improved network performance with a packet delivery ratio (PDR) of 95.4% and a power saving of 90.4%. In conclusion, the EPAEAM mechanism proved to be a viable energy-efficient solution for multi-hop heterogeneous-data WSN deployments with support for an extended operational lifetime. The limitations of these mechanisms are that their application is restricted to the data-link and network layers; moreover, only two classes of data are considered, namely High Priority Data (HPD) and Low Priority Data (LPD).
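
    The thesis's exact CSTP-MAC backoff expressions are not reproduced here; the sketch below only illustrates the general idea of class-dependent backoff windows in a CSMA/CA-style MAC, where high-priority data draws from a smaller contention window than low-priority data. The window sizes and the growth rule are hypothetical.

```python
import random

# Class-dependent backoff: high-priority data (HPD) draws from a smaller
# contention window than low-priority data (LPD). Window sizes and the growth
# rule are hypothetical, not the exact CSTP-MAC expressions.
UNIT_BACKOFF_US = 320                        # IEEE 802.15.4 unit backoff period

def backoff_units(priority, attempt):
    min_be, max_be = (1, 3) if priority == "HPD" else (3, 5)
    be = min(min_be + attempt, max_be)       # window grows with failed attempts
    return random.randint(0, 2 ** be - 1)    # backoff in unit backoff periods

for prio in ("HPD", "LPD"):
    waits = [backoff_units(prio, 0) * UNIT_BACKOFF_US for _ in range(5)]
    print(prio, "sample backoff times (us):", waits)
```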

    Towards Tactile Internet in Beyond 5G Era: Recent Advances, Current Issues and Future Directions

    Tactile Internet (TI) is envisioned to create a paradigm shift from content-oriented communications to steer/control-based communications by enabling real-time transmission of haptic information (i.e., touch, actuation, motion, vibration, surface texture) over the Internet in addition to conventional audiovisual and data traffic. This emerging TI technology, also considered the next evolution phase of the Internet of Things (IoT), is expected to create numerous opportunities for technology markets in a wide variety of applications ranging from teleoperation systems and Augmented/Virtual Reality (AR/VR) to automotive safety and eHealthcare, towards addressing the complex problems of human society. However, the realization of TI over wireless media in the upcoming Fifth Generation (5G) and beyond networks creates various non-conventional communication challenges and stringent requirements in terms of ultra-low latency, ultra-high reliability, high data-rate connectivity, resource allocation, multiple access and the quality-latency-rate tradeoff. To this end, this paper aims to provide a holistic view of wireless TI along with a thorough review of the existing state of the art, to identify and analyze the involved technical issues, to highlight potential solutions and to propose future research directions. First, starting with the vision of TI, recent advances and a review of related survey/overview articles, we present a generalized framework for wireless TI in the Beyond 5G era, including a TI architecture, the main technical requirements, the key application areas and potential enabling technologies. Subsequently, we provide a comprehensive review of the existing TI works by broadly categorizing them into three main paradigms, namely, haptic communications, wireless AR/VR, and autonomous, intelligent and cooperative mobility systems. Next, potential enabling technologies across the physical/Medium Access Control (MAC) and network layers are identified and discussed in detail. Also, security and privacy issues of TI applications are discussed along with some promising enablers. Finally, we present some open research challenges and recommend promising future research directions.