50 research outputs found

    Towards optimal positioning and energy-efficient UAV path scheduling in IoT applications

    No full text
    Unmanned Aerial Vehicle (UAV)-based communication has emerged as a feasible solution for remote applications such as disaster management and search and rescue, owing to its mobility and cost efficiency. Prior research in this domain focused on the positioning and path planning of UAVs; however, these approaches face several limitations under adverse weather conditions. In this paper, the impact of weather-based positioning and path planning of UAVs (IWPOP-UAV) is investigated to achieve increased QoS, reliability, and energy efficiency in UAV communications. Initially, weather conditions in emergency situations are predicted using a Cerebral Long Short-Term Memory (C-LSTM) network, which exhibits negligible training loss and increased accuracy. Cell-based partitioning of the emergency area is then carried out to determine the target UEs. The number and position of UAVs in each cell are decided by the A3C algorithm based on weather conditions and other significant factors, thereby achieving an increased coverage ratio with minimal UAV transmit power. Path planning of the UAV for effective data collection is formulated as a multi-objective optimization problem and solved using the Mayfly Optimization Algorithm (MOA). As a result, the proposed approach achieves increased QoS, reliability, and energy efficiency in UAV-based communication. The proposed IWPOP-UAV approach is implemented in NS 3.26 and evaluated in terms of performance metrics such as coverage ratio, cell coverage, delay, path gain, number of collected packets, UAV transmit power, and energy consumption. The obtained results demonstrate the efficacy of the proposed approach.
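    The MOA-based path planning above is a multi-objective search over candidate flight paths. The full algorithm is beyond a short sketch, but the kind of weighted-sum path cost it minimizes can be illustrated with exhaustive search over visiting orders. All cell coordinates, weights, and the cost model below are hypothetical illustrations, not values from the paper.

```python
import itertools
import math

# Toy multi-objective path cost for a single UAV visiting a set of cells.
# Exhaustive search over visiting orders stands in for the paper's MOA.
CELLS = {"A": (0.0, 0.0), "B": (3.0, 4.0), "C": (6.0, 0.0), "D": (3.0, -4.0)}
START = (0.0, 0.0)

def path_length(order):
    """Total flight distance: start -> cells in the given order -> start."""
    pts = [START] + [CELLS[c] for c in order] + [START]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

def path_cost(order, w_energy=1.0, w_delay=0.5):
    """Weighted sum of energy (proportional to distance) and delay (hop count)."""
    return w_energy * path_length(order) + w_delay * len(order)

best = min(itertools.permutations(CELLS), key=path_cost)
print(best, round(path_cost(best), 2))
```

    A real deployment would replace the brute-force `min` with a population-based optimizer such as MOA, since the number of visiting orders grows factorially with the number of cells.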

    Toward Smart Traffic Management With 3D Placement Optimization in UAV-Assisted NOMA IIoT Networks

    No full text
    Next-generation networks will involve a huge number of industrial Internet of Things (IIoT) sensors that require reliable, low-latency connectivity to manage data transmission and processing. The design of these networks entails many challenges. This article describes the 3D placement of multiple unmanned aerial vehicles (UAVs) in an IIoT network that supports non-orthogonal multiple access (NOMA), with the UAVs acting as decode-and-forward (DF) relays. The 3D UAV placement problem is formulated and shown to be highly non-convex in the coordinates; therefore, we employ an improved adaptive whale optimization algorithm (IAWOA) to handle it. Even with its improved performance, IAWOA is not suitable for real-time application. Hence, we propose a path aggregation network (PANet) to handle the 3D UAV placement. The simulation results show that PANet is more suitable for online learning.
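    The IAWOA mentioned above builds on the standard whale optimization algorithm (WOA). A minimal WOA sketch for a toy 3D placement objective might look like the following; the user positions are hypothetical and the sum-of-squared-distances cost is only a stand-in for the paper's NOMA objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy placement cost: sum of squared 3D distances from the UAV relay to
# hypothetical ground-user positions (stand-in for a SINR-based objective).
USERS = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [20.0, 30.0, 0.0]])

def cost(x):
    return float(np.sum((USERS - x) ** 2))

def woa(n_whales=20, iters=200, dim=3, lo=-50.0, hi=50.0, b=1.0):
    X = rng.uniform(lo, hi, (n_whales, dim))
    best = min(X, key=cost).copy()
    for t in range(iters):
        a = 2 - 2 * t / iters                      # linearly decreasing 2 -> 0
        for i in range(n_whales):
            r = rng.random(dim)
            A, C = 2 * a * r - a, 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):          # exploit: encircle best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                              # explore: move toward a random whale
                    ref = X[rng.integers(n_whales)]
                    X[i] = ref - A * np.abs(C * ref - X[i])
            else:                                  # spiral (bubble-net) update
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
            if cost(X[i]) < cost(best):
                best = X[i].copy()
    return best

best_pos = woa()
print(np.round(best_pos, 1))  # the true minimizer is the users' centroid [20, 10, 0]
```

    The "improved adaptive" variant in the paper presumably tunes parameters such as `a` and the spiral constant `b` adaptively; this sketch keeps the textbook schedule.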

    Optimizing Task Offloading Energy in Multi-User Multi-UAV-Enabled Mobile Edge-Cloud Computing Systems

    No full text
    With the emergence of various new Internet of Things (IoT) devices and the rapid increase in the number of users, services and complex applications are growing rapidly. These services and applications are resource-intensive and data-hungry, requiring satisfactory quality-of-service (QoS) and network coverage guarantees even in sparsely populated areas, where the limited battery life and computing resources of IoT devices inevitably become insufficient. Unmanned aerial vehicle (UAV)-enabled mobile edge computing (MEC) is one of the most promising solutions: it ensures stable, expanded network coverage for these applications and provides them with computational capabilities. In this paper, computation offloading and resource allocation are jointly considered for multi-user multi-UAV-enabled mobile edge-cloud computing systems. First, we propose an efficient resource allocation and computation offloading model for such a system. The proposed system is scalable and can support increases in network traffic without performance degradation. In addition, the network deploys multi-level MEC technology to provide computational capabilities at the edge of the radio access network (RAN), while the core network is based on software-defined networking (SDN) technology to manage network traffic. Experimental results demonstrate that the proposed model can dramatically boost system performance in terms of time and energy.
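    The offload-or-compute-locally trade-off at the heart of such systems can be illustrated with textbook latency and energy models. All parameter values below (CPU frequencies, link rate, transmit power, the switched-capacitance coefficient) are illustrative assumptions, not taken from the paper.

```python
# Back-of-the-envelope offloading decision for one task: run it on the IoT
# device, or ship it to a UAV-mounted edge server over the radio link.
KAPPA = 1e-27   # effective switched-capacitance coefficient (J per cycle per Hz^2)

def local(cycles, f_dev):
    """(latency in s, device energy in J) for on-device execution."""
    return cycles / f_dev, KAPPA * f_dev ** 2 * cycles

def offload(bits, cycles, rate, p_tx, f_edge):
    """(latency in s, device energy in J) when offloading over a link of `rate` b/s."""
    t_up = bits / rate
    return t_up + cycles / f_edge, p_tx * t_up   # device only pays for the uplink

task_bits, task_cycles = 2e6, 1e9                # 2 Mbit input, 1 Gcycle task
t_l, e_l = local(task_cycles, f_dev=5e8)         # 0.5 GHz device CPU
t_o, e_o = offload(task_bits, task_cycles, rate=1e7, p_tx=0.1, f_edge=5e9)
print(f"local: {t_l:.2f}s / {e_l:.3f}J   offload: {t_o:.2f}s / {e_o:.3f}J")
```

    With these numbers offloading wins on both latency and device energy; a slower link or a larger input payload shifts the balance back toward local execution, which is exactly the decision the joint optimization has to make per user.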

    Performance Estimation in V2X Networks Using Deep Learning-Based M-Estimator Loss Functions in the Presence of Outliers

    No full text
    Recently, 5G networks have emerged as a technology that can drive the advancement of telecommunication networks and transportation systems. Furthermore, 5G networks provide better performance while reducing network traffic and complexity compared to current networks. Machine-learning (ML) techniques will help symmetric IoT applications become a significant new data source in the future. Symmetry is a widely studied pattern in various research areas, especially in wireless network traffic, and the study of symmetric and asymmetric faults and outliers (anomalies) in network traffic is an important topic. Nowadays, deep learning (DL) is an advanced approach to challenging wireless-network tasks such as network management and optimization, anomaly detection, predictive analysis, and lifetime value prediction. However, its performance depends on the quality of the training samples: DL is designed to work with large datasets and uses complex algorithms to train the model, and the occurrence of outliers in the raw data reduces the reliability of the trained models. In this paper, the performance of Vehicle-to-Everything (V2X) traffic is estimated using a DL algorithm. A set of robust statistical estimators, called M-estimators, is proposed as robust loss functions in place of the traditional MSE loss, to improve the training process and make DL robust in the presence of outliers. We demonstrate their robustness on V2X traffic datasets contaminated with outliers.
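    The abstract does not list the specific M-estimators used, but the Huber loss is a classic member of the family and shows why such losses resist outliers where MSE does not: it is quadratic near zero (like MSE) but only linear for large residuals, so a few extreme samples cannot dominate the gradient.

```python
import numpy as np

def huber(residuals, delta=1.0):
    """Huber M-estimator loss; delta=1.0 is a common default, not a paper value."""
    r = np.abs(residuals)
    quad = 0.5 * r ** 2                 # MSE-like region, |r| <= delta
    lin = delta * (r - 0.5 * delta)     # linear region, |r| > delta
    return np.where(r <= delta, quad, lin)

res = np.array([0.1, -0.2, 0.3, 8.0])  # last residual is an outlier
print(huber(res).mean(), (res ** 2).mean())  # Huber mean is far below the MSE
```

    Swapping such a loss into a DL training loop only changes the loss function; the network, optimizer, and data pipeline stay the same.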

    Ultra-Reliable Low-Latency Communications: Unmanned Aerial Vehicles Assisted Systems

    No full text
    Ultra-reliable low-latency communication (uRLLC) denotes a group of fifth- and sixth-generation (5G/6G) cellular applications with special requirements regarding latency, reliability, and availability. Most announced 5G/6G applications are uRLLC applications that require an end-to-end latency of milliseconds and ultra-high reliability of the communicated data. Such systems face many challenges, since traditional networks cannot meet these requirements; thus, novel network structures and technologies have been introduced to enable them. Since uRLLC is a promising paradigm that covers many applications, this work reviews the current state of the art of uRLLC, including the main applications, specifications, and requirements of ultra-reliable low-latency (uRLL) applications. The design challenges of uRLLC systems are discussed, and promising solutions are introduced. Virtual and augmented reality (VR/AR) are considered the main use case of uRLLC, and current proposals for VR and AR are discussed. Moreover, unmanned aerial vehicles (UAVs) are introduced as enablers of uRLLC. Current research directions and existing proposals are discussed.

    Advanced optimization method for improving the urban traffic management

    No full text
    The Smart City, as a concept for future cities, anticipates smart and efficient traffic management. Current traffic management does not offer a sufficient solution, and existing technology alone cannot adequately improve the traffic situation on city roads. This paper deals with an advanced optimization method, genetic algorithms, for use in urban traffic management. Both a genetic-algorithm implementation and a classical static solution were developed. We aim to demonstrate the advantages of modern optimization methods, which could bring more fluent traffic to cities and address current challenges such as high emissions, long delays, and a higher probability of accidents. The paper provides comparative measurements of the static and dynamic solutions in discrete time, a discussion of possible implementation in practice, and an evaluation of the advantages and disadvantages of both methods.
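    A genetic algorithm for signal timing can be sketched in a few lines. The setup below is a toy: a two-phase signal with a fixed 60 s cycle, hypothetical arrival rates, a crude leftover-queue fitness, and arbitrary GA parameters; none of it comes from the paper.

```python
import random

random.seed(1)

# Toy GA tuning the green-time split of a 60 s two-phase traffic signal.
CYCLE, SAT = 60.0, 0.5        # cycle length (s), saturation flow (veh/s of green)
ARRIVALS = (12.0, 6.0)        # vehicles arriving per cycle on phases 1 and 2

def leftover(green1):
    """Fitness: vehicles left queued after one cycle, summed over both phases."""
    greens = (green1, CYCLE - green1)
    return sum(max(0.0, a - SAT * g) for a, g in zip(ARRIVALS, greens))

def ga(pop=30, gens=60, mut=2.0):
    P = [random.uniform(5, 55) for _ in range(pop)]          # initial green1 values
    for _ in range(gens):
        P.sort(key=leftover)                                 # rank by fitness
        elite = P[: pop // 3]                                # truncation selection
        P = elite + [                                        # mutated offspring
            min(55.0, max(5.0, random.choice(elite) + random.gauss(0, mut)))
            for _ in range(pop - len(elite))
        ]
    return min(P, key=leftover)

g1 = ga()
print(round(g1, 1), round(leftover(g1), 2))
```

    A dynamic controller would rerun this optimization each interval with fresh arrival estimates, which is the advantage over a fixed static split that the paper's measurements compare.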

    Remote networking technology for IoT: Cloud-based access for AllJoyn-enabled devices

    No full text
    The Internet of Things (IoT) represents a vision of future communication between users, systems, and everyday objects with sensing and actuating capabilities, with the goal of bringing unprecedented convenience and economic benefits. Today, a wide variety of IoT solutions can be seen across all industry fields. However, most of these systems are based on proprietary software implementations that are unable, in most cases, to share collected data with others. To offer a common communication platform for IoT, the AllSeen Alliance introduced the AllJoyn framework: an interoperable platform that lets devices (sensors, actuators, etc.) and applications communicate among themselves regardless of brand, transport technology, or operating system. In this paper, we discuss an application for the remote management of lighting systems built as an extension of the AllJoyn framework; the developed application is independent of the communication technology (e.g., ZigBee or WiFi). Besides this communication independence, the presented framework can run on both major SoC architectures, ARM and MIPS. To this end, we believe that our application (available as open source on GitHub) can serve as a building block in future IoT / smart-home implementations.

    Deep Learning for Predicting Traffic in V2X Networks

    No full text
    Artificial intelligence (AI) is capable of addressing the complexities and difficulties of fifth-generation (5G) mobile networks and beyond. In this paradigm, it is important to predict network metrics in order to meet future network requirements. Vehicle-to-everything (V2X) networks are a promising wireless communication method, yet traffic-information exchange in an intelligent transportation system (ITS) still faces challenges, such as V2X communication congestion when many vehicles suddenly appear in an area. In this paper, a deep learning (DL) algorithm based on a unidirectional long short-term memory (LSTM) model is proposed to predict traffic in V2X networks. The prediction problem is studied in different cases depending on the number of packets sent per second. The prediction accuracy is measured in terms of root-mean-square error (RMSE), mean absolute percentage error (MAPE), and processing time.
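    Training an LSTM is beyond a short sketch, but the sliding-window supervision such a model needs and the two reported error metrics (RMSE, MAPE) can be shown directly. The synthetic "packets per second" series and the persistence baseline below are illustrative stand-ins for real V2X traces and the paper's LSTM.

```python
import numpy as np

def windows(series, lookback):
    """Supervision pairs (X, y): `lookback` past samples -> next sample."""
    X = np.lib.stride_tricks.sliding_window_view(series[:-1], lookback)
    return X, series[lookback:]

def rmse(y, yhat):
    """Root-mean-square error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mape(y, yhat):
    """Mean absolute percentage error (assumes y is never zero)."""
    return float(np.mean(np.abs((y - yhat) / y)) * 100)

traffic = 100 + 10 * np.sin(np.arange(200) / 5)   # synthetic packets/s series
X, y = windows(traffic, lookback=10)
naive = X[:, -1]                                  # persistence baseline: repeat last value
print(X.shape, round(rmse(y, naive), 3), round(mape(y, naive), 3))
```

    An LSTM would consume the same (X, y) pairs, one window per time step, and should beat the persistence baseline on both metrics for any series with learnable temporal structure.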