
    A Survey of Physical Layer Security Techniques for 5G Wireless Networks and Challenges Ahead

    Physical layer security, which safeguards data confidentiality using information-theoretic approaches, has recently received significant research interest. The key idea is to exploit the intrinsic randomness of the transmission channel to guarantee security at the physical layer. The evolution towards 5G wireless communications poses new challenges for physical layer security research. This paper provides an up-to-date survey of physical layer security research on various promising 5G technologies, including physical layer security coding, massive multiple-input multiple-output, millimeter wave communications, heterogeneous networks, non-orthogonal multiple access, and full duplex technology. Technical challenges that remain unresolved at the time of writing are summarized, and future trends of physical layer security in 5G and beyond are discussed.
    Comment: To appear in IEEE Journal on Selected Areas in Communications

    Optimizing Resource Allocation in EH-Enabled Internet of Things

    The Internet of Things (IoT) aims to bridge everyday physical objects via the Internet. Traditional energy-constrained wireless devices are powered by fixed energy sources such as batteries, which may require frequent replacement or recharging. Wireless Energy Harvesting (EH) is a promising solution that can potentially eliminate the need to recharge or replace batteries; unlike other green energy sources, wireless EH does not depend on nature and is thus a reliable source of energy for charging devices. Meanwhile, the rapid growth of IoT devices and wireless applications is likely to demand more operating frequency bands. Although the frequency spectrum is scarce, a considerable amount of it is greatly underutilized owing to inefficient conventional regulatory policies. Cognitive Radio (CR) can mitigate the spectrum scarcity problem of IoT applications by leveraging these spectrum holes, so transforming the IoT network into a cognitive-based IoT network is essential for utilizing the available spectrum opportunistically.

    To address these two issues, a novel model is proposed that leverages wireless EH and CR for IoT. In particular, the sum rate of users is maximized for a CR-based IoT network enabled with wireless EH. Users operate in a time-switching fashion, and each time slot is partitioned into three non-overlapping parts devoted to EH, spectrum sensing, and data transmission. There is a trade-off among the lengths of these three operations, so the time-slot structure must be optimized. The general problem of joint resource allocation and EH optimization is formulated as a mixed-integer nonlinear program, which is NP-hard and intractable. Therefore, a sub-channel allocation scheme is first proposed to approximately satisfy users' rate requirements and remove the integer constraints; the general optimization problem then reduces to a convex optimization task.

    Another optimization framework is designed to capture a fundamental trade-off between energy efficiency (EE) and spectral efficiency for an EH-enabled IoT network. In particular, an EE maximization problem is formulated that takes into account user buffer occupancy, data rate fairness, energy causality constraints, and interference constraints. A low-complexity heuristic algorithm is then proposed to solve the resource allocation and EE optimization problem; it is shown to achieve a near-optimal solution with polynomial complexity.

    To support Machine-Type Communications (MTC) in next-generation mobile networks, NarrowBand-IoT (NB-IoT) has emerged as a promising solution that provides extended coverage and low energy consumption for low-cost MTC devices. However, the existing orthogonal multiple access scheme in NB-IoT cannot provide connectivity for a massive number of MTC devices. In parallel with the development of NB-IoT, Non-Orthogonal Multiple Access (NOMA), introduced for fifth-generation wireless networks, is expected to significantly improve network capacity by providing massive connectivity over shared spectral resources. To leverage NOMA in the context of NB-IoT, a power-domain NOMA scheme with user clustering is proposed for an NB-IoT system. In particular, MTC devices are assigned to different ranks within the NOMA clusters, where they transmit over the same frequency resources. An optimization problem is then formulated to maximize the total throughput of the network by optimizing the resource allocation of MTC devices and NOMA clustering while satisfying transmission power and quality-of-service requirements, and an efficient heuristic algorithm is designed that solves it by jointly optimizing NOMA clustering and the resource allocation of MTC devices.
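    To make the time-switching trade-off concrete, the following is a minimal sketch, not the thesis's actual formulation: a unit slot is split between harvesting, a fixed sensing phase, and transmission, with a linear EH model setting the transmit power. All parameter names and numerical values here are illustrative assumptions.

```python
import numpy as np

# Three-way slot trade-off: a unit slot is split into harvesting (tau_eh),
# sensing (tau_ss), and transmission (tau_tx) phases. More harvesting time
# raises transmit power but shrinks the transmission window.
eta, p_rx = 0.6, 1.0   # assumed EH conversion efficiency and received RF power
g, n0 = 0.8, 0.1       # assumed channel gain and noise power
tau_ss = 0.1           # assumed fixed sensing time for reliable detection

best_rate, best_tau = 0.0, None
for tau_eh in np.linspace(0.01, 1.0 - tau_ss - 0.01, 200):
    tau_tx = 1.0 - tau_ss - tau_eh           # remainder of the slot
    p_tx = eta * p_rx * tau_eh / tau_tx      # harvested energy spread over tx phase
    rate = tau_tx * np.log2(1.0 + p_tx * g / n0)  # bits per slot
    if rate > best_rate:
        best_rate, best_tau = rate, tau_eh

print(f"best rate {best_rate:.3f} bits/slot at tau_eh = {best_tau:.3f}")
```

    The grid search makes the trade-off visible: the rate first grows with the harvesting time and then collapses as the transmission window vanishes, which is why the slot structure is an optimization variable in the abstract's formulation.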

    AI-Based Q-Learning Approach for Performance Optimization in MIMO-NOMA Wireless Communication Systems

    In this paper, we investigate the performance enhancement of Multiple-Input Multiple-Output Non-Orthogonal Multiple Access (MIMO-NOMA) wireless communication systems using an Artificial Intelligence (AI) based Q-Learning reinforcement learning approach. The primary challenge addressed is the optimization of power allocation in a MIMO-NOMA system, a complex task given the non-convex nature of the problem. Our proposed Q-Learning approach adaptively adjusts the power allocation strategy for proximal and distant users, optimizing the trade-off between conflicting metrics and significantly improving system performance. Compared to traditional power allocation strategies, our approach showed superior performance across three principal parameters: spectral efficiency, achievable sum rate, and energy efficiency. Specifically, our methodology achieved approximately a 140% increase in the achievable sum rate and about a 93% improvement in energy efficiency at a transmit power of 20 dB, while also enhancing spectral efficiency by approximately 88.6% at 30 dB transmit power. These results underscore the potential of reinforcement learning techniques, particularly Q-Learning, as practical solutions for complex optimization problems in wireless communication systems. Future research may investigate the inclusion of enhanced channel simulations and network limitations in the machine learning framework to assess the feasibility and resilience of such intelligent approaches.
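    The sketch below shows what tabular Q-Learning over a NOMA power split can look like in the simplest two-user downlink case, assuming Rayleigh fading, a sum-rate reward, and a toy quantized channel state; none of these modelling choices, parameter values, or function names come from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Actions: fraction of total power given to the far (weak) user; power-domain
# NOMA conventionally gives the weaker user the larger share.
actions = np.linspace(0.55, 0.95, 9)
n_states = 4                           # coarse channel-quality states (illustrative)
q = np.zeros((n_states, len(actions)))
alpha, gamma, eps = 0.1, 0.9, 0.1      # learning rate, discount, exploration

def sum_rate(split, g_near, g_far, p_tot=10.0, n0=1.0):
    """Two-user downlink NOMA sum rate with SIC at the near user."""
    p_far, p_near = split * p_tot, (1 - split) * p_tot
    r_far = np.log2(1 + p_far * g_far / (p_near * g_far + n0))  # far user sees interference
    r_near = np.log2(1 + p_near * g_near / n0)                  # near user after SIC
    return r_far + r_near

state = 0
for step in range(5000):
    a = rng.integers(len(actions)) if rng.random() < eps else int(q[state].argmax())
    g_near, g_far = rng.exponential(2.0), rng.exponential(0.5)  # Rayleigh power gains
    reward = sum_rate(actions[a], g_near, g_far)
    next_state = min(int(g_near), n_states - 1)   # toy state: quantized near-user gain
    q[state, a] += alpha * (reward + gamma * q[next_state].max() - q[state, a])
    state = next_state

print("learned power split per state:", actions[q.argmax(axis=1)])
```

    The epsilon-greedy update is the standard Q-Learning recurrence; scaling this idea to full MIMO-NOMA, as the paper does, mainly means enlarging the state and action spaces.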

    A Comprehensive Overview on 5G-and-Beyond Networks with UAVs: From Communications to Sensing and Intelligence

    Due to the advancements in cellular technologies and the dense deployment of cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into fifth-generation (5G) and beyond cellular networks is a promising way to achieve safe UAV operation as well as to enable diversified applications with mission-specific payload data delivery. In particular, 5G networks need to support three typical usage scenarios, namely, enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). On the one hand, UAVs can be leveraged as cost-effective aerial platforms to provide ground users with enhanced communication services by exploiting their high cruising altitude and controllable maneuverability in three-dimensional (3D) space. On the other hand, providing such communication services simultaneously to both UAV and ground users poses new challenges due to the need for ubiquitous 3D signal coverage and the strong air-ground network interference. Besides high-performance wireless communications, the ability to support effective and efficient sensing as well as network intelligence is also essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting aerial and ground users. In this paper, we provide a comprehensive overview of the latest research efforts on integrating UAVs into cellular networks, with an emphasis on how to exploit advanced techniques (e.g., intelligent reflecting surfaces, short packet transmission, energy harvesting, joint communication and radar sensing, and edge intelligence) to meet the diversified service requirements of next-generation wireless systems. Moreover, we highlight important directions for further investigation in future work.
    Comment: Accepted by IEEE JSAC

    Multiple Access for Massive Machine Type Communications

    The internet we have known thus far has been an internet of people, connecting people with one another. However, these connections are forecast to occupy only a minuscule fraction of future communications. The internet of tomorrow is the internet of things. The Internet of Things (IoT) promises to improve all aspects of life by connecting everything to everything, and an enormous amount of effort is being exerted to turn this vision into reality. Sensors and actuators will communicate and operate in an automated fashion with no or minimal human intervention. In the current literature, these sensors and actuators are referred to as machines, and the communication amongst them is referred to as Machine-to-Machine (M2M) communication or Machine-Type Communication (MTC). As IoT requires a seamless mode of communication that is available anywhere and anytime, wireless communications will be one of its key enabling technologies.

    In existing wireless cellular networks, users with data to transmit first need to request channel access. All access requests are processed by a central unit that either grants or denies each request, and once granted access, users' data transmissions are non-overlapping and interference-free. However, as the number of IoT devices is forecast to reach hundreds of millions, if not billions, in the near future, the access channels of existing cellular networks are predicted to suffer from severe congestion and thus incur unpredictable latencies. In random access, by contrast, users with data to transmit access the channel in an uncoordinated and probabilistic fashion, requiring little or no signalling overhead. This reduction in overhead comes at the expense of reliability and efficiency due to the interference caused by contending users. In most existing random access schemes, packets are lost when they experience interference from other packets transmitted over the same resources; moreover, most such schemes are best-effort, with almost no Quality of Service (QoS) guarantees. In this thesis, we investigate the performance of different random access schemes in different settings to resolve the problem of massive access by IoT devices with diverse QoS guarantees.

    First, we take a step towards re-designing existing random access protocols to make them more practical and more efficient. For many years, researchers have adopted the collision channel model in random access schemes: a collision is the event of two or more users transmitting over the same time-frequency resources, in which case all the involved data is lost and users must retransmit. In practice, however, data can be recovered even in the presence of interference, provided that the signal power is sufficiently larger than the noise and interference powers. Based on this, we re-define a collision as the event of the interference power exceeding a pre-determined threshold. We propose a new analytical framework, inspired by error control codes on graphs, to compute the probability of packet recovery failure, and we optimize the random access parameters using evolution strategies. Our results show a significant improvement in reliability and efficiency.

    Next, we focus on supporting heterogeneous IoT applications and accommodating their diverse latency and reliability requirements in a unified access scheme. We propose a multi-stage approach in which each group of applications transmits in different stages with different probabilities, together with a new analytical framework to compute the probability of packet recovery failure for each group in each stage. We again optimize the random access parameters using evolution strategies. Our results show that the proposed scheme can outperform the coordinated access schemes of existing cellular networks when the number of users is very large.

    Finally, we investigate random non-orthogonal multiple access schemes, which are known to achieve higher spectral efficiency and support higher loads. In our proposed scheme, user detection and channel estimation are carried out via pilot sequences transmitted simultaneously with the user's data, and a collision is defined as the event of two or more users selecting the same pilot sequence; all collisions are regarded as interference to the remaining users. We first derive the distribution of the interference power, then use it to obtain simple yet accurate analytical bounds on the throughput and outage probability of the proposed scheme, considering both joint decoding and successive interference cancellation. We show that the proposed scheme is especially useful for short packet transmission.
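    The threshold-based collision model from the first contribution lends itself to a quick Monte Carlo check. The sketch below, assuming Rayleigh fading and illustrative parameter values (none taken from the thesis), estimates the packet recovery failure probability when a packet survives as long as its signal power sufficiently exceeds the noise-plus-interference power, i.e., its SINR stays above a capture threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

# Capture model: a packet is lost only when interference-plus-noise pushes its
# SINR below a threshold, not whenever any time-frequency overlap occurs.
n_users, p_tx = 50, 0.05    # assumed contending users and per-slot tx probability
n0, gamma_th = 0.1, 2.0     # assumed noise power and SINR capture threshold
n_slots = 100_000

failures = transmissions = 0
for _ in range(n_slots):
    active = rng.random(n_users) < p_tx
    k = int(active.sum())
    if k == 0:
        continue
    g = rng.exponential(1.0, k)      # Rayleigh-fading power gains of active users
    total = g.sum()
    sinr = g / (total - g + n0)      # each user's signal vs. everyone else's
    failures += int((sinr < gamma_th).sum())
    transmissions += k

print(f"packet recovery failure probability ~ {failures / transmissions:.3f}")
```

    Under the classical collision channel model, every slot with two or more active users would count as a total loss; the capture model above recovers a fraction of those packets, which is the gap the thesis's analytical framework quantifies.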

    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces some challenges. IoT end devices are designed primarily for uplink communication of small-sized observations toward the network; hence, opportunistically using end devices as relays requires redesigning the medium access control (MAC) layer protocol of such end devices and possibly adding new communication interfaces. Additionally, the wake-up time of IoT end devices needs to be synchronized with that of the relays. For cellular-based IoT, infrastructure relays are an option, while noncellular IoT networks can leverage the presence of mobile devices for relaying, for example in remote healthcare; the latter, however, presents the problems of incentivizing relay participation and managing relay mobility. Furthermore, although relays can increase the lifetime of IoT networks, deploying them implies the need for additional batteries to power them, which can erode the energy efficiency gain that relays offer. Therefore, designing relay-assisted IoT networks that provide acceptable trade-offs is key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as the available literature demonstrates. Works that consider these issues are surveyed in this paper to provide insight into the state of the art, offer design guidance for network designers, and motivate future research directions.