
    On the optimal user grouping in NOMA system technology

    This paper provides a state-of-the-art analysis of the most relevant studies on optimal user-aggregation strategies for non-orthogonal multiple access (NOMA) technology. Its main aims are i) to highlight how, in addition to the adoption of an optimal power allocation scheme, an optimal user-aggregation strategy is a key factor for improving NOMA system performance, and ii) to provide an exhaustive survey of the most relevant studies, which can serve as a useful starting point for defining new channel-state-aware user-aggregation strategies for NOMA systems, a research field that, at the time of writing, still remains to be investigated in greater depth. For each cited work, a detailed and complete analysis is provided, pointing out the need to guarantee a certain relationship between the users' channel gains.
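The survey's central observation is that pairing users with sufficiently different channel gains improves NOMA performance. As a rough illustration of that idea, not taken from any of the surveyed papers, the following Python sketch pairs the strongest user with the weakest and evaluates the pair's downlink NOMA rates under a fixed, hypothetical power split; the gains, the power fraction alpha, and the noise level are all assumptions.

```python
# Minimal sketch (illustrative only): channel-gain-based user pairing for
# downlink NOMA. Users are sorted by channel gain and the strongest user is
# paired with the weakest, so each pair keeps a large gain gap, which eases
# successive interference cancellation (SIC).
import numpy as np

def pair_users_by_gain(channel_gains):
    """Return index pairs (strong, weak) from a vector of channel gains."""
    order = np.argsort(channel_gains)[::-1]          # strongest first
    half = len(order) // 2
    return list(zip(order[:half], order[::-1][:half]))

def pair_rates(g_strong, g_weak, p_total=1.0, alpha=0.2, noise=1e-3):
    """Achievable rates (bit/s/Hz) of a two-user NOMA pair.
    alpha is the power fraction given to the strong user (alpha < 0.5)."""
    p_s, p_w = alpha * p_total, (1 - alpha) * p_total
    r_weak = np.log2(1 + g_weak * p_w / (g_weak * p_s + noise))   # decoded with interference
    r_strong = np.log2(1 + g_strong * p_s / noise)                # after SIC of the weak signal
    return r_strong, r_weak

gains = np.array([0.9, 0.05, 0.6, 0.01])   # hypothetical channel gains
for s, w in pair_users_by_gain(gains):
    print(s, w, pair_rates(gains[s], gains[w]))
```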

    A TLBO-BASED ENERGY EFFICIENT BASE STATION SWITCH OFF AND USER SUBCARRIER ALLOCATION ALGORITHM FOR OFDMA CELLULAR NETWORKS

    The downlink of a cellular network with orthogonal frequency-division multiple access (OFDMA) is considered. Joint base station switch-off and user subcarrier allocation with guaranteed user quality of service is shown to be a promising approach for reducing the network's total power consumption. However, solving this mixed-integer, nonlinear optimization problem requires robust and powerful optimization techniques. In this paper, a teaching-learning-based optimization (TLBO) algorithm is adopted to lower the cellular network's total power consumption. The results show that the proposed technique reduces the network's total power consumption by determining a near-optimum set of base stations to be switched off and near-optimum subcarrier-user assignments. The proposed scheme is shown to be superior to existing base station switch-off schemes, and the robustness of the proposed TLBO-based technique is verified.
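For reference, TLBO evolves a population of candidate solutions through a teacher phase (moving learners toward the current best and away from the population mean) and a learner phase (pairwise moves toward better peers). The sketch below shows the bare metaheuristic on a toy continuous objective; it is only an assumed shape for such a solver, and the paper's actual mixed-integer problem (binary switch-off states plus subcarrier assignments) would additionally require mapping or rounding these continuous learners onto feasible assignments and evaluating the network power model.

```python
# Minimal sketch of the teaching-learning-based optimization (TLBO) metaheuristic
# on a generic continuous objective. The placeholder objective is not the
# paper's power-consumption model.
import numpy as np

def tlbo(objective, dim, pop=20, iters=100, lo=0.0, hi=1.0, rng=None):
    rng = rng or np.random.default_rng(0)
    X = rng.uniform(lo, hi, (pop, dim))
    f = np.apply_along_axis(objective, 1, X)
    for _ in range(iters):
        # Teacher phase: pull learners toward the best solution, away from the mean.
        teacher = X[np.argmin(f)]
        tf = rng.integers(1, 3)                      # teaching factor in {1, 2}
        X_new = np.clip(X + rng.random((pop, dim)) * (teacher - tf * X.mean(axis=0)), lo, hi)
        f_new = np.apply_along_axis(objective, 1, X_new)
        better = f_new < f
        X[better], f[better] = X_new[better], f_new[better]
        # Learner phase: each learner moves toward a randomly chosen better peer.
        for i in range(pop):
            j = rng.integers(pop)
            if i == j:
                continue
            step = (X[j] - X[i]) if f[j] < f[i] else (X[i] - X[j])
            cand = np.clip(X[i] + rng.random(dim) * step, lo, hi)
            fc = objective(cand)
            if fc < f[i]:
                X[i], f[i] = cand, fc
    return X[np.argmin(f)], f.min()

best, val = tlbo(lambda x: np.sum(x**2), dim=5)       # toy objective
print(best, val)
```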

    Evolutionary-algorithm-assisted joint channel estimation and turbo multiuser detection/decoding for OFDM/SDMA

    The development of evolutionary algorithms (EAs), such as genetic algorithms (GAs), repeated weighted boosting search (RWBS), particle swarm optimization (PSO), and differential evolution algorithms (DEAs), has stimulated wide interest in the communication research community. However, a quantitative performance-versus-complexity comparison of GA, RWBS, PSO, and DEA techniques applied to joint channel estimation (CE) and turbo multiuser detection (MUD)/decoding in orthogonal frequency-division multiplexing/space-division multiple-access systems is a challenging problem, since it must consider both the CE problem, formulated over a continuous search space, and the MUD optimization problem, defined over a discrete search space. We investigate the capability of the GA, RWBS, PSO, and DEA to achieve optimal solutions at an affordable complexity in this challenging application. Our study demonstrates that the EA-assisted joint CE and turbo MUD/decoder is capable of approaching both the Cramér–Rao lower bound of the optimal CE and the bit error ratio (BER) performance of the idealized optimal maximum-likelihood (ML) turbo MUD/decoder associated with perfect channel state information, despite imposing only a fraction of the idealized turbo ML-MUD/decoder's complexity.
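As one concrete example of an EA searching the continuous side of such a problem, the sketch below applies a standard differential evolution loop (DE/rand/1/bin) to a toy least-squares channel-estimation cost. The pilot matrix, tap count, and noise level are illustrative assumptions rather than the paper's OFDM/SDMA setup, and the discrete MUD search is omitted.

```python
# Minimal sketch (toy model): differential evolution searching a continuous
# space for channel taps h that minimize a least-squares fit ||y - X h||^2
# given a known pilot matrix X and received samples y.
import numpy as np

rng = np.random.default_rng(1)
n_taps, n_obs = 4, 32
h_true = rng.normal(size=n_taps)                   # hypothetical channel
X = rng.normal(size=(n_obs, n_taps))               # known pilot matrix
y = X @ h_true + 0.05 * rng.normal(size=n_obs)     # noisy observations

def cost(h):
    return np.sum((y - X @ h) ** 2)

def differential_evolution(cost, dim, pop=30, iters=200, F=0.7, CR=0.9):
    P = rng.uniform(-2, 2, (pop, dim))
    f = np.apply_along_axis(cost, 1, P)
    for _ in range(iters):
        for i in range(pop):
            a, b, c = P[rng.choice(pop, 3, replace=False)]
            mutant = a + F * (b - c)                         # mutation
            cross = rng.random(dim) < CR                     # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, P[i])
            f_trial = cost(trial)
            if f_trial < f[i]:                               # greedy selection
                P[i], f[i] = trial, f_trial
    return P[np.argmin(f)]

h_hat = differential_evolution(cost, n_taps)
print("true:", h_true)
print("est.:", h_hat)
```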

    Energy efficient planning and operation models for wireless cellular networks

    Prospective demands of next-generation wireless networks are ambitious and will require cellular networks to support 1000 times higher data rates and 10 times lower round-trip latency. While this data deluge is a natural outcome of the increasing number of mobile devices with data-hungry applications and the internet of things (IoT), the low-latency demand is driven by future interactive applications such as the tactile internet, virtual and enhanced reality, and online gaming. The motivation behind this thesis is to meet the increasing quality of service (QoS) demands in wireless communications while reducing the global carbon footprint at the same time. To achieve these goals, energy efficient planning and operation models for wireless cellular networks are proposed and analyzed.

Firstly, a solution based on overlay cognitive radio (CR) along with cooperative relaying is proposed to mitigate the scarcity of the radio spectrum. In the overlay technique, the primary users (PUs) cooperate with cognitive users (CUs) for mutual benefit. The achievable cognitive rate of a two-way relaying (TWR) system assisted by multiple antennas is derived. Compared to traditional relaying, where exchanging two different messages between two sources takes four time slots, TWR reduces the required number of transmission slots to two: in the first slot, both sources transmit their signals simultaneously to the relay, and during the second slot the relay broadcasts its signal to the sources. Using the overlay CR technique, the CUs are allowed to use part of the PUs' spectrum to perform their cognitive transmission. In return, acting as amplify-and-forward (AF) TWR nodes, the CUs support the PUs in reaching their target data rates over the remaining bandwidth. A meta-heuristic approach based on the particle swarm optimization algorithm is proposed to find a near-optimal resource allocation together with the relay amplification matrix gains. Then, we investigate a multiple relay selection scheme for an energy harvesting (EH)-based TWR system. All the relays are considered EH nodes that harvest energy from renewable and radio frequency sources and forward the information to the sources. The power-splitting (PS) protocol, in which the receiver splits the input radio frequency signal into two components, one for information transmission and the other for energy harvesting, is adopted at the relay side. An approximate optimization framework based on geometric programming is established in a convex form to find near-optimal PS ratios, relay transmission powers, and the selected relays in order to maximize the total rate utility over multiple time slots. Different utility metrics are considered and analyzed depending on the level of fairness.

Secondly, a downlink resource and energy management approach for heterogeneous networks (HetNets) is proposed, where all base stations (BSs) are equipped to harvest energy from renewable energy (RE) sources. A hybrid power supply of green (renewable) energy and the traditional micro-grid is adopted, such that the traditional micro-grid is not exploited as long as the BSs can meet their power demands from harvested and stored green energy. Furthermore, dynamic BS switching ON/OFF is combined with the EH model, where some BSs are turned off during low-traffic periods, depending on their stored energy, in order to harvest more energy and contribute more efficiently during high-traffic periods. A binary linear programming (BLP) optimization problem is formulated and solved optimally to minimize the network-wide energy consumption subject to users' quality of service and BSs' power consumption constraints. Moreover, low-complexity green communication algorithms are implemented to solve the problem.

Lastly, an energy management framework for cellular HetNets supported by dynamic drone small cells is proposed. A three-tier HetNet composed of a macrocell BS, micro cell BSs (MBSs), and solar-powered drone small cell BSs is deployed to serve the network's subscribers. In addition to the RE, the drones can charge their batteries via a charging station located at the macrocell BS site. Pre-planned locations are identified by the mobile operator for possible drone placement. The objective of this framework is to jointly determine the optimal locations of the drones and the MBSs that can safely be turned off in order to minimize the daily energy consumption of the network. The framework also takes into account the cells' capacities and the QoS level defined by the minimum required receive power. A BLP problem is formulated to optimally determine the network status over a time-slotted horizon.
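To make the kind of formulation described above concrete, here is a small, hypothetical binary linear program for joint BS switch-off and user association, written with the PuLP modelling library. The power figures, capacities, and coverage matrix are invented for illustration, and the model omits the energy harvesting, storage, and drone-placement aspects of the thesis.

```python
# Minimal sketch (hypothetical sizes and power figures, not the thesis model):
# a binary linear program that picks which base stations stay ON and assigns
# each user to one active BS, minimizing total BS power subject to coverage
# and per-BS capacity constraints.
import pulp

n_bs, n_users = 3, 6
p_on = [100.0, 60.0, 60.0]                    # static power per active BS (W)
capacity = [4, 3, 3]                          # max users per BS
# covers[b][u] = 1 if BS b can serve user u with the minimum required receive power
covers = [[1, 1, 1, 1, 1, 1],
          [1, 1, 0, 0, 1, 0],
          [0, 0, 1, 1, 0, 1]]

prob = pulp.LpProblem("bs_switch_off", pulp.LpMinimize)
on = [pulp.LpVariable(f"on_{b}", cat=pulp.LpBinary) for b in range(n_bs)]
x = [[pulp.LpVariable(f"x_{b}_{u}", cat=pulp.LpBinary) for u in range(n_users)]
     for b in range(n_bs)]

prob += pulp.lpSum(p_on[b] * on[b] for b in range(n_bs))          # total power
for u in range(n_users):                                          # each user served exactly once
    prob += pulp.lpSum(x[b][u] for b in range(n_bs)) == 1
for b in range(n_bs):
    prob += pulp.lpSum(x[b][u] for u in range(n_users)) <= capacity[b] * on[b]
    for u in range(n_users):
        prob += x[b][u] <= covers[b][u]                           # QoS/coverage constraint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("BS on/off:", [int(v.value()) for v in on])
```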

    Multiple Access for Massive Machine Type Communications

    The internet we have known thus far has been an internet of people, as it has connected people with one another. However, these connections are forecast to make up only a minuscule fraction of future communications. The internet of tomorrow is, indeed, the internet of things. The Internet of Things (IoT) promises to improve all aspects of life by connecting everything to everything, and an enormous amount of effort is being exerted to turn this vision into reality. Sensors and actuators will communicate and operate in an automated fashion with no or minimal human intervention. In the current literature, these sensors and actuators are referred to as machines, and the communication amongst them is referred to as Machine-to-Machine (M2M) communication or Machine-Type Communication (MTC). As IoT requires a seamless mode of communication that is available anywhere and anytime, wireless communications will be one of the key enabling technologies for IoT. In existing wireless cellular networks, users with data to transmit first need to request channel access. All access requests are processed by a central unit that either grants or denies them. Once granted access, users' data transmissions are non-overlapping and interference-free. However, as the number of IoT devices is forecast to be in the order of hundreds of millions, if not billions, in the near future, the access channels of existing cellular networks are predicted to suffer from severe congestion and thus incur unpredictable latencies. In random access, on the other hand, users with data to transmit access the channel in an uncoordinated and probabilistic fashion, requiring little or no signalling overhead. However, this reduction in overhead comes at the expense of reliability and efficiency due to the interference caused by contending users. In most existing random access schemes, packets are lost when they experience interference from other packets transmitted over the same resources. Moreover, most existing random access schemes are best-effort schemes with almost no Quality of Service (QoS) guarantees.

In this thesis, we investigate the performance of different random access schemes in different settings to resolve the problem of massive access by IoT devices with diverse QoS guarantees. First, we take a step towards re-designing existing random access protocols so that they are more practical and more efficient. For many years, researchers have adopted the collision channel model in random access schemes: a collision is the event of two or more users transmitting over the same time-frequency resources, and in the event of a collision all the involved data is lost and the users need to retransmit their information. In practice, however, data can be recovered even in the presence of interference, provided that the power of the signal is sufficiently larger than the combined power of the noise and the interference. Based on this, we re-define a collision as the event of the interference power exceeding a pre-determined threshold. We propose a new analytical framework, inspired by error control codes on graphs, to compute the probability of packet recovery failure, and we optimize the random access parameters using evolution strategies. Our results show a significant improvement in performance in terms of reliability and efficiency.
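As a rough illustration of the difference between the two collision models discussed above, the following Monte Carlo sketch compares per-packet success under the classic collision model with success under an SINR (capture) threshold. The user count, access probability, fading model, and threshold are all hypothetical, and the sketch does not reproduce the thesis's graph-based analytical framework.

```python
# Minimal sketch (hypothetical parameters): Monte Carlo comparison of the classic
# collision model (any overlap destroys all packets) with a threshold-based model,
# where a packet is recovered as long as its SINR stays above a capture threshold.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_slots, p_tx = 50, 10, 0.05          # users, slots per frame, access probability
sinr_th, noise = 3.0, 0.1                      # capture threshold and noise power
frames = 20000

classic_ok, capture_ok, sent = 0, 0, 0
for _ in range(frames):
    active = rng.random(n_users) < p_tx                     # who transmits this frame
    slots = rng.integers(n_slots, size=n_users)             # chosen slot per user
    power = rng.exponential(1.0, size=n_users)              # Rayleigh-fading received power
    for u in np.flatnonzero(active):
        sent += 1
        same = active & (slots == slots[u])
        interference = power[same].sum() - power[u]
        if same.sum() == 1:                                 # alone in the slot
            classic_ok += 1
        if power[u] / (interference + noise) >= sinr_th:    # capture model
            capture_ok += 1

print("collision-model throughput:", classic_ok / sent)
print("threshold-model throughput:", capture_ok / sent)
```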
Next, we focus on supporting heterogeneous IoT applications and accommodating their diverse latency and reliability requirements in a unified access scheme. We propose a multi-stage approach in which each group of applications transmits in different stages with different probabilities. We propose a new analytical framework to compute the probability of packet recovery failure for each group in each stage, and again optimize the random access parameters using evolution strategies. Our results show that the proposed scheme can outperform the coordinated access schemes of existing cellular networks when the number of users is very large.

Finally, we investigate random non-orthogonal multiple access schemes, which are known to achieve higher spectral efficiency and to support higher loads. In our proposed scheme, user detection and channel estimation are carried out via pilot sequences that are transmitted simultaneously with the user's data. Here, a collision is defined as the event of two or more users selecting the same pilot sequence, and all collisions are regarded as interference to the remaining users. We first study the distribution of the interference power and derive its expression. We then use this expression to derive simple yet accurate analytical bounds on the throughput and outage probability of the proposed scheme, considering both joint decoding and successive interference cancellation. We show that the proposed scheme is especially useful in the case of short packet transmission.
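As a small numerical illustration of the pilot-collision event defined above, the sketch below checks the closed-form probability that at least one of the other active users picks a given user's pilot, 1 - (1 - 1/Q)^(K-1), against simulation. The values of Q and K and the uniform pilot selection are assumptions made for illustration only.

```python
# Minimal sketch (hypothetical numbers): pilot-collision probability when K active
# users each pick one of Q pilot sequences uniformly at random; a "collision" is
# two or more users choosing the same pilot.
import numpy as np

rng = np.random.default_rng(0)
Q, K, trials = 64, 20, 200000                # pilots, active users, Monte Carlo runs

p_analytic = 1.0 - (1.0 - 1.0 / Q) ** (K - 1)

hits = 0
for _ in range(trials):
    pilots = rng.integers(Q, size=K)         # each active user picks a pilot uniformly
    if np.any(pilots[1:] == pilots[0]):      # does anyone collide with user 0?
        hits += 1

print("analytic :", p_analytic)
print("simulated:", hits / trials)
```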