
    A Practical Cooperative Multicell MIMO-OFDMA Network Based on Rank Coordination

    An important challenge of wireless networks is to boost cell edge performance and enable multi-stream transmissions to cell edge users. Interference mitigation techniques relying on multiple antennas and coordination among cells are nowadays heavily studied in the literature. Typical strategies in OFDMA networks include coordinated scheduling, beamforming and power control. In this paper, we propose a novel and practical type of coordination for OFDMA downlink networks relying on multiple antennas at the transmitter and the receiver. The transmission ranks, i.e. the number of transmitted streams, and the user scheduling in all cells are jointly optimized in order to maximize a network utility function accounting for fairness among users. A distributed coordinated scheduler motivated by an interference pricing mechanism and relying on a master-slave architecture is introduced. The proposed scheme operates on the user's report of a recommended rank for the interfering cells, which accounts for the receiver's interference suppression capability. It incurs very low feedback and backhaul overhead and enables efficient link adaptation. It is moreover robust to channel measurement errors and applicable to both open-loop and closed-loop MIMO operation. A 20% cell edge performance gain over an uncoordinated LTE-A system is shown through system-level simulations.
    Comment: IEEE Transactions on Wireless Communications, accepted for publication.
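    As an illustration of the kind of priced, fairness-aware scheduling decision described above, the sketch below picks a user and transmission rank that maximise a proportional-fair metric minus a per-stream interference price. All function names, rates and price values are hypothetical placeholders, not the paper's algorithm.

    # Illustrative sketch (not the paper's algorithm): choose a user and a
    # transmission rank so as to maximise a proportional-fair network utility,
    # with a simple interference "price" penalising higher ranks.
    import math

    def select_user_and_rank(candidates, avg_thpt, interference_price):
        """candidates: {user: {rank: achievable rate in Mbit/s}}
           avg_thpt:   {user: long-term average throughput (PF fairness term)}
           interference_price: cost per extra stream charged by neighbour cells
           Returns the (user, rank) pair maximising the priced PF metric."""
        best, best_metric = None, -math.inf
        for user, rates in candidates.items():
            for rank, rate in rates.items():
                # proportional-fair weight minus a price that grows with rank
                metric = rate / avg_thpt[user] - interference_price * rank
                if metric > best_metric:
                    best, best_metric = (user, rank), metric
        return best

    # Example: two users, ranks 1 and 2, interference price of 0.3 per stream
    cands = {"ue1": {1: 10.0, 2: 16.0}, "ue2": {1: 6.0, 2: 7.0}}
    print(select_user_and_rank(cands, {"ue1": 8.0, "ue2": 2.0}, 0.3))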

    Frame Structure Design and Analysis for Millimeter Wave Cellular Systems

    The millimeter-wave (mmWave) frequencies have attracted considerable attention for fifth generation (5G) cellular communication as they offer orders of magnitude greater bandwidth than current cellular systems. However, the medium access control (MAC) layer may need to be significantly redesigned to support the highly directional transmissions, ultra-low latencies and high peak rates expected in mmWave communication. To address these challenges, we present a novel mmWave MAC layer frame structure with a number of enhancements, including flexible, highly granular transmission times, dynamic control signal locations, extended messaging and the ability to efficiently multiplex directional control signals. Analytic formulae are derived for the utilization and control overhead as a function of control periodicity, number of users, traffic statistics, signal-to-noise ratio and antenna gains. Importantly, the analysis can incorporate various front-end MIMO capability assumptions -- a critical feature of mmWave. Under realistic system and traffic assumptions, the analysis reveals that the proposed flexible frame structure design offers significant benefits over designs with fixed frame structures similar to current 4G long-term evolution (LTE). It is also shown that fully digital beamforming architectures offer significantly lower overhead compared to analog and hybrid beamforming under equivalent power budgets.
    Comment: Submitted to IEEE Transactions on Wireless Communications.
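    To give a feel for how control overhead scales with control periodicity and the number of users, here is a minimal sketch under simplified assumptions (one control opportunity per user per period, no beam sweeping); it is not the paper's derivation.

    def control_overhead(period_ms, n_users, ctrl_us_per_user):
        """Fraction of air time spent on control when each of n_users gets one
           control opportunity of ctrl_us_per_user microseconds per period.
           Directional sweeping would multiply ctrl_us_per_user by the number
           of beam directions."""
        ctrl_time_us = n_users * ctrl_us_per_user
        period_us = period_ms * 1000.0
        overhead = ctrl_time_us / period_us
        return overhead, 1.0 - overhead   # (control overhead, utilisation)

    # Example: 10 users, 20 us of control each, 1 ms control period
    print(control_overhead(period_ms=1.0, n_users=10, ctrl_us_per_user=20.0))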

    Resource Allocation Frameworks for Network-coded Layered Multimedia Multicast Services

    The explosive growth of content-on-the-move, such as video streaming to mobile devices, has propelled research on multimedia broadcast and multicast schemes. Multi-rate transmission strategies have been proposed as a means of delivering layered services to users experiencing different downlink channel conditions. In this paper, we consider Point-to-Multipoint layered service delivery across a generic cellular system and improve it by applying different random linear network coding approaches. We derive packet error probability expressions and use them as performance metrics in the formulation of resource allocation frameworks. The aim of these frameworks is both the optimization of the transmission scheme and the minimization of the number of broadcast packets on each downlink channel, while offering service guarantees to a predetermined fraction of users. As a case study, our proposed frameworks are then adapted to the LTE-A standard and the eMBMS technology. We focus on the delivery of a video service based on the H.264/SVC standard and demonstrate the advantages of layered network coding over multi-rate transmission. Furthermore, we establish that the choice of both the network coding technique and the resource allocation method plays a critical role in the network footprint and the quality of each received video layer.
    Comment: IEEE Journal on Selected Areas in Communications - Special Issue on Fundamental Approaches to Network Coding in Wireless Communication Systems. To appear.
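    A standard building block for such packet error probability expressions is the probability that a random linear network coding matrix over GF(q) is full rank; the sketch below computes that textbook quantity and is not necessarily the exact expression derived in the paper.

    # Textbook RLNC decoding probability over GF(q): a generation of k source
    # packets is decodable from n >= k received coded packets iff the random
    # k-by-n coding matrix has rank k.
    def rlnc_decoding_prob(k, n, q=256):
        if n < k:
            return 0.0
        p = 1.0
        for i in range(k):
            p *= 1.0 - q ** -(n - i)
        return p

    # Example: 32-packet generation, 34 coded packets received over GF(256)
    print(rlnc_decoding_prob(k=32, n=34))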

    Device-to-Device Communication and Multihop Transmission for Future Cellular Networks

    Next-generation wireless networks, i.e. 5G, aim to provide multi-Gbps data traffic in order to satisfy the increasing demand for high-definition video, among other high data rate services, as well as the exponential growth in mobile subscribers. To achieve this dramatic increase in data rates, current research focuses on improving the capacity of current 4G network standards, based on Long Term Evolution (LTE), before more radical changes, such as acquiring additional or new spectrum, are exploited. The LTE network has a reuse factor of one, so neighbouring cells/sectors use the same spectrum, which makes cell edge users vulnerable to inter-cell interference; wireless transmission is, in addition, commonly hindered by fading and pathloss. In this direction, this thesis focuses on improving the performance of cell edge users in LTE and LTE-Advanced (LTE-A) networks, initially by implementing a new Coordinated Multi-Point (CoMP) algorithm to mitigate cell edge user interference. Subsequently, Device-to-Device (D2D) communication is investigated as the enabling technology for maximising Resource Block (RB) utilisation in current 4G and emerging 5G networks. It is demonstrated that, as an extension of the above, novel power control algorithms to reduce the required D2D transmit power, together with multihop transmission for relaying D2D traffic, can further enhance network performance.
    To develop the aforementioned technologies and evaluate the performance of new algorithms in emerging network scenarios, a beyond-the-state-of-the-art LTE system-level simulator (SLS) was implemented. The new simulator includes Multiple-Input Multiple-Output (MIMO) antenna functionalities, comprehensive channel models (such as Wireless World Initiative New Radio II, i.e. WINNER II) and adaptive modulation and coding schemes to accurately emulate the LTE and LTE-A network standards. Additionally, a novel interference modelling scheme using the 'wrap around' technique was proposed and implemented; it maintains the topology of flat-surfaced maps, allowing use with cell planning tools while obtaining accurate and timely results in the SLS compared to the few existing platforms.
    For the proposed CoMP algorithm, adaptive beamforming was employed to reduce interference on cell edge UEs by applying Coordinated Scheduling (CoSH) between cooperating cells. Simulation results show up to a 2-fold improvement in throughput, as well as an SINR gain for the cell edge UEs in the cooperating cells. Furthermore, D2D communication underlaying the LTE network (and future generations of wireless networks) was investigated. This technology exploits the proximity of users in a network to achieve higher data rates with maximum RB utilisation (as it reuses the cellular RBs simultaneously), while taking some load off the Evolved Node B (eNB) through direct communication between User Equipment (UE). Simulation results show that the proximity and transmission power of D2D transmission yield high performance gains for a D2D receiver, exceeding those of cellular UEs with better channel conditions or in close proximity to the eNB. The interference from the simultaneous transmission, however, impedes the achievable data rates of cellular UEs in the network, especially at the cell edge.
    Thus, a power control algorithm was proposed to mitigate the impact of interference in the hybrid network (a network consisting of both cellular and D2D UEs). It was implemented by setting a minimum SINR threshold so that the cellular UEs achieve a minimum performance, and equally a maximum SINR threshold to establish fairness for the D2D transmission as well. Simulation results show an increase in cell edge throughput and a notable improvement in the overall SINR distribution of UEs in the hybrid network. Additionally, multihop transmission for D2D UEs was investigated in the hybrid network: traditionally, such a scheme is used to relay cellular traffic in a homogeneous network. Contrary to most current studies, where D2D UEs are employed to relay cellular traffic, this thesis uniquely uses idle nodes to relay D2D traffic. Simulation results show an improvement in D2D receiver throughput with multihop transmission, significantly better than the performance of the same UEs with an equivalent distance between the D2D pair when using single-hop transmission.
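    A minimal sketch of the threshold idea described above follows; the step sizes, limits and function names are assumptions for illustration, not the thesis implementation.

    # Illustrative threshold-based D2D power control: the D2D transmitter backs
    # off while the victim cellular UE is below its minimum SINR, and also caps
    # its own SINR at a maximum so the D2D link does not take an unfair share.
    def adjust_d2d_power(p_d2d_dbm, cell_sinr_db, d2d_sinr_db,
                         sinr_min_db=0.0, sinr_max_db=25.0, step_db=1.0,
                         p_min_dbm=-40.0, p_max_dbm=23.0):
        if cell_sinr_db < sinr_min_db or d2d_sinr_db > sinr_max_db:
            p_d2d_dbm -= step_db      # back off: protect cellular UE / cap D2D
        elif d2d_sinr_db < sinr_max_db - step_db:
            p_d2d_dbm += step_db      # headroom available: ramp up slowly
        return min(max(p_d2d_dbm, p_min_dbm), p_max_dbm)

    # Example: a victim cellular UE at -2 dB SINR forces the D2D pair to back off
    print(adjust_d2d_power(p_d2d_dbm=10.0, cell_sinr_db=-2.0, d2d_sinr_db=15.0))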

    Throughput Optimization and Energy Efficiency of the Downlink in the LTE System

    Nowadays, smartphone usage is very popular, and more and more people access the Internet with their smartphones, which demands higher data rates from mobile network operators. Every year the number of users and the amount of information transferred increase dramatically. Wireless technology must therefore deliver high data rates to compete with wire-based technology, its main advantage being user mobility. The 4G LTE system made it possible to achieve very high peak data rates. The purpose of this thesis was to investigate the improvement of downlink system performance under different antenna configurations and different scheduling algorithms. Moreover, the fairness between users under the different schedulers has been analyzed and evaluated. Furthermore, the energy efficiency of the scheduling algorithms in the downlink of LTE systems has been considered. Some important parts of the LTE system are described in the theoretical part of this thesis.
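    One common way to quantify the fairness of different schedulers, as evaluated in such studies, is Jain's fairness index; a minimal sketch follows, with illustrative throughput values.

    # Jain's fairness index, a common metric for comparing how evenly different
    # schedulers (round robin, proportional fair, max C/I, ...) distribute
    # throughput across users; 1.0 means perfectly equal shares.
    def jain_fairness(throughputs):
        n = len(throughputs)
        total = sum(throughputs)
        return total * total / (n * sum(t * t for t in throughputs))

    # Example: a max-C/I style allocation is far less fair than an even split
    print(jain_fairness([5.0, 5.0, 5.0, 5.0]))   # 1.0
    print(jain_fairness([17.0, 1.0, 1.0, 1.0]))  # about 0.34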

    Analysis of MAC-level throughput in LTE systems with link rate adaptation and HARQ protocols

    LTE is rapidly gaining momentum for building future 4G cellular systems, and real operational networks are under deployment worldwide. To achieve high throughput performance, in addition to an advanced physical layer design, LTE exploits a combination of sophisticated mechanisms at the radio resource management layer. Clearly, this makes it difficult to develop analytical tools to accurately assess and optimise the user-perceived throughput under realistic channel assumptions. Thus, most existing studies focus only on link-layer throughput or consider individual mechanisms in isolation. The main contribution of this paper is a unified modelling framework for the MAC-level downlink throughput of a single LTE cell, which caters for wideband CQI feedback schemes, AMC and HARQ protocols as defined in the LTE standard. We have validated the accuracy of the proposed model through detailed LTE simulations carried out with the ns-3 simulator extended with the LENA module for LTE.
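    A simplified building block often used in this kind of MAC-level model is the expected goodput of a single MCS under HARQ with a bounded number of transmissions; the sketch below illustrates it under independent-error assumptions and is not the paper's unified framework.

    # Simplified HARQ goodput: each (re)transmission fails independently with
    # probability `bler`, at most `max_tx` attempts are allowed, and combining
    # gains across retransmissions are ignored.
    def harq_goodput(tbs_bits, tti_s, bler, max_tx=4):
        p_delivered = 1.0 - bler ** max_tx   # delivered within max_tx attempts
        # expected number of transmissions spent on this transport block
        exp_tx = sum((1.0 - bler) * bler ** (k - 1) * k
                     for k in range(1, max_tx + 1))
        exp_tx += max_tx * bler ** max_tx    # all attempts used, block still lost
        return p_delivered * tbs_bits / (exp_tx * tti_s)

    # Example: 10,000-bit transport block per 1 ms TTI, 10% BLER, 4 HARQ attempts
    print(harq_goodput(tbs_bits=10000, tti_s=1e-3, bler=0.1, max_tx=4))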

    Energy Management in LTE Networks

    Wireless cellular networks have seen dramatic growth in the number of mobile users. As a result, data requirements, and hence base-station power consumption, have increased significantly. This in turn adds to operational expenditure and also contributes to global warming. Base station power consumption in Long Term Evolution (LTE) has therefore become a major challenge for vendors seeking to stay green and profitable in the competitive cellular industry, and it necessitates novel methods for energy efficient communication in LTE. The importance of the topic has attracted substantial research interest worldwide. Energy saving (ES) approaches proposed in the literature can be broadly classified into the categories of energy efficient resource allocation, load balancing, carrier aggregation, and bandwidth expansion. Each of these methods has its own pros and cons, leading to a tradeoff between ES and other performance metrics and resulting in open research questions. This paper discusses various ES techniques for LTE systems and critically analyses their usability through a comprehensive comparative study.