
    Scheduler Algorithms for MU-MIMO

    In multi-user multiple input multiple output (MU-MIMO), the complexity of the base-station scheduler increases further compared to single-user multiple input multiple output (SU-MIMO). The scheduler must determine whether several users can be spatially multiplexed in the same time-frequency resource. One way to spatially separate users is through beamforming with sufficiently many antennas. In this thesis work, two downlink beamforming algorithms for MU-MIMO are studied: the first implements precoding without considering inter-cell interference (ICI); the second considers it and attempts to mitigate or null transmissions in the direction of user equipments (UEs) in other cells. The two algorithms are evaluated in SU-MIMO and MU-MIMO setups operating in time division duplex (TDD) mode and serving single- and dual-antenna terminals. Full-buffer (FB) and file transfer protocol (FTP) data traffic profiles are studied. Additionally, various UE mobility patterns, UE transmit antenna topologies, sounding reference signal (SRS) periodicity configurations, and uniform linear array (ULA) topologies are considered. Simulations have been performed using a system-level simulation framework developed by Ericsson AB. Another important part of this thesis work is the functional verification of this simulation framework, which at the time of writing is still under development. Our simulation results show that in SU-MIMO, the second algorithm, which considers ICI, outperforms the first for the FB traffic profile at all UE speeds, but not for the FTP traffic profile at medium (30 km/h) or high (60 km/h) UE speeds; in that case, the first algorithm, which does not consider ICI, can be used to advantage. In MU-MIMO, cell downlink throughput gains are observed for the second algorithm over the first at low and medium system loads (number of users).
    For both algorithms, the cell throughput is observed to decrease with increasing UE speed and sounding periodicity.

    Scheduling in modern wireless standards, e.g., 3G, 4G, and future 5G, can be defined as the task of allocating time and frequency resources by the base station (BS) to each user equipment (UE) that wants to engage in communication. Resources are allocated every transmission time interval (TTI), which is typically one millisecond. Both uplink (from the UEs to the BS) and downlink (from the BS to the UEs) resource schedulers are implemented in the eNodeB, i.e., the BS in Long Term Evolution (LTE). The aim of this thesis work is to study how various communication techniques proposed for 5G can increase the overall downlink (DL) system throughput when a realistic resource scheduler is used. In particular, we consider: (i) beamforming, (ii) multi-user multiple input multiple output (MU-MIMO), and (iii) inter-cell interference (ICI) mitigation. Beamforming can be achieved by deploying a large number of antenna elements at the BS with the aim of increasing the signal to interference plus noise ratio (SINR) towards the UE. Contrary to single-user multiple input multiple output (SU-MIMO), in MU-MIMO more than one UE is scheduled for transmission in the same time-frequency resource; this is possible by judiciously pairing UEs that are spatially sufficiently separated (according to some metric that we will define later). ICI mitigation can be achieved by means of proper precoding at the BS, where the precoder attempts to mitigate the interfering signal from the BS towards UEs belonging to neighboring cells. In this thesis work, we investigate the performance of two scheduler algorithms for MU-MIMO, using SU-MIMO as a baseline. The first algorithm does not consider ICI, while the second one does. Dual-layer beamforming (that is, two independent data streams are transmitted to each UE) and time division duplex (TDD) are assumed.
    In TDD mode, the BS acquires the channel information from sounding reference signals (SRS) transmitted in the uplink (UL) and, by virtue of channel reciprocity, reuses the so-obtained channel information in the downlink. The performance evaluation of the two algorithms is based on the following parameters: UE traffic profile, UE speed, SRS UL antenna configuration, SRS parameters, and BS antenna topology.
    - UE speed includes 3, 30, and 60 km/h.
    - UE traffic profile includes full-buffer (FB) and file transfer protocol (FTP). With the FB traffic profile, UEs send/receive data to/from the BS all the time, while this is not the case with the FTP traffic profile. Examples of FTP traffic profiles include chatty, video, VoIP, web, etc.
    - SRS UL antenna configuration includes: (i) two SRS, in which each UE sends two SRS to the BS from two antennas; (ii) one SRS with antenna selection, in which each UE alternately sends one SRS to the BS from each of two antennas; and (iii) one SRS without antenna selection, in which each UE sends one SRS to the BS from only one antenna. (Note that in the downlink two layers, and hence two UE antennas, are always used.)
    - SRS parameters include SRS bandwidth and SRS periodicity. In this thesis work, full-bandwidth SRS (20 MHz) with various SRS periodicities, such as 5 ms, 10 ms, and 20 ms, is considered.
    - BS antenna topology includes 8 and 64 antenna elements at the BS.
    The main result of this thesis work is that in both SU-MIMO and MU-MIMO with the FB traffic profile, it is better to use the second algorithm, which considers ICI, rather than the first, which does not. However, with the FTP traffic profile, this is not always the case.
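The spatial-separation metric used for pairing is not specified in the abstract above; a minimal sketch of the general idea, assuming a normalized channel-correlation metric and a greedy pairing pass (the function names and the 0.3 threshold are illustrative choices, not the thesis algorithm):

```python
import numpy as np

def pairable(h1, h2, threshold=0.3):
    """Two UEs can share a time-frequency resource if their channel
    vectors are sufficiently decorrelated: normalized inner product
    below a threshold (assumed metric, for illustration only)."""
    corr = abs(np.vdot(h1, h2)) / (np.linalg.norm(h1) * np.linalg.norm(h2))
    return corr < threshold

def pair_users(channels, threshold=0.3):
    """Greedy pairing of UEs for MU-MIMO co-scheduling: walk the user
    list and pair each unpaired UE with the first compatible partner."""
    pairs, used = [], set()
    for i in range(len(channels)):
        if i in used:
            continue
        for j in range(i + 1, len(channels)):
            if j not in used and pairable(channels[i], channels[j], threshold):
                pairs.append((i, j))
                used.update({i, j})
                break
    return pairs
```

A real scheduler would combine such a compatibility test with per-TTI rate metrics; this sketch only shows the spatial-separation gate.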

    3GPP LTE Release 9 and 10 requirement analysis to physical layer UE testing

    The purpose of this thesis was to analyze the testing requirements for physical layer features used in the LTE Release 9 and 10 timeframe. The aim of the analysis was to define test case requirements for new features from the physical layer point of view. This analysis can then be utilized to design and implement test cases using commercial eNB simulators. The analysis was carried out by studying the 3GPP specifications and by investigating the integration and system level testing requirements. Different feature-specific parameters were evaluated and different testing aspects were studied in order to verify the functionality and performance of the UE. Also, different conformance test case scenarios and field testing aspects were investigated in order to improve test case planning in the integration and system testing phase. The analysis showed that in Rel-9 there are two main features with a great impact on Rel-9 physical layer testing: dual-layer beamforming and UE positioning, the latter performed with the OTDOA and E-CID methods. The analysis found that the requirements for downlink dual-layer beamforming focus on the TDD side, and that the test plan must contain throughput performance testing in particular during integration and system phase testing. The OTDOA and E-CID methods, on the other hand, need test plans that concentrate on positioning accuracy. In Rel-10, the analysis showed that there are plenty of new physical layer features to ensure the transition from LTE to LTE-Advanced. The main requirements were assigned to the CA feature, whose testing activities focus especially on the UE feedback operations. Also, different kinds of CA deployment scenarios were analyzed to evaluate more closely the initial CA testing scenarios in integration and system testing. The analysis continued with downlink multi-layer beamforming, where the requirements were seen to concentrate on new CSI-RS aspects and on throughput performance testing.
    Uplink MIMO aspects were analyzed at the end, and the studies showed that this feature may have a minor role in the Rel-10 timeframe and therefore has no important testing requirements that should be taken into account in test plans.

    Interference mitigation in cognitive femtocell networks

    “A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy”.
    Femtocells have been introduced as a solution to poor indoor coverage in cellular communication, which has strongly attracted network operators and stakeholders. Femtocells are designed to co-exist alongside macrocells, providing, among other benefits, improved spatial frequency reuse and higher spectrum efficiency. Therefore, when they are deployed in the two-tier architecture with macrocells, it is necessary to mitigate the inherent co-tier and cross-tier interference. The integration of cognitive radio (CR) in femtocells gives femtocells the ability to dynamically adapt to varying network conditions through learning and reasoning. This research work focuses on the exploitation of cognitive radio in femtocells to mitigate the mutual interference caused in the two-tier architecture. The research presents original contributions to mitigating interference in femtocells through practical approaches, including a power control scheme in which femtocells adaptively control their transmit power levels to reduce the interference they cause in the network. This is especially useful since femtocells are user-deployed, and the scheme mitigates interference arising from their blind placement in an indoor environment. Hybrid interference mitigation schemes which combine power control and resource allocation/scheduling are also implemented. In a joint threshold-power-based admittance and contention-free resource allocation scheme, the mutual interference between a Femtocell Access Point (FAP) and close-by User Equipments (UEs) is mitigated based on admittance. Also employed is a hybrid scheme in which FAPs opportunistically use Resource Blocks (RBs) of Macrocell User Equipments (MUEs) based on traffic load.
    Simulation analysis shows improvements when these schemes are applied, with emphasis on Long Term Evolution (LTE) networks, especially in terms of Signal to Interference plus Noise Ratio (SINR).
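As a rough illustration of the adaptive power-control idea (not the thesis's scheme), a femtocell can choose the lowest transmit power that meets a target SINR at its own UE, capped at the hardware maximum; all parameter names here are hypothetical and the model is a single-link simplification:

```python
def fap_transmit_power(p_max, target_sinr_db, path_gain_fue,
                       interference_plus_noise):
    """Illustrative femtocell power control: pick the smallest power
    (linear units) that achieves target_sinr_db at the femtocell UE,
    given the UE's path gain and measured interference-plus-noise,
    never exceeding the hardware cap p_max."""
    target = 10 ** (target_sinr_db / 10)            # dB -> linear
    p_needed = target * interference_plus_noise / path_gain_fue
    return min(p_needed, p_max)
```

Because the FAP never transmits more than the target requires, the cross-tier interference it injects into the macro layer is kept as low as the QoS target allows.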

    A Review of MAC Scheduling Algorithms in LTE System

    Recent wireless communication networks rely on Long Term Evolution (LTE) to offer high-data-rate real-time (RT) traffic with better Quality of Service (QoS) for the increasing demands of customers. LTE provides low latency for real-time services with high throughput, with the help of two-level packet retransmission. Hybrid Automatic Repeat Request (HARQ) retransmission at the Medium Access Control (MAC) layer of LTE networks achieves error-free data transmission. The performance of LTE networks depends mainly on how effectively this HARQ, carried over from the Universal Mobile Telecommunication System (UMTS), is adopted in the standard. The major challenge in LTE is to balance QoS and fairness among the users. Hence, it is essential to design a downlink scheduling scheme that delivers the expected service quality to customers and utilizes the system resources efficiently. This paper provides a comprehensive literature review of the LTE MAC layer and six types of QoS/channel-aware downlink scheduling algorithms designed for this purpose. The contributions of this paper are to identify the gaps of knowledge in the downlink scheduling procedure and to point out future research directions. Based on the comparative study of the algorithms taken for review, the paper concludes that the EXP Rule scheduler is best suited for LTE networks due to its low Packet Loss Ratio (PLR), low Packet Delay (PD), high throughput, fairness, and spectral efficiency.
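One common formulation of the EXP Rule metric favored by the review above can be sketched as follows; the exact variant used in the surveyed papers may differ, and the `alphas` here are assumed per-user QoS weights:

```python
import math

def exp_rule_metrics(rates, delays, alphas):
    """EXP Rule metric (one common formulation):
    w_i = r_i * exp((a_i * W_i - avg) / (1 + sqrt(avg))),
    where r_i is user i's instantaneous achievable rate, W_i its
    head-of-line packet delay, and avg = mean of a_i * W_i. The
    exponential term boosts users whose delay is above average."""
    avg = sum(a * w for a, w in zip(alphas, delays)) / len(delays)
    return [r * math.exp((a * w - avg) / (1 + math.sqrt(avg)))
            for r, w, a in zip(rates, delays, alphas)]

def exp_rule_pick(rates, delays, alphas):
    """Schedule the user with the largest EXP Rule metric."""
    m = exp_rule_metrics(rates, delays, alphas)
    return m.index(max(m))
```

With equal rates the most-delayed user wins, and with equal delays the metric reduces to a pure rate comparison, which is why the rule balances delay fairness against throughput.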

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks must soon migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks: tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges in network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP, and D2D. Most importantly, the article provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    A simulation study of beam management for 5G millimeter-wave cellular networks

    This thesis aims at performing a system-level analysis of the beam management protocol under different scenarios, mobility conditions, and parameter configurations.

    Efficient and Virtualized Scheduling for OFDM-Based High Mobility Wireless Communications Objects

    Service providers (SPs) in long term evolution (LTE) systems are enduring many challenges in order to accommodate the rapid expansion of mobile data usage. Modern technologies present new challenges to SPs, for example, reducing capital and operating expenditures while supporting high data throughput per customer, extending the battery life per charge of cell phone devices, and supporting high-mobility communications with a fast and seamless handover (HO) networking architecture. In this thesis, a variety of optimized techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into three parts. The first part outlines the benefits and challenges of deploying the virtualized resource sharing concept, wherein SPs applying different scheduler policies share an evolved Node B, allowing SPs to customize their offerings and meet service requirements. This is a promising solution for reducing operational and capital expenditures, leading to potential energy savings, and supporting higher peak rates. The second part formulates the optimized power allocation problem in a virtualized scheme in LTE uplink systems, aiming to extend the mobile devices' battery utilization time per charge. The third part proposes a hybrid-HO (HY-HO) technique that can enhance system performance in terms of latency and HO reliability at the cell boundary for high-mobility objects (up to 350 km/h, where HOs occur more frequently). The main contributions of this thesis are the design of optimal binary-integer-programming-based and suboptimal heuristic (complexity-reduced) scheduling algorithms subject to exclusive and contiguous allocation, maximum transmission power, and rate constraints.
    Moreover, designing the HY-HO as a combination of soft and hard HO was able to enhance system performance in terms of latency, interruption time, and reliability during HO. The results show that the proposed solutions effectively contribute to addressing the challenges caused by the demand for high data rates and transmission power in mobile networks, especially in virtualized resource sharing scenarios that can support high data rates while improving quality of service (QoS).
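The exclusive and contiguous allocation constraints mentioned above (each resource block serves at most one user, and each user's blocks must be adjacent, as in LTE SC-FDMA uplink) can be illustrated with a toy greedy heuristic, not the thesis's algorithm: seed each user at its best resource block (RB), then let a user's allocation grow only at the edges of its block.

```python
def greedy_contiguous_allocation(metric, n_rbs, max_rbs_per_user):
    """Toy heuristic respecting exclusivity and contiguity.
    metric[u][r] is a hypothetical scheduling value of RB r for user u.
    Returns a list mapping each RB index to a user (or None)."""
    alloc = [None] * n_rbs                 # RB -> user (exclusive)
    blocks = {}                            # user -> [lo, hi] contiguous span
    # Visit (value, user, rb) triples from best to worst.
    order = sorted(((metric[u][r], u, r)
                    for u in range(len(metric)) for r in range(n_rbs)),
                   reverse=True)
    for _, u, r in order:
        if alloc[r] is not None:
            continue                       # exclusivity: RB already taken
        if u not in blocks:
            blocks[u] = [r, r]             # seed the user's block
            alloc[r] = u
        else:
            lo, hi = blocks[u]
            if hi - lo + 1 >= max_rbs_per_user:
                continue                   # rate/size cap reached
            if r == lo - 1:                # contiguity: grow left edge only
                blocks[u][0] = r
                alloc[r] = u
            elif r == hi + 1:              # ...or right edge only
                blocks[u][1] = r
                alloc[r] = u
    return alloc
```

The optimal binary integer program the thesis describes would search over all contiguous spans jointly; this greedy pass only shows how the two constraints shape the feasible allocations.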

    Contributions to Analysis and Mitigation of Cochannel Interference in Cellular Wireless Networks

    Cellular wireless networks have become a commodity. We use our cellular devices every day to connect to others, to conduct business, and for entertainment. Strong demand for wireless access has made the corresponding parts of the radio spectrum very valuable. Consequently, network operators and their suppliers are constantly being pressured to use it efficiently. Unlike the first- and second-generation cellular networks, current generations therefore do not separate geographical sites in frequency. This universal frequency reuse, combined with the continuously increasing spatial density of transmitters, leads to challenging interference levels in the network. This dissertation collects several contributions to the analysis and mitigation of interference in cellular wireless networks. The contributions are categorized and set in the context of prior art based on key characteristics, and then treated one by one. The first contribution encompasses dynamic signaling that measures instantaneous interference situations and allows only those transmissions that do not harm each other excessively. A novel forward signaling approach is introduced as an alternative to traditional reverse signaling. Forward signaling allows interference management decisions to be made at the receiver, where more relevant information is available. The second contribution analyzes cross-link interference in heterogeneous networks. Cross-link interference is interference between downlink and uplink transmissions that can appear in time-division duplex (TDD) networks. It is shown that uplink reception in small cells can be disturbed considerably by macrocell downlink transmissions. We propose an intuitive solution to the problem based on power control: users in small cells generally have enough power headroom, as the distance to the small base station is often short.
    The third contribution provides an extensive analysis of a specific interference management method that Long-Term Evolution (LTE) applies in cochannel heterogeneous deployments. We analyze this so-called time muting using a modern stochastic geometry approach and show that the performance of the method strongly depends on the residual interference in the muted sections of time. The fourth and last contribution analyzes the impact of interference rank, i.e., the number of spatial streams at the interferer, on a beamformed or spatially block-coded transmission. It is shown that when the interferer chooses to transmit multiple spatial streams, spreading the power in the spatial domain has the potential to decrease the probability of outage at a neighboring receiver, especially if the neighboring transmission uses beamforming.
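The power-headroom argument for small-cell uplink can be made concrete with the standard LTE open-loop fractional power control term, P = min(Pmax, P0 + alpha * PL), with per-RB and closed-loop terms omitted; this is general background, not the dissertation's specific scheme:

```python
def small_cell_ul_power(p_max_dbm, p0_dbm, alpha, pathloss_db):
    """Open-loop fractional uplink power control (simplified):
    P = min(Pmax, P0 + alpha * PL), all quantities in dB/dBm.
    A small-cell user typically sees low path loss, so the open-loop
    term lands well below Pmax, leaving headroom that can be spent to
    ride over macrocell downlink cross-link interference."""
    return min(p_max_dbm, p0_dbm + alpha * pathloss_db)
```

With typical values, a small-cell user at 100 dB path loss transmits near 0 dBm while a macro-edge user at 140 dB is already clipped at the 23 dBm cap, which is the headroom gap the power-control solution exploits.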