311 research outputs found

    Load & Backhaul Aware Decoupled Downlink/Uplink Access in 5G Systems

    Up to the 4th Generation (4G) of cellular 3GPP systems, a user equipment's (UE) cell association has been based on the downlink received power from the strongest base station. Recent work has shown that, with the increasing degree of heterogeneity in emerging 5G systems, such an approach is dramatically suboptimal, and advocates an independent association of the downlink and uplink in which the downlink is served by the macro cell and the uplink by the nearest small cell. In this paper, we advance prior art by explicitly considering the cell load as well as the available backhaul capacity during the association process. We introduce a novel association algorithm and demonstrate its superiority over prior art by means of simulations based on Vodafone's small cell trial network, employing high-resolution pathloss predictions and realistic user distributions. We also study the effect that different power control settings have on the performance of our algorithm. Comment: 6 pages, 6 figures. Submitted to the IEEE International Conference on Communications (ICC 2015).
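
    As a rough illustration of the decoupled association idea, the sketch below picks the DL and UL serving cells independently, filtering candidates on backhaul headroom and discounting them by load. The Cell fields, the rate requirement and the load weighting are illustrative assumptions, not the association algorithm proposed in the paper.

        # Hypothetical sketch of load- and backhaul-aware decoupled DL/UL association.
        # The cell attributes and the load weighting are illustrative assumptions.
        from dataclasses import dataclass

        @dataclass
        class Cell:
            name: str
            dl_rx_power_dbm: float         # DL received power measured by the UE
            ul_pathloss_db: float          # pathloss from the UE to this cell
            load: float                    # fraction of resources already in use, 0..1
            backhaul_headroom_mbps: float  # spare backhaul capacity

        def associate(cells, ue_rate_req_mbps=10.0):
            """Return (dl_cell, ul_cell), chosen independently of each other."""
            # Keep only cells whose backhaul can still carry the UE's demand.
            feasible = [c for c in cells if c.backhaul_headroom_mbps >= ue_rate_req_mbps]
            if not feasible:
                feasible = cells  # fall back to all cells if none has headroom
            # DL: strongest received power, discounted by how loaded the cell is.
            dl = max(feasible, key=lambda c: c.dl_rx_power_dbm + 10 * (1 - c.load))
            # UL: lowest pathloss (typically the nearest small cell), same discount.
            ul = min(feasible, key=lambda c: c.ul_pathloss_db - 10 * (1 - c.load))
            return dl, ul

        cells = [Cell("macro", -70, 110, 0.6, 200.0), Cell("small", -85, 95, 0.2, 40.0)]
        dl, ul = associate(cells)
        print(dl.name, ul.name)  # here: DL from the macro, UL to the small cell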

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks: the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges of network densification, namely energy efficiency, system-level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at wide scale in legacy networks and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies. Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    A Data-Aided Channel Estimation Scheme for Decoupled Systems in Heterogeneous Networks

    Uplink/downlink (UL/DL) decoupling promises more flexible cell association and higher throughput in heterogeneous networks (HetNets); however, it hampers the acquisition of DL channel state information (CSI) in time-division-duplex (TDD) systems, because different base stations (BSs) are connected in the UL and DL. In this paper, we propose a novel data-aided (DA) channel estimation scheme that addresses this problem by utilizing decoded UL data to extract CSI from the received UL data signal in decoupled HetNets where a massive multiple-input multiple-output (MIMO) BS and dense small cell BSs are deployed. We analytically estimate the BER of the decoded UL data, which is then used to derive an approximate normalized mean square error (NMSE) expression for the DA minimum mean square error (MMSE) estimator. Compared with conventional least squares (LS) and MMSE estimation, we show that the NMSE performance of every estimator is determined by its signal-to-noise-ratio (SNR)-like term, and that the SNR-like term of the DA method contains an additional increment depending on the UL data power, the UL data length and the BER, which implies that the DA method outperforms the conventional ones in all scenarios. Higher UL data power, longer UL data blocks and better BER performance lead to more accurate channel estimates with the DA method. Numerical results verify that the analytical BER and NMSE results are close to the simulated ones and that a remarkable gain in both NMSE and DL rate is achieved by the DA method in multiple scenarios with different modulations.
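
    To make the data-aided idea concrete, here is a minimal single-tap sketch in which re-detected UL data symbols are appended to the pilots so that the effective training length grows. It assumes the data symbols are decoded perfectly, whereas the paper's NMSE analysis additionally accounts for residual decoding errors through the BER; all parameter values are arbitrary.

        # Minimal single-tap sketch of data-aided (DA) channel estimation.
        import numpy as np

        rng = np.random.default_rng(0)
        snr_db, n_pilot, n_data = 10.0, 8, 64
        noise_var = 10 ** (-snr_db / 10)

        h = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)      # true channel
        pilots = np.exp(1j * 2 * np.pi * rng.random(n_pilot))                       # unit-power pilots
        data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_data) / np.sqrt(2)  # QPSK data

        x_all = np.concatenate([pilots, data])
        noise = np.sqrt(noise_var / 2) * (rng.standard_normal(x_all.size)
                                          + 1j * rng.standard_normal(x_all.size))
        y = h * x_all + noise

        # Conventional LS and MMSE estimates use only the pilot part.
        y_p = y[:n_pilot]
        h_ls = np.vdot(pilots, y_p) / np.vdot(pilots, pilots)
        h_mmse = np.vdot(pilots, y_p) / (np.vdot(pilots, pilots) + noise_var)

        # DA-MMSE: treat the decoded data symbols as extra pilots, which enlarges
        # the SNR-like term in the denominator (data power times data length).
        h_da = np.vdot(x_all, y) / (np.vdot(x_all, x_all) + noise_var)

        for name, est in [("LS", h_ls), ("MMSE", h_mmse), ("DA-MMSE", h_da)]:
            print(name, "squared error:", abs(h - est) ** 2)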

    User Association in 5G Networks: A Survey and an Outlook

    26 pages; accepted to appear in IEEE Communications Surveys and Tutorials

    Decoupled Uplink and Downlink in a Wireless System with Buffer-Aided Relaying

    The paper treats a multiuser relay scenario in which multiple user equipments (UEs) have two-way communication with a common base station (BS) in the presence of a buffer-equipped relay station (RS). Each uplink (UL) and downlink (DL) transmission can take place over a direct or a relayed path. Traditionally, the UL and DL paths of a given two-way link are coupled, that is, either both are direct links or both are relayed links. Removing the restriction of coupling opens the design space for decoupled two-way links. Following this, we devise two protocols: an orthogonal decoupled UL/DL buffer-aided (ODBA) relaying protocol and a non-orthogonal decoupled UL/DL buffer-aided (NODBA) relaying protocol. In NODBA, the receiver can use successive interference cancellation (SIC) to extract the desired signal from a collision between UL and DL signals. For both protocols, we characterize the transmission decision policies in terms of maximization of the average two-way sum rate of the system. The numerical results show that decoupled association and non-orthogonal radio access lead to significant throughput gains for two-way traffic. Comment: 27 pages, 10 figures, submitted to IEEE Transactions on Communications.
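
    A toy per-slot scheduler can illustrate what a decoupled buffer-aided decision policy looks like: in each slot the link with the best instantaneous rate is activated, with the relay buffers limiting the second hops. This greedy rule and the random link gains are stand-ins for illustration only, not the ODBA/NODBA policies derived in the paper.

        # Toy greedy scheduler for decoupled, buffer-aided two-way relaying.
        import math, random

        def rate(gain, snr=10.0):
            # Shannon rate of a link with normalized fading gain `gain` at a fixed SNR.
            return math.log2(1 + snr * gain)

        random.seed(1)
        buf_ul = buf_dl = 0.0  # bits queued at the relay for UL / DL traffic

        for slot in range(5):
            # Fresh fading gains for the six candidate links in this slot.
            g = {link: random.expovariate(1.0) for link in
                 ("ue_bs", "bs_ue", "ue_rs", "rs_bs", "bs_rs", "rs_ue")}
            options = {
                "UL direct (UE->BS)": rate(g["ue_bs"]),
                "DL direct (BS->UE)": rate(g["bs_ue"]),
                "UL hop 1 (UE->RS)": rate(g["ue_rs"]),
                "UL hop 2 (RS->BS)": min(rate(g["rs_bs"]), buf_ul),  # limited by buffer
                "DL hop 1 (BS->RS)": rate(g["bs_rs"]),
                "DL hop 2 (RS->UE)": min(rate(g["rs_ue"]), buf_dl),  # limited by buffer
            }
            action, r = max(options.items(), key=lambda kv: kv[1])
            if action == "UL hop 1 (UE->RS)":
                buf_ul += r
            elif action == "UL hop 2 (RS->BS)":
                buf_ul -= r
            elif action == "DL hop 1 (BS->RS)":
                buf_dl += r
            elif action == "DL hop 2 (RS->UE)":
                buf_dl -= r
            print(slot, action, round(r, 2))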

    MM-Wave HetNet in 5G and beyond Cellular Networks Reinforcement Learning Method to improve QoS and Exploiting Path Loss Model

    This paper considers high-density heterogeneous networks (HetNets), one of the most promising technologies for the fifth-generation (5G) cellular network. Since 5G will remain in service for a long time, previous-generation networking systems will need customization and updates. We examine the merits and drawbacks of legacy and Q-learning (QL)-based adaptive resource allocation systems, and compare various methods and schemes in order to evaluate solutions for future generations. Microwave macro cells, such as Long-Term Evolution (LTE) eNodeB (eNB) and Multimedia Communications Wireless technology (MC) deployments, are the most likely to be used to provide extra-high capacity. This paper also presents four scenarios for 5G mmWave implementation, including proposed system architectures. The QL algorithm allocates optimal power to the small cell base station (SBS) to satisfy the minimum necessary capacities of the macro cell user equipments (MUEs) and small cell user equipments (SUEs) and thereby provide quality of service (QoS). The challenges of dense HetNets and the massive backhaul traffic they generate are also discussed, and a cluster-based core HetNet design aimed at reducing backhaul traffic is presented. According to our findings, mmWave HetNets and MEC can be useful in a wide range of applications, including ultra-high data rate and low-latency communications in 5G and beyond. We also used the NYUSIM channel model simulator to examine the directional power delay profile, received signal power, path loss and path loss exponent (PLE) for both LOS and NLOS conditions, using 2x2 and 64x16 uniform linear array (ULA) antenna configurations in the 38 GHz and 73 GHz mmWave bands. The simulation results show the performance of several path loss models in the mmWave and sub-6 GHz bands. The path loss of the close-in (CI) model in the mmWave bands is higher than that of the free-space and two-ray path loss models because it accounts for all shadowing and reflection effects between transmitter and receiver. We also compared the suggested method to existing models such as Amiri, Su, Alsobhi, Iqbal and greedy (non-adaptive), and found that it not only enhanced the MUE and SUE minimum capacities and reduced BT complexity, but also established a new minimum QoS threshold. We also discuss future 6G research directions. Finally, our simulation results show that decoupling is more visible when the dual-slope path loss model is employed in a hybrid heterogeneous network, which enhances system performance in terms of coverage and data rate.
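
    For reference, the close-in (CI) free-space reference distance model mentioned above has the standard form PL(f, d) = FSPL(f, 1 m) + 10 n log10(d / 1 m) + X_sigma, where n is the path loss exponent (PLE) and X_sigma is lognormal shadowing. The sketch below evaluates it; the PLE and shadowing values are illustrative placeholders, not the fits reported in the paper.

        # Close-in (CI) free-space reference distance path loss model (d0 = 1 m).
        import math, random

        def ci_path_loss_db(freq_ghz, dist_m, ple, shadow_sigma_db=0.0):
            c = 3e8  # speed of light, m/s
            fspl_1m = 20 * math.log10(4 * math.pi * freq_ghz * 1e9 / c)  # FSPL at 1 m
            shadowing = random.gauss(0.0, shadow_sigma_db)               # shadowing term (dB)
            return fspl_1m + 10 * ple * math.log10(dist_m) + shadowing

        # Example: a 73 GHz link at 100 m with LOS-like and NLOS-like exponents.
        print(round(ci_path_loss_db(73, 100, 2.0), 1), "dB (LOS, PLE = 2.0)")
        print(round(ci_path_loss_db(73, 100, 3.2, shadow_sigma_db=8.0), 1), "dB (NLOS, PLE = 3.2)")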