
    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will soon need to migrate towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads, and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks, i.e., tight coupling between the control and data planes regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges of network densification, namely energy efficiency, system-level capacity maximization, interference management, and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at wide scale in legacy networks and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial and an up-to-date survey on SARC, CoMP, and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Tutorial on LTE/LTE-A Cellular Network Dimensioning Using Iterative Statistical Analysis

    LTE is the fastest-growing cellular technology and is expected to increase its footprint in the coming years, as well as progress toward LTE-A. The race among operators to deliver the expected quality of experience to their users is tight and demands sophisticated network-planning skills. Radio network dimensioning (RND) is an essential step in the network planning process and has been used as a fast, but indicative, approximation of the radio site count. RND is a prerequisite to the lengthy process of thorough planning. Moreover, results from RND are used by players in the industry to estimate the preplanning costs of deploying and running a network; thus, RND is also a key tool in cellular business modelling. In this work, we present a tutorial on radio network dimensioning, focused on LTE/LTE-A, using an iterative approach to find a balanced design that mediates among the three design requirements: coverage, capacity, and quality. This approach uses a statistical link budget analysis methodology that jointly accounts for small- and large-scale fading in the channel, as well as loading due to traffic demand, in the interference calculation. A complete RND manual is thus presented, which is of key importance to operators deploying or upgrading LTE/LTE-A networks for two reasons. First, it is purely analytical and hence enables fast results, a prime factor in the race described above. Second, it captures the essential variables affecting network dimensions and manages conflicting targets to ensure user quality of experience, another major criterion in the competition. The described approach is compared to traditional RND using a commercial LTE network planning tool. The outcome further dismisses traditional RND for LTE due to an unjustified increase in the number of radio sites and related cost, and motivates further research into more effective and novel RND procedures.
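    The iterative coupling between loading and the link budget that this abstract describes can be illustrated with a minimal sketch. This is not the authors' actual methodology; all parameter values (transmit power, sensitivity, margins, path-loss model, site capacity) are hypothetical placeholders, and the loop simply iterates until the loading assumed in the interference margin agrees with the loading implied by the resulting site count.

```python
import math

# Illustrative radio network dimensioning (RND) sketch.
# All numeric constants are hypothetical placeholders, not taken
# from the surveyed tutorial.

def max_path_loss_db(tx_power_dbm=46, rx_sens_dbm=-120, margins_db=12):
    """Link budget: maximum allowed path loss in dB."""
    return tx_power_dbm - rx_sens_dbm - margins_db

def cell_radius_km(mapl_db, pl0_db=128.1, slope=37.6):
    """Invert a simple log-distance model: PL = pl0 + slope*log10(d_km)."""
    return 10 ** ((mapl_db - pl0_db) / slope)

def dimension(area_km2, traffic_mbps, site_capacity_mbps=150.0,
              interference_margin_per_load_db=6.0, tol=1e-3):
    """Iterate until the loading assumed in the link budget matches
    the loading implied by the resulting site count."""
    load = 0.5  # initial guess of network loading
    sites = 0
    for _ in range(100):
        # Interference margin grows with assumed loading.
        mapl = max_path_loss_db(
            margins_db=12 + interference_margin_per_load_db * load)
        r = cell_radius_km(mapl)
        site_area = 1.95 * r * r  # hexagonal cell area approximation
        coverage_sites = math.ceil(area_km2 / site_area)
        capacity_sites = math.ceil(traffic_mbps / site_capacity_mbps)
        sites = max(coverage_sites, capacity_sites)
        new_load = min(1.0, traffic_mbps / (sites * site_capacity_mbps))
        if abs(new_load - load) < tol:
            break
        load = new_load
    return sites, load
```

    The fixed point this loop seeks is what makes the analytical approach fast: no map-based simulation is needed, yet traffic loading still feeds back into the coverage calculation through the interference margin.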

    Relaying in the Internet of Things (IoT): A Survey

    The deployment of relays between Internet of Things (IoT) end devices and gateways can improve link quality. In cellular-based IoT, relays have the potential to reduce base station overload. The energy expended in single-hop long-range communication can be reduced if relays listen to the transmissions of end devices and forward these observations to gateways. However, incorporating relays into IoT networks faces several challenges. IoT end devices are designed primarily for uplink communication of small observations toward the network; hence, opportunistically using end devices as relays requires a redesign of the medium access control (MAC) layer protocol of such devices and possibly the addition of new communication interfaces. Additionally, the wake-up times of IoT end devices need to be synchronized with those of the relays. For cellular-based IoT, infrastructure relays can be used, while noncellular IoT networks can leverage the presence of mobile devices for relaying, for example in remote healthcare. However, the latter raises the problems of incentivizing relay participation and managing relay mobility. Furthermore, although relays can increase the lifetime of IoT networks, deploying them implies the need for additional batteries to power them, which can erode the energy efficiency gain that relays offer. Therefore, designing relay-assisted IoT networks that provide acceptable trade-offs is key, and this goes beyond adding an extra transmit RF chain to a relay-enabled IoT end device. There has been increasing research interest in IoT relaying, as demonstrated in the available literature. Works that consider these issues are surveyed in this paper to provide insight into the state of the art, offer design insights for network designers, and motivate future research directions.
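    The energy trade-off the abstract points to (relaying saves transmit energy over distance but adds relay circuit energy) can be made concrete with a standard first-order radio energy model, energy per bit = E_circuit + E_amp * d**alpha. This is a generic textbook-style sketch, not a model from the surveyed works, and all constants are illustrative placeholders.

```python
# Hypothetical energy comparison: single-hop vs. relayed (two-hop)
# transmission for an IoT end device. Constants are placeholders.

def energy_per_bit(d_m, e_circuit=50e-9, e_amp=100e-12, alpha=3.5):
    """Transmit energy per bit (joules) over distance d_m,
    using a first-order path-loss model with exponent alpha."""
    return e_circuit + e_amp * d_m ** alpha

def single_hop(d_m):
    """End device transmits directly to the gateway."""
    return energy_per_bit(d_m)

def two_hop(d_m, relay_rx_e=50e-9):
    """End device transmits half the distance; the relay spends
    circuit energy to receive, then retransmits the other half."""
    hop = d_m / 2
    return energy_per_bit(hop) + relay_rx_e + energy_per_bit(hop)
```

    Because the d**alpha term dominates at long range while circuit energy dominates at short range, relaying wins only beyond a crossover distance; below it, the extra receive-and-retransmit circuit energy erodes the gain, which is exactly the trade-off the survey highlights.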