3,109 research outputs found

    Dynamic Time-domain Duplexing for Self-backhauled Millimeter Wave Cellular Networks

    Full text link
    Millimeter wave (mmW) bands between 30 and 300 GHz have attracted considerable attention for next-generation cellular networks due to the vast quantities of available spectrum and the possibility of very high-dimensional antenna arrays. However, a key issue in these systems is range: mmW signals are extremely vulnerable to shadowing and suffer from poor high-frequency propagation. Multi-hop relaying is therefore a natural technology for such systems to improve cell range and cell-edge rates without the addition of wired access points. This paper studies the problem of scheduling for a simple infrastructure cellular relay system where communication between wired base stations and user equipment follows a hierarchical tree structure through fixed relay nodes. Such a system builds naturally on existing cellular mmW backhaul by adding mmW in the access links. A key feature of the proposed system is that TDD duplexing selections can be made on a link-by-link basis due to directional isolation from other links. We devise an efficient, greedy algorithm for centralized scheduling that maximizes network utility by jointly optimizing the duplexing schedule and resource allocation for dense, relay-enhanced OFDMA/TDD mmW networks. The proposed algorithm can dynamically adapt to loading, channel conditions, and traffic demands. Significant throughput gains and improved resource utilization offered by our algorithm over static, globally-synchronized TDD patterns are demonstrated through simulations based on empirically-derived channel models at 28 GHz.
    Comment: IEEE Workshop on Next Generation Backhaul/Fronthaul Networks - BackNets 201
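
    The abstract names a greedy utility-maximizing scheduler without specifying it; the minimal Python sketch below illustrates the general idea of greedily granting each resource block to the (link, slot) pair with the largest marginal proportional-fair gain. All link names, rates, and the log-utility choice are assumptions for illustration, and the per-node half-duplex constraints a real mmW scheduler must enforce are omitted.

```python
import math

# Toy greedy scheduler (hypothetical): each directional link has an
# estimated rate per resource block in each TDD slot direction; every
# block goes to the (link, slot) pair with the largest marginal gain in
# sum log-throughput (proportional fairness).

links = {
    "BS->RN1":  {"DL": 400.0, "UL": 380.0},   # Mb/s per block (assumed)
    "RN1->UE1": {"DL": 150.0, "UL": 140.0},
    "RN1->UE2": {"DL":  90.0, "UL":  95.0},
}

EPS = 1e-9
throughput = {link: EPS for link in links}    # accumulated rate per link

def greedy_schedule(num_blocks):
    schedule = []
    for block in range(num_blocks):
        best = None
        for link, rates in links.items():
            for slot, rate in rates.items():
                # Marginal proportional-fair utility of granting this block.
                gain = math.log(throughput[link] + rate) - math.log(throughput[link])
                if best is None or gain > best[0]:
                    best = (gain, link, slot)
        _, link, slot = best
        throughput[link] += links[link][slot]
        schedule.append((block, link, slot))
    return schedule

for entry in greedy_schedule(6):
    print(entry)
```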

    Measurement-Adaptive Cellular Random Access Protocols

    Get PDF
    This work considers a single-cell random access channel (RACH) in cellular wireless networks. Communications over the RACH take place when users try to connect to a base station during a handover or when establishing a new connection. Within the framework of Self-Organizing Networks (SON), the system should self-adapt to dynamically changing environments (channel fading, mobility, etc.) without human intervention. To improve the performance of the RACH procedure, we aim here at maximizing throughput or, alternatively, minimizing the user dropping rate. In the context of SON, we propose protocols which exploit information from measurements and user reports in order to estimate current values of the system unknowns and broadcast global action-related values to all users. The protocols suggest an optimal pair of user actions (transmission power and back-off probability) found by minimizing the drift of a certain function. Numerical results illustrate considerable reductions in the dropping rate, at very low or even zero cost in power expenditure and delay, as well as the fast adaptability of the protocols to environment changes. Although the proposed protocol is designed primarily to minimize the number of discarded users per cell, our framework allows for other variations (power or delay minimization) as well.
    Comment: 31 pages, 13 figures, 3 tables. Springer Wireless Networks 201
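
    The drift-minimizing action selection is only summarized above; as a rough illustration, the sketch below picks the (power, back-off probability) pair minimizing the expected one-slot backlog drift of a slotted random access channel, given an estimated number of backlogged users. The capture model, the constants, and the power penalty are assumptions, not the authors' protocol.

```python
import itertools

POWERS = [1.0, 2.0, 4.0]                  # candidate power levels (assumed)
BACKOFF_PROBS = [i / 20 for i in range(1, 20)]
ARRIVAL_RATE = 0.3                        # new backlogged users per slot (assumed)
POWER_WEIGHT = 0.01                       # power-cost trade-off (assumed)

def capture_prob(power):
    # Toy capture model: a lone transmission survives fading with a
    # probability growing in its power; real values come from reports.
    return power / (power + 1.0)

def expected_drift(n_hat, power, p):
    # Backlog drift = expected arrivals minus expected departures when
    # each of n_hat backlogged users transmits with probability p.
    success = n_hat * p * (1 - p) ** (n_hat - 1) * capture_prob(power)
    return ARRIVAL_RATE - success

def best_action(n_hat):
    # Broadcast the (power, back-off probability) pair minimizing drift
    # plus a small power penalty.
    return min(
        itertools.product(POWERS, BACKOFF_PROBS),
        key=lambda a: expected_drift(n_hat, *a) + POWER_WEIGHT * a[0],
    )

power, p = best_action(n_hat=12)
print(f"broadcast action: power={power}, back-off probability={p:.2f}")
```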

    Goodbye, ALOHA!

    Get PDF
    The vision of the Internet of Things (IoT) to interconnect and Internet-connect everyday people, objects, and machines poses new challenges in the design of wireless communication networks. The design of medium access control (MAC) protocols has traditionally been an intense area of research due to their high impact on the overall performance of wireless communications. The majority of research activities in this field deal with variations of protocols based in some way on ALOHA, either with or without listen-before-talk, i.e., carrier-sense multiple access. These protocols operate well under low traffic loads and with a low number of simultaneous devices, but they suffer from congestion as the traffic load and the number of devices increase. For this reason, unless revisited, the MAC layer can become a bottleneck for the success of the IoT. In this paper, we provide an overview of the existing MAC solutions for the IoT, describing current limitations and envisioned challenges for the near future. Motivated by these, we identify a family of simple algorithms based on distributed queueing (DQ), which can operate for an infinite number of devices generating any traffic load and pattern. A description of the DQ mechanism is provided and the most relevant existing studies of DQ in different scenarios are described. In addition, we provide a novel performance evaluation of DQ when applied to the IoT. Finally, a description of the very first demo of DQ for its use in the IoT is also included.
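
    To make the DQ mechanism concrete, here is a minimal simulation of its contention-tree core: collided devices are split into groups that re-contend later via a collision-resolution queue, while devices that pick a mini-slot alone enter the data-transmission queue. Frame structure, feedback signaling, and the data phase are omitted; the slot count and grouping policy are assumptions.

```python
import random
from collections import deque

M = 3  # contention mini-slots per frame (assumed)

def dq_resolve(num_devices, rng=random):
    # CRQ holds groups of collided devices that will re-contend, one
    # group per frame; DTQ collects devices resolved without collision.
    crq = deque([list(range(num_devices))])
    dtq = []
    frames = 0
    while crq:
        group = crq.popleft()
        frames += 1
        slots = {}
        for dev in group:
            slots.setdefault(rng.randrange(M), []).append(dev)
        for slot in sorted(slots):
            if len(slots[slot]) == 1:
                dtq.extend(slots[slot])   # success: device enters the DTQ
            else:
                crq.append(slots[slot])   # collision: group re-contends later
    return dtq, frames

order, frames = dq_resolve(num_devices=20)
print(f"resolved {len(order)} devices in {frames} frames")
```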

    Resource Allocation for Network-Integrated Device-to-Device Communications Using Smart Relays

    Full text link
    With an increasing number of autonomous heterogeneous devices in future mobile networks, an efficient resource allocation scheme is required to maximize network throughput and achieve higher spectral efficiency. In this paper, the performance of network-integrated device-to-device (D2D) communication is investigated, where D2D traffic is carried through relay nodes. An optimization problem is formulated for allocating radio resources to maximize the end-to-end rate while satisfying the QoS requirements of cellular and D2D user equipment under a total power constraint. Numerical results show that there is a distance threshold beyond which relay-assisted D2D communication significantly improves network performance compared to direct communication between D2D peers.
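
    The distance-threshold behavior can be illustrated with a toy Shannon-rate comparison: a half-duplex two-hop relay path overtakes the direct D2D link once the direct link becomes long enough to be SNR-limited. All constants below are assumed for illustration and are not taken from the paper.

```python
import math

BW = 10e6            # bandwidth in Hz (assumed)
NOISE = 1e-12        # noise power in W (assumed)
TX_POWER = 0.01      # transmit power in W (assumed)
PATHLOSS_EXP = 3.5   # path-loss exponent (assumed)

def rate(distance):
    # Shannon rate over a link of the given length (meters).
    snr = TX_POWER * distance ** -PATHLOSS_EXP / NOISE
    return BW * math.log2(1 + snr)

def relay_rate(distance):
    # Relay placed midway; half-duplex relaying halves the bottleneck rate.
    return 0.5 * rate(distance / 2)

for d in range(50, 501, 50):
    winner = "relay" if relay_rate(d) > rate(d) else "direct"
    print(f"d={d:3d} m  direct={rate(d)/1e6:6.1f} Mb/s  "
          f"relay={relay_rate(d)/1e6:6.1f} Mb/s  -> {winner}")
```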

    Joint Dynamic Radio Resource Allocation and Mobility Load Balancing in 3GPP LTE Multi-Cell Network

    Get PDF
    Load imbalance and inefficient utilization of system resources are major factors responsible for poor overall performance in Long Term Evolution (LTE) networks. In this paper, a novel scheme for joint dynamic resource allocation and load balancing is proposed to achieve a balanced performance improvement in 3rd Generation Partnership Project (3GPP) LTE Self-Organizing Networks (SON). The new method, which aims at maximizing network resource efficiency subject to inter-cell interference and intra-cell resource constraints, is implemented in two steps. In the first step, efficient resource allocation, including user scheduling and power assignment, is conducted in a distributed manner to serve as many users in the whole network as possible. In the second step, based on the resulting resource allocation, the optimization objective, namely network resource efficiency, is calculated, and load balancing is implemented by handing over the user whose switch maximizes the objective function. The method of Lagrange multipliers and a heuristic algorithm are used to solve the formulated optimization problem. Simulation results show that our algorithm achieves better performance in terms of user throughput, fairness, load balancing index, and number of unsatisfied users compared with traditional approaches that treat resource allocation and load balancing separately.
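
    The second step (mobility load balancing) can be sketched as follows: evaluate a network resource-efficiency objective for each candidate single-user handover and perform the one that increases it most. The toy objective and the numbers below are assumptions; the paper's actual objective and constraints are not reproduced here.

```python
CELLS = ("A", "B")

users = [
    # (user, serving cell, rate if served by A, rate if served by B) in Mb/s
    ("u1", "A", 12.0, 9.0),
    ("u2", "A", 5.0, 8.5),
    ("u3", "B", 4.0, 11.0),
]

def efficiency(assignment):
    # Toy objective: total served rate, discounted by load imbalance,
    # where a user's load is the airtime fraction 1/rate it consumes.
    loads = {c: 0.0 for c in CELLS}
    total = 0.0
    for _user, cell, rate_a, rate_b in assignment:
        rate = rate_a if cell == "A" else rate_b
        total += rate
        loads[cell] += 1.0 / rate
    return total - 50.0 * abs(loads["A"] - loads["B"])

def best_handover(assignment):
    # Try every single-user switch and keep the most beneficial one.
    best_score, best_user = efficiency(assignment), None
    for i, (user, cell, rate_a, rate_b) in enumerate(assignment):
        other = "B" if cell == "A" else "A"
        trial = assignment[:i] + [(user, other, rate_a, rate_b)] + assignment[i + 1:]
        score = efficiency(trial)
        if score > best_score:
            best_score, best_user = score, user
    return best_user, best_score

user, score = best_handover(users)
print(f"hand over: {user} (objective {score:.1f})")
```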

    Ruin Theory for Dynamic Spectrum Allocation in LTE-U Networks

    Full text link
    LTE in the unlicensed band (LTE-U) is a promising solution to overcome the scarcity of wireless spectrum. However, to reap the benefits of LTE-U, it is essential to maintain effective coexistence with WiFi systems. Such coexistence, hence, constitutes a major challenge for LTE-U deployment. In this paper, the problem of unlicensed spectrum sharing between WiFi and LTE-U systems is studied. In particular, a fair time-sharing model based on ruin theory is proposed to share redundant spectral resources from the unlicensed band with LTE-U without jeopardizing the performance of the WiFi system. Fairness between WiFi and LTE-U is maintained by applying the concept of the probability of ruin. In particular, the probability of ruin is used to perform efficient duty-cycle allocation in LTE-U, so as to provide fairness to the WiFi system and maintain a certain level of WiFi performance. Simulation results show that the proposed ruin-based algorithm provides better fairness to the WiFi system compared to equal duty-cycle sharing between WiFi and LTE-U.
    Comment: Accepted in IEEE Communications Letters (09 Dec 2018)
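
    As a rough illustration of the ruin-theory mapping, the sketch below treats WiFi's spare airtime as the surplus of a classical Cramér-Lundberg process and selects the largest LTE-U duty cycle whose implied probability of ruin (WiFi starvation) stays below a target. The airtime-as-surplus mapping and all constants are assumptions, not the paper's model.

```python
import math

LAMBDA = 0.8       # WiFi burst arrival rate per unit time (assumed)
MEAN_BURST = 0.5   # mean airtime per burst, exponential (assumed)
SURPLUS0 = 2.0     # initial spare-airtime credit (assumed)
TARGET = 0.05      # acceptable probability of WiFi "ruin" (assumed)

def ruin_probability(duty_cycle):
    # Premium rate c: airtime flowing to WiFi per unit time once LTE-U
    # occupies duty_cycle of the channel.
    c = 1.0 - duty_cycle
    load = LAMBDA * MEAN_BURST / c
    if load >= 1.0:
        return 1.0  # negative net drift: ruin is certain
    # Classical Cramer-Lundberg result for exponential claim sizes:
    #   psi(u) = load * exp(-(1/mu - lambda/c) * u)
    return load * math.exp(-(1.0 / MEAN_BURST - LAMBDA / c) * SURPLUS0)

def max_fair_duty_cycle(step=0.01):
    # Ruin probability grows with the duty cycle, so scan upward and keep
    # the largest value still meeting the target.
    best, d = 0.0, 0.0
    while d < 1.0:
        if ruin_probability(d) <= TARGET:
            best = d
        d += step
    return best

print(f"largest fair LTE-U duty cycle: {max_fair_duty_cycle():.2f}")
```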