
    Code Design Principles for Ultra-Reliable Random Access with Preassigned Patterns

    We study medium access control layer random access under the assumption that the receiver can perform successive interference cancellation, without feedback. In recent years, a number of protocols with impressive error performance have been suggested for this channel model. However, the random nature of these protocols causes an error floor which limits their usability when targeting ultra-reliable communications. In very recent works by Paolini et al. and Boyd et al., it was shown that if each user employs predetermined combinatorial access patterns, this error floor disappears. In this paper, we develop code design criteria for deterministic random access protocols in the ultra-reliability region, and build codes based on these principles. The suggested design methods are supported by simulations. Peer reviewed.
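    As an illustrative sketch of the mechanism this line of work builds on (not the authors' code), the toy peeling decoder below shows how preassigned access patterns combine with successive interference cancellation (SIC): any slot containing a single transmission is decoded, that user's replicas are cancelled from its other slots, and the process repeats. The patterns and slot count are hypothetical.

```python
# Toy sketch of SIC "peeling" decoding for slotted random access,
# where each user transmits replicas of its packet in preassigned slots.

def sic_decode(patterns, n_slots):
    """patterns: one tuple of slot indices per active user."""
    # Build slot occupancy: which users hit each slot.
    slots = [set() for _ in range(n_slots)]
    for user, pat in enumerate(patterns):
        for s in pat:
            slots[s].add(user)
    decoded = set()
    progress = True
    while progress:
        progress = False
        for s in range(n_slots):
            if len(slots[s]) == 1:          # singleton slot: decodable
                user = next(iter(slots[s]))
                decoded.add(user)
                # SIC: cancel this user's replicas from all its slots.
                for t in patterns[user]:
                    slots[t].discard(user)
                progress = True
    return decoded

# Deterministic (preassigned) patterns chosen so every user resolves:
patterns = [(0, 2), (0, 3), (1, 2)]
print(sorted(sic_decode(patterns, n_slots=4)))   # [0, 1, 2]
```

    With random instead of preassigned patterns, two users can pick identical patterns and collide in every slot, which is exactly the error-floor event that deterministic pattern design rules out.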

    Enabling Ultra-Reliable and Low-Latency Communications through Unlicensed Spectrum

    © 2018 IEEE. In this article, we aim to address the question of how to exploit the unlicensed spectrum to achieve URLLC. Potential URLLC PHY mechanisms are reviewed and then compared via simulations to demonstrate their potential benefits to URLLC. Although a number of important PHY techniques help with URLLC, the PHY layer exhibits an intrinsic trade-off between latency and reliability, posed by limited and unstable wireless channels. We then explore MAC mechanisms and discuss multi-channel strategies for achieving low-latency LTE unlicensed band access. We demonstrate, via simulations, that the periods without access to the unlicensed band can be substantially reduced by maintaining channel access processes on multiple unlicensed channels, choosing the channels intelligently, and implementing RTS/CTS.
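    A minimal sketch of the multi-channel idea, under assumed parameters: run an independent listen-before-talk backoff on each unlicensed channel and transmit on whichever clears first. The busy probability and backoff window below are hypothetical, and the model is far simpler than the actual LTE unlicensed-band (LAA) channel-access procedure.

```python
import random

# Toy model: a backoff counter decrements only in slots where the
# channel is sensed idle; access is granted when it reaches zero.

def time_to_access(busy_prob, backoff_slots, rng):
    """Slots elapsed until one channel's backoff reaches zero."""
    counter = backoff_slots
    t = 0
    while counter > 0:
        t += 1
        if rng.random() >= busy_prob:   # channel sensed idle this slot
            counter -= 1
    return t

def multi_channel_access(n_channels, busy_prob, backoff_slots, seed=0):
    rng = random.Random(seed)
    waits = [time_to_access(busy_prob, backoff_slots, rng)
             for _ in range(n_channels)]
    return min(waits)   # transmit on the earliest-clearing channel

# More channels tend to shorten the wait (hypothetical parameters):
print(multi_channel_access(1, busy_prob=0.6, backoff_slots=8))
print(multi_channel_access(4, busy_prob=0.6, backoff_slots=8))
```

    Taking the minimum over independent backoff processes is what shrinks the "no access" periods the abstract refers to; intelligent channel selection corresponds to weighting that choice by per-channel busy statistics.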

    Millimetre wave frequency band as a candidate spectrum for 5G network architecture : a survey

    In order to meet the huge growth in global mobile data traffic in 2020 and beyond, the development of the 5th Generation (5G) system is required, as the current 4G system is expected to fall short of the provision needed for such growth. 5G is anticipated to use a higher carrier frequency in the millimetre wave (mm-wave) band, within the 20 to 90 GHz range, due to the availability of a vast amount of unexploited bandwidth. It is a revolutionary step to use these bands because of their different propagation characteristics, severe atmospheric attenuation, and hardware constraints. In this paper, we carry out a survey of 5G research contributions and proposed design architectures based on mm-wave communications. We present and discuss the use of mm-wave as indoor and outdoor mobile access, as a wireless backhaul solution, and as a key enabler for higher order sectorisation. Wireless standards such as IEEE 802.11ad, which operate in the mm-wave band, have been presented. These standards have been designed for short range, ultra high data throughput systems in the 60 GHz band. Furthermore, this survey provides new insights regarding relevant and open issues in adopting mm-wave for 5G networks. This includes increased handoff rate and interference in Ultra-Dense Networks (UDN), waveform consideration with higher spectral efficiency, and supporting spatial multiplexing in mm-wave line of sight. This survey also introduces a distributed base station architecture in mm-wave as an approach to address increased handoff rate in UDN, and to provide an alternative way for network densification in a time- and cost-effective manner.
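    To make the propagation penalty concrete, a back-of-the-envelope free-space path-loss comparison (our illustration; the link distance is hypothetical) shows the mm-wave deficit before atmospheric absorption is even counted, using FSPL(dB) = 20·log10(4πdf/c):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d = 100.0  # metres (hypothetical link length)
loss_2g4 = fspl_db(d, 2.4e9)   # sub-6 GHz reference
loss_60g = fspl_db(d, 60e9)    # IEEE 802.11ad band
print(f"2.4 GHz: {loss_2g4:.1f} dB, 60 GHz: {loss_60g:.1f} dB, "
      f"gap: {loss_60g - loss_2g4:.1f} dB")   # gap ≈ 28 dB
```

    The roughly 28 dB gap (20·log10(60/2.4)) is independent of distance, which is why mm-wave designs lean on large antenna arrays and beamforming gain, and why the 60 GHz standards mentioned above target short-range links.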

    Optimal Transmit Beamforming for Integrated Sensing and Communication

    This paper studies transmit beamforming in a downlink integrated sensing and communication (ISAC) system, where a base station (BS) equipped with a uniform linear array (ULA) sends combined information-bearing and dedicated radar signals to simultaneously perform downlink multiuser communication and radar target sensing. Under this setup, we maximize the radar sensing performance (in terms of minimizing the beampattern matching errors or maximizing the minimum weighted beampattern gains), subject to the communication users' minimum signal-to-interference-plus-noise ratio (SINR) requirements and the BS's transmit power constraints. In particular, we consider two types of communication receivers, namely Type-I and Type-II receivers, which do not have and do have the capability of cancelling the interference from the a priori known dedicated radar signals, respectively. Under both Type-I and Type-II receivers, the beampattern matching and minimum weighted beampattern gain maximization problems are solved globally optimally by applying the semidefinite relaxation (SDR) technique, together with a rigorous proof of the tightness of SDR for both receiver types under the two design criteria. It is shown that at the optimum, radar signals are not required with Type-I receivers under some specific conditions, while radar signals are always needed to enhance the performance with Type-II receivers. Numerical results show that the minimum weighted beampattern gain maximization leads to significantly higher beampattern gains at the worst-case sensing angles, with a much lower computational complexity than the beampattern matching design. We show that by exploiting the capability of cancelling the interference caused by the radar signals, the case with Type-II receivers results in better sensing performance than that with Type-I receivers and other conventional designs. Comment: submitted for possible journal publication.
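    As a sketch of the quantity being optimized (our illustration, not the paper's SDR solver): for a half-wavelength-spaced ULA, the transmit beampattern gain at angle θ is a(θ)ᴴ R a(θ), where a(θ) is the steering vector and R the transmit covariance. A rank-one R steered at the target angle attains gain N at that angle; the array size and target angle below are hypothetical.

```python
import numpy as np

def steering(theta_rad, n_antennas):
    """Steering vector of a ULA with half-wavelength spacing."""
    k = np.arange(n_antennas)
    return np.exp(1j * np.pi * k * np.sin(theta_rad))

def beampattern_gain(R, theta_rad):
    """Transmit beampattern gain a(theta)^H R a(theta) (real-valued)."""
    a = steering(theta_rad, R.shape[0])
    return float(np.real(a.conj() @ R @ a))

n = 8
theta0 = np.deg2rad(20.0)            # hypothetical sensing direction
a0 = steering(theta0, n)
R = np.outer(a0, a0.conj()) / n      # unit-trace, rank-one beam at theta0
print(beampattern_gain(R, theta0))   # gain ≈ n at the steered angle
```

    The paper's designs instead pick R (split between information and radar covariances) to shape this gain across all angles, with the per-user SINR and power constraints making the problem a semidefinite program after relaxation.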