
    Designing Wireless Networks for Delay-Sensitive Internet of Things

    Internet of Things (IoT) applications impose stringent requirements on wireless network delay, yet they must share and compete for limited bandwidth with other wireless traffic. Traditional schemes adopt various QoS-aware traffic scheduling techniques, but fail when the amount of network traffic increases further. In addition, the CSMA with collision avoidance (CSMA/CA) mechanism enables the coexistence of multiple wireless links but avoids concurrent transmissions, imposing severe channel access delay on delay-sensitive traffic when the channel is busy. To address these limitations, we present two novel designs of a wireless side channel, which operates concurrently with the existing wireless network channel without occupying extra spectrum but is dedicated to real-time traffic. Our key insight for realizing such a side channel is to exploit the excess SNR margin in the wireless network by encoding data as patterned interference. First, we design such patterned interference in the form of energy erasure over specific subcarriers in OFDM systems. Delay-sensitive messages can be delivered simultaneously along with other traffic from the same transmitter, which reduces the network queuing delay. Furthermore, we propose EasyPass, another side channel design that encodes data with the same OFDM scheme as the main channel, but using lower power and narrower frequency bands. By adapting the side channel's transmit power to stay within the main channel's SNR margin, the simultaneous main channel transmission suffers little degradation. EasyPass reduces the channel access delay by providing extra transmission opportunities when the channel is occupied by other links. Finally, we present a novel modulation design that transforms the choices of link rate adaptation from discrete to continuous. With minimal extra overhead, it improves the network throughput and thereby reduces the network delay.
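    As a rough illustration of the energy-erasure idea above, the sketch below builds an OFDM symbol, encodes one side-channel bit by nulling a fixed set of subcarriers, and detects that bit from the per-subcarrier energy at the receiver. The subcarrier count, the erased set, and the threshold are hypothetical, and the noiseless round trip is only a sanity check; this is not the paper's implementation.

```python
import numpy as np

# Hypothetical parameters; not the paper's actual configuration.
N_SC = 64                      # OFDM subcarriers
SIDE_SC = np.arange(8, 16)     # subcarriers reserved for side-channel erasure signalling
rng = np.random.default_rng(0)

def ofdm_symbol_with_side_bit(side_bit):
    """Main-channel QPSK OFDM symbol; if side_bit == 1, erase (null) SIDE_SC."""
    bits = rng.integers(0, 2, size=(N_SC, 2))
    qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
    if side_bit:
        qpsk[SIDE_SC] = 0          # patterned interference = energy erasure
    return np.fft.ifft(qpsk)       # time-domain OFDM symbol

def detect_side_bit(rx_symbol, threshold=0.5):
    """Detect the side-channel bit from per-subcarrier energy on SIDE_SC."""
    spectrum = np.fft.fft(rx_symbol)
    erased_energy = np.mean(np.abs(spectrum[SIDE_SC]) ** 2)
    other_energy = np.mean(np.abs(np.delete(spectrum, SIDE_SC)) ** 2)
    return int(erased_energy < threshold * other_energy)

# Noiseless round trip of one side-channel bit riding on a data symbol.
tx = ofdm_symbol_with_side_bit(1)
assert detect_side_bit(tx) == 1
```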

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    The next wave of wireless technologies is proliferating to connect things to one another as well as to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from this data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) in current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies, such as artificial intelligence, non-terrestrial networks, and new spectra, is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out.

    Dynamic Resource Management Applied to Cellular Networks with Interference (Gestion dynamique de ressources appliquée aux réseaux cellulaires avec interférence)

    The aim of this thesis is to design, implement, and evaluate practical cross-layer algorithms. We focus on LTE and post-LTE uncoordinated networks, where interference is a key issue given the new traffic patterns. The goal is to allocate the radio resources in an efficient way. We develop mathematical and computational interference models that allow us to understand the behavior of such networks, and we apply an information-theoretic approach to different interference scenarios and traffic characteristics. We have tried to remain as close as possible to practical systems in order to test the feasibility of the proposed techniques. The thesis deals with performance evaluation of interference scenarios in 4G networks, in particular those arising from small-cell deployments. The work in this thesis also deals with the analysis of resource allocation and incremental-redundancy-based hybrid automatic repeat request (HARQ) for bursty interference (or, more generally, time-varying channels), which allows for only partial channel state information at the transmitter. The work is then applied to practical scheduler design for LTE base stations and includes performance analysis for real LTE modems.
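    The incremental-redundancy HARQ analysis mentioned above can be illustrated with a toy mutual-information accumulation model: each retransmission adds capacity until the message is covered, and bursty interference occasionally degrades a round. The message size, block length, SNR, and interference probability below are placeholders, not the thesis's channel model or scheduler.

```python
import numpy as np

# Toy incremental-redundancy HARQ under a mutual-information accumulation model.
rng = np.random.default_rng(1)

def ir_harq_rounds(msg_bits=1024, block_len=256, snr_db=5.0,
                   interference_prob=0.3, max_rounds=8):
    """Rounds until the accumulated mutual information covers msg_bits."""
    accumulated_bits = 0.0
    for rnd in range(1, max_rounds + 1):
        snr = 10 ** (snr_db / 10)
        if rng.random() < interference_prob:   # bursty interference hits this round
            snr /= 10
        # Each redundancy increment adds block_len channel uses of capacity.
        accumulated_bits += block_len * np.log2(1 + snr)
        if accumulated_bits >= msg_bits:
            return rnd
    return max_rounds

rounds = [ir_harq_rounds() for _ in range(1000)]
print("average HARQ rounds:", np.mean(rounds))
```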

    How Can Optical Communications Shape the Future of Deep Space Communications? A Survey

    With a large number of deep space (DS) missions anticipated by the end of this decade, reliable and high-capacity DS communications systems are needed more than ever. Nevertheless, existing DS communications technologies are far from meeting such a goal. Improving current DS communications systems requires not only system engineering leadership but also, crucially, an investigation of potential emerging technologies that overcome the unique challenges of ultra-long DS communications links. To the best of our knowledge, there has been no comprehensive survey of DS communications technologies over the last decade. Free space optical (FSO) technology is an emerging DS technology, proven to require lower communications system size, weight, and power (SWaP) and to achieve much higher capacity than its radio frequency (RF) counterpart, the currently used DS technology. In this survey, we discuss the pros and cons of deep space optical communications (DSOC). Furthermore, we review the modulation, coding, detection, receiver, and protocol schemes and technologies for DSOC. We provide, for the first time, thoughtful discussions about implementing orbital angular momentum (OAM) and quantum communications (QC) for DS. We elaborate on how these technologies, among other advances in the field including interplanetary networking and RF/FSO systems, improve reliability, capacity, and security, and we address related implementation challenges and potential solutions. This paper provides a holistic survey of DSOC technologies, gathering more than 200 fragmented literature sources and including novel perspectives, aiming to set the stage for further developments in the field.

    Non-Orthogonal Signal and System Design for Wireless Communications

    The thesis presents research in non-orthogonal multi-carrier signals, in which: (i) a new signal format termed truncated orthogonal frequency division multiplexing (TOFDM) is proposed to improve data rates in wireless communication systems, such as those used in mobile/cellular systems and wireless local area networks (LANs), and (ii) a new design and experimental implementation of a real-time spectrally efficient frequency division multiplexing (SEFDM) system are reported. This research proposes a modified version of the orthogonal frequency division multiplexing (OFDM) format, obtained by truncating OFDM symbols in the time domain. In TOFDM, subcarriers are no longer orthogonally packed in the frequency domain, as time samples are only partially transmitted, leading to improved spectral efficiency. In this work, (i) analytical expressions are derived for the newly proposed TOFDM signal, followed by (ii) interference analysis, (iii) system design for uncoded and coded schemes, (iv) experimental implementation and (v) performance evaluation of the newly proposed signal and system, with comparisons to conventional OFDM systems. Results indicate that signals can be recovered with truncated symbol transmission. Based on the TOFDM principle, a new receiving technique, termed partial symbol recovery (PSR), is designed and implemented in software defined radio (SDR); it allows efficient operation for two users with overlapping data in wireless communication systems subject to collisions. The PSR technique is based on the recovery of collision-free partial OFDM symbols, followed by the reconstruction of complete symbols to progressively recover the frames of two users suffering collisions. The system is evaluated in a 12-node testbed using SDR platforms. The thesis also proposes a channel estimation and equalization technique for non-orthogonal signals in 5G scenarios, using an orthogonal demodulator and zero padding. Finally, the implementation of complete SEFDM systems in real time is investigated and described in detail.
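    To make the truncation idea concrete, the sketch below drops the tail of an OFDM symbol's time samples, demodulates with the ordinary orthogonal demodulator after zero padding, and measures the self-interference that truncation introduces. The subcarrier count and truncation ratio are illustrative assumptions; this is not the thesis's transceiver or receiver algorithm.

```python
import numpy as np

# Illustrative TOFDM sketch: keep only the first KEEP of N_SC time samples,
# then demodulate with a zero-padded FFT and measure the resulting ICI.
N_SC, KEEP = 64, 48                       # truncation ratio 0.75 (hypothetical)
rng = np.random.default_rng(2)

bits = rng.integers(0, 2, size=(N_SC, 2))
tx_syms = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

tx_time = np.fft.ifft(tx_syms)            # full OFDM symbol
trunc = np.zeros(N_SC, dtype=complex)
trunc[:KEEP] = tx_time[:KEEP]             # transmit only the first KEEP samples

rx = np.fft.fft(trunc) * (N_SC / KEEP)    # zero-padded orthogonal demodulation
interference = rx - tx_syms               # residual self-interference from truncation
sir_db = 10 * np.log10(np.mean(np.abs(tx_syms) ** 2) /
                       np.mean(np.abs(interference) ** 2))
print(f"truncation ratio {KEEP / N_SC:.2f}, average SIR {sir_db:.1f} dB")
```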

    Energy efficient and low complexity techniques for the next generation millimeter wave hybrid MIMO systems

    The fifth generation (and beyond) wireless communication systems require increased capacity, high data rates, improved coverage and reduced energy consumption. This can potentially be provided by unused available spectrum such as the millimeter wave (mmWave) frequency spectrum above 30 GHz. The high bandwidths available for mmWave communication compared to sub-6 GHz microwave frequency bands must be traded off against increased path loss, which can be compensated using large-scale antenna arrays such as Multiple-Input Multiple-Output (MIMO) systems. The analog/digital Hybrid Beamforming (HBF) architectures for mmWave MIMO systems reduce hardware complexity and power consumption by using fewer Radio Frequency (RF) chains, and support multi-stream communication with high Spectral Efficiency (SE). Such systems can also be optimized to achieve high Energy Efficiency (EE) gains with low complexity, but this has not been widely studied in the literature. This PhD project focused on designing energy-efficient and low-complexity communication techniques for next generation mmWave hybrid MIMO systems. Firstly, a novel architecture with a framework that dynamically activates the optimal number of RF chains was designed. Fractional programming was used to solve an EE maximization problem, and the Dinkelbach Method (DM) based framework was exploited to optimize the number of active RF chains and data streams. The DM is an iterative and parametric algorithm in which a sequence of easier problems converges to the global solution. The HBF matrices were designed using a codebook-based solution called gradient pursuit, introduced as a cost-effective and fast approximation algorithm. This work maximizes EE by exploiting the structure of RF chains with full-resolution sampling, unlike existing baseline approaches that use fixed RF chains and aim only for high SE. Secondly, an efficient sparse mmWave channel estimation algorithm was developed with low-resolution Analog-to-Digital Converters (ADCs) at the receiver. The sparsity of the mmWave channel was exploited, and the estimation problem was tackled using compressed sensing through a Stein's unbiased risk estimate based parametric denoiser. Expectation-maximization density estimation was used to avoid the need to specify the channel statistics. Furthermore, an energy-efficient mmWave hybrid MIMO system was developed with Digital-to-Analog Converters (DACs) at the transmitter, where the best subset of the active RF chains and the DAC resolution were selected. A novel technique based on the DM and subset selection optimization was implemented for EE maximization. This work exploits low-resolution sampling at the converting units and provides more efficient solutions in terms of EE and channel estimation than existing baselines in the literature. Thirdly, the DAC and ADC bit resolutions and the HBF matrices were jointly optimized for EE maximization. The flexibility of choosing the bit resolution for each DAC and ADC was considered, and they were optimized on a frame-by-frame basis, unlike existing approaches based on fixed-resolution sampling. A novel decomposition of the HBF matrices into three parts was introduced to represent the analog beamformer matrix, the DAC/ADC bit resolution matrix and the baseband beamformer matrix.
The alternating direction method of multipliers was used to solve this matrix factorization problem, as it has been successfully applied to other non-convex matrix factorization problems in the literature. This work considers EE maximization with low-resolution sampling at both the DACs and the ADCs simultaneously, and jointly optimizes the HBF and DAC/ADC bit resolution matrices, unlike existing baselines that use fixed bit resolutions or otherwise optimize either the DAC/ADC bit resolutions or the HBF matrices.
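    As a rough illustration of the Dinkelbach Method mentioned above, the sketch below maximizes a toy energy-efficiency ratio (spectral efficiency over consumed power) over the number of active RF chains: each iteration solves the parametric problem max over n of SE(n) - lambda * P(n) and updates lambda to the resulting ratio until the parametric objective vanishes. The rate and power models are placeholders, not those of the thesis.

```python
import numpy as np

# Dinkelbach iteration for EE(n) = SE(n) / P(n), n = number of active RF chains.
# The SE and power models below are toy placeholders.
def se(n, snr=10.0):                     # toy spectral efficiency [bits/s/Hz]
    return np.log2(1.0 + n * snr)

def power(n, p_static=1.0, p_rf=0.3):    # toy power model [W]
    return p_static + n * p_rf

def dinkelbach(n_max=16, tol=1e-6, max_iter=50):
    lam = 0.0                            # current EE estimate
    for _ in range(max_iter):
        # Inner (parametric) problem over the discrete choices of n.
        candidates = np.arange(1, n_max + 1)
        values = np.array([se(n) - lam * power(n) for n in candidates])
        n_star = candidates[np.argmax(values)]
        if values.max() < tol:           # parametric objective ~ 0 => converged
            return n_star, lam
        lam = se(n_star) / power(n_star)
    return n_star, lam

n_opt, ee_opt = dinkelbach()
print(f"optimal active RF chains: {n_opt}, energy efficiency: {ee_opt:.3f} bits/s/Hz/W")
```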

    Research and Implementation of Rateless Spinal Codes Based Massive MIMO System

    The potential performance gains promised by massive multiple-input multiple-output (MIMO) rely heavily on access to accurate channel state information (CSI), which is difficult to obtain in practice when the channel coherence time is short and the number of mobile users is large. To make a system with imperfect CSI perform well, we propose a rateless codes-aided massive MIMO scheme, with the aim of approaching the maximum achievable rate (MAR) as well as improving the achieved rate over that of fixed-rate codes. More explicitly, a recently proposed family of rateless codes, called spinal codes, is applied to massive MIMO systems, where spinal codes bring the benefit of approximately achieving the MAR with a sufficiently large encoding block size. In addition, a multilevel puncturing and dynamic block-size allocation (MPDBA) scheme is proposed, where the block sizes are determined by the users' MARs to curb the average retransmission delay for successfully decoding messages, further enhancing the system's retransmission efficiency. Multilevel puncturing, which is MAR-dependent, narrows the gap between the system MAR and the achieved rate. Theoretical analysis is provided to demonstrate that spinal codes with MPDBA can guarantee both the system's retransmission efficiency and its achieved rate, which is also verified by numerical simulations. Finally, a simplified but comparable MIMO testbed with two transmit antennas and two single-antenna users, based on the NI Universal Software Radio Peripheral (USRP) and LabVIEW communication toolkits, is built to demonstrate the effectiveness of our proposal in realistic wireless channels, and it can easily be extended to massive MIMO scenarios in the future.
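    For readers unfamiliar with spinal codes, the sketch below shows the hash-chain structure that makes them rateless: the message is consumed in small chunks, each chunk updates a spine value through a hash, and each spine value seeds a pseudo-random generator that can emit as many coded symbols (further passes, or punctured subsets of them) as the channel requires. The hash, chunk size, and symbol mapping are illustrative placeholders, not the paper's encoder.

```python
import hashlib
import numpy as np

# Minimal sketch of a spinal encoder's hash chain.
K = 4          # message bits consumed per spine value
C = 6          # coded bits emitted per spine value per pass

def spine_values(message_bits):
    """Chain a hash over K-bit chunks: s_i = h(s_{i-1}, m_i)."""
    spine, spines = b"\x00" * 8, []
    for i in range(0, len(message_bits), K):
        chunk = bytes(message_bits[i:i + K])
        spine = hashlib.sha256(spine + chunk).digest()[:8]
        spines.append(spine)
    return spines

def encode_pass(message_bits, pass_index):
    """One rateless pass: each spine value seeds a PRNG that emits C coded bits.
    Additional passes (or finer puncturing) add redundancy until decoding succeeds."""
    symbols = []
    for spine in spine_values(message_bits):
        seed = int.from_bytes(spine, "big") ^ pass_index
        rng = np.random.default_rng(seed)
        symbols.extend(rng.integers(0, 2, size=C).tolist())
    return symbols

msg = np.random.default_rng(3).integers(0, 2, size=32).tolist()
print(len(encode_pass(msg, 0)), "coded bits in pass 0")
```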