
    Resource allocation for 5G technologies under statistical queueing constraints

    As the launch of fifth-generation (5G) wireless networks approaches, recent years have witnessed comprehensive discussion of a possible 5G standard. Many transmission scenarios and technologies have been proposed, and initial over-the-air experimental trials have been conducted. Most of the existing literature on 5G technologies focuses on physical-layer parameters and quality-of-service (QoS) requirements, e.g., achievable data rates. However, demand for delay-sensitive data traffic over wireless networks has grown rapidly in recent years and is expected to grow further by the time 5G is deployed. Therefore, data-link-layer constraints concerning buffer overflow and delay violation probabilities should also be considered, and evaluating the performance of 5G technologies under such constraints is a timely task. Motivated by this fact, in this thesis we explore the performance of three promising 5G technologies operating under QoS constraints at the data-link layer. We follow a cross-layer approach to examine the interplay between the physical and data-link layers when statistical QoS constraints are imposed in the form of limits on the delay violation and buffer overflow probabilities. Noting that wireless systems generally have limited physical resources, we mainly target the design of adaptive resource allocation schemes that maximize system performance under such QoS constraints.

    We initially investigate the throughput and energy efficiency of a general class of multiple-input multiple-output (MIMO) systems with arbitrary inputs. As a cross-layer evaluation tool, we employ the effective capacity as the main performance metric: the maximum constant data arrival rate at a buffer that can be sustained by the channel service process under specified QoS constraints (a numerical sketch is given below). We obtain the optimal input covariance matrix that maximizes the effective capacity under a short-term average power budget. Then, we perform an asymptotic analysis of the effective capacity in the low signal-to-noise ratio and large-scale antenna (massive MIMO) regimes. Such analysis has practical importance for 5G scenarios that require low latency, low power consumption, and/or the ability to simultaneously support a massive number of users.

    Non-orthogonal multiple access (NOMA) has attracted significant attention in recent years as a promising multiple access technology for 5G. We consider a two-user power-domain NOMA scheme in which both transmitters employ superposition coding and the receiver applies successive interference cancellation (SIC) with a certain decoding order. For practical reasons, we consider limited transmission power budgets at the transmitters and assume that both transmitters have arbitrarily distributed input signals. We again use the effective capacity as the main cross-layer performance measure, and we provide a resource management scheme that jointly obtains the optimal power allocation policies at the transmitters and the optimal decoding order at the receiver, with the goal of maximizing the effective capacity region, i.e., the region of maximum sustainable arrival rates at the transmitters' buffers under QoS guarantees.
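
    The effective capacity used throughout this work has a standard log-moment-generating-function form, EC(theta) = -(1/theta) * ln E[exp(-theta * S)], where S is the service (in bits) offered by the channel in one block and theta > 0 is the QoS exponent (larger theta means a stricter delay/buffer constraint). As a minimal illustrative sketch, not the thesis's own setup, the following Python snippet estimates the effective capacity of a Rayleigh block-fading channel by Monte Carlo; the SNR value and block model are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def effective_capacity(theta, snr_db=10.0, n_blocks=200_000):
            """Monte Carlo estimate of EC(theta) in bits/block for a
            Rayleigh block-fading channel with per-block service
            S = log2(1 + SNR*|h|^2); theta is the QoS exponent."""
            snr = 10.0 ** (snr_db / 10.0)
            h2 = rng.exponential(scale=1.0, size=n_blocks)  # |h|^2 ~ Exp(1)
            service = np.log2(1.0 + snr * h2)               # bits per block
            # EC(theta) = -(1/theta) * ln E[exp(-theta * S)]
            return -np.log(np.mean(np.exp(-theta * service))) / theta

        for theta in (0.01, 0.1, 1.0, 10.0):
            print(f"theta={theta:5.2f}  EC ~ {effective_capacity(theta):.3f} bits/block")

    As theta tends to zero the estimate approaches the ergodic capacity E[S], while for large theta it falls toward the worst-case service rate; this is the qualitative trade-off the resource allocation schemes in the thesis operate on.
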
    In recent years, visible light communication (VLC) has emerged as a potential transmission technology that can use the visible light spectrum for data transmission alongside illumination. Unlike existing studies of VLC, in this thesis we consider a VLC system in which the access point (AP) is unaware of the channel conditions and therefore sends data at a fixed rate. Under this assumption, and considering an ON-OFF data source, we provide a cross-layer study of the system subject to statistical buffering constraints. To this end, we employ the maximum average data arrival rate at the AP buffer and non-asymptotic bounds on buffering delay as the main performance measures. To facilitate our analysis, we adopt a two-state Markov process to model the fixed-rate transmission strategy and formulate the steady-state probabilities of the channel being in the ON and OFF states (a small sketch follows below).

    Finally, the coexistence of radio frequency (RF) and VLC systems in typical indoor environments can be leveraged to support a wide range of user QoS needs. We examine the benefits of employing both technologies under statistical buffering limitations. In particular, we consider a multi-mechanism scenario that uses RF and VLC links for data transmission in an indoor environment. As the transmission technology is the main physical resource of concern in this part, we propose a link selection process through which the transmitter sends data over the link that best sustains the desired QoS guarantees. Considering an ON-OFF data source, we employ the maximum average data arrival rate at the transmitter buffer and non-asymptotic bounds on data buffering delay as the main performance measures, and we formulate these measures under the assumption that both links are subject to average and peak power constraints.
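
    The steady-state probabilities of such a two-state chain follow directly from its transition probabilities. A small sketch with illustrative numbers (the thesis derives the actual transition probabilities from the channel statistics and the fixed transmission rate):

        # Two-state Markov chain: ON (fixed rate supported) / OFF (outage).
        # p = P(ON -> OFF), q = P(OFF -> ON); the values are illustrative.
        p, q = 0.1, 0.3

        # The stationary distribution solves pi = pi * P for
        # P = [[1-p, p], [q, 1-q]]:
        pi_on = q / (p + q)
        pi_off = p / (p + q)
        print(f"P(ON) = {pi_on:.3f}, P(OFF) = {pi_off:.3f}")  # 0.750, 0.250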

    Studies and simulations of the DigiCipher system

    During this period, development of simulators for the various high-definition television (HDTV) systems proposed to the FCC continued. The FCC has indicated that it wants the various proposers to collaborate on a single system. Based on all available information, this system will look very much like the advanced digital television (ADTV) system, with major contributions only from the DigiCipher system. The results of our simulations of the DigiCipher system are described. The simulator was tested using test sequences from the MPEG committee, and the results are extrapolated to HDTV video sequences. Once again, some caveats are in order: the sequences used for testing the simulator and generating the results are those used for testing the MPEG algorithm, and they are of much lower resolution than HDTV sequences would be, so the extrapolations are not entirely accurate. One would expect significantly higher compression, in terms of bits per pixel, with higher-resolution sequences. The simulator itself is nevertheless valid, and should HDTV sequences become available, they could be used directly with it. A brief overview of the DigiCipher system is given, and some coding results obtained with the simulator are examined and compared to those obtained with the ADTV system. We evaluate these results in the context of the CCSDS specifications and make some suggestions as to how the DigiCipher system could be implemented in the NASA network. Simulations such as those reported here can be biased by the particular source sequences used; to obtain more complete information about the system, one needs a reasonable set of models that mirror the various kinds of sources encountered in video coding. A set of such models is provided. As this is somewhat tangential to the other work reported, the results are included as an appendix.
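
    The appendix's models themselves are not reproduced in this abstract, but a first-order Gauss-Markov (AR(1)) process is a standard way to model the strong sample-to-sample correlation of video sources in the coding literature; the following sketch is an illustrative example of such a source model, with an assumed correlation coefficient of 0.95.

        import numpy as np

        def ar1_source(n, rho=0.95, sigma=1.0, seed=0):
            """Generate n samples of an AR(1) (first-order Gauss-Markov)
            source, x[k] = rho*x[k-1] + w[k], a common model for
            correlated image/video data."""
            rng = np.random.default_rng(seed)
            # Scale the innovation so the process has variance sigma^2.
            w = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), size=n)
            x = np.empty(n)
            x[0] = w[0]
            for k in range(1, n):
                x[k] = rho * x[k - 1] + w[k]
            return x

        x = ar1_source(10_000)
        print("lag-1 correlation:", np.corrcoef(x[:-1], x[1:])[0, 1])  # ~0.95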

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex, compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost, and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of network structures and wireless services. Machine learning (ML) algorithms have had great success in supporting big data analytics, efficient parameter estimation, and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning, and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in understanding the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios in future wireless networks.
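
    As a toy illustration of the reinforcement-learning branch surveyed here (not an algorithm from the article itself), the following sketch uses stateless Q-learning, essentially a multi-armed bandit, to let a cognitive radio learn which of several channels is most often idle; all probabilities and parameters are assumed for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        p_idle = np.array([0.2, 0.5, 0.8, 0.4])  # unknown to the learner
        Q = np.zeros(4)                          # value estimate per channel
        alpha, eps = 0.05, 0.1                   # step size, exploration rate

        for _ in range(20_000):
            # Epsilon-greedy action selection over the 4 channels.
            a = rng.integers(4) if rng.random() < eps else int(np.argmax(Q))
            reward = 1.0 if rng.random() < p_idle[a] else 0.0
            Q[a] += alpha * (reward - Q[a])      # stateless Q-update

        print("learned values:", np.round(Q, 2))  # ~p_idle; best: channel 2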

    Cross-layer latency-aware and -predictable data communication

    Cyber-physical systems are making their way into more aspects of everyday life. These systems are increasingly distributed and hence require networked communication to cooperatively fulfil control tasks. Providing this in a robust and resilient manner demands latency-awareness and latency-predictability at all layers of the communication and computation stack. This thesis addresses how these two latency-related properties can be implemented at the transport layer to serve control applications in ways that traditional approaches such as TCP or RTP cannot. To this end, the Predictably Reliable Real-time Transport (PRRT) protocol is presented, including its unique features (e.g., partially reliable, ordered, in-time delivery, and latency-avoiding congestion control) and unconventional APIs. The protocol has been evaluated intensively using the X-Lap toolkit, which was developed specifically to help protocol designers improve the latency, timing, and energy characteristics of protocols in a cross-layer, intra-host fashion. PRRT effectively circumvents latency-inducing bufferbloat using X-Pace, an implementation of the cross-layer pacing approach presented in this thesis; this is shown through experimental evaluations on real Internet paths. Apart from PRRT, this thesis presents means to make TCP-based transport aware of individual link latencies and to increase the predictability of end-to-end delays using Transparent Transmission Segmentation.
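
    The pacing idea behind X-Pace can be illustrated independently of PRRT: instead of releasing packets in bursts, the sender spaces transmissions according to a target rate, which keeps bottleneck queues, and hence bufferbloat, small. A minimal sender-side sketch follows; the send() callback and the pacing rate are placeholders, and this is not the actual X-Pace implementation.

        import time

        def paced_send(packets, pacing_rate_bps, send):
            """Release packets at a target rate instead of as a burst.
            pacing_rate_bps is assumed to come from a (cross-layer)
            rate estimate; send() stands in for the real socket call."""
            for pkt in packets:
                send(pkt)
                # Inter-packet gap that sustains the target rate:
                time.sleep(len(pkt) * 8 / pacing_rate_bps)

        # Example: pace 100 1200-byte packets at 10 Mbit/s (~0.96 ms gaps).
        paced_send([bytes(1200)] * 100, 10e6, send=lambda p: None)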