148 research outputs found

    Datacenter Design for Future Cloud Radio Access Network.

    Full text link
Cloud radio access network (C-RAN), an emerging cloud service that combines the traditional radio access network (RAN) with cloud computing technology, has been proposed as a solution to the growing energy consumption and cost of the traditional RAN. By aggregating baseband units (BBUs) in a centralized cloud datacenter, C-RAN reduces energy and cost and improves wireless throughput and quality of service. However, how a datacenter for C-RAN should be designed has not yet been studied. In this dissertation, I investigate how a datacenter for C-RAN BBUs should be built on commodity servers. I first design WiBench, an open-source benchmark suite containing the key signal processing kernels of many mainstream wireless protocols, and study its characteristics. The characterization study shows that there is abundant data-level parallelism (DLP) and thread-level parallelism (TLP). Based on this result, I then develop high-performance software implementations of C-RAN BBU kernels in C++ and CUDA for both CPUs and GPUs. In addition, I generalize the GPU parallelization techniques of the Turbo decoder to the trellis algorithms, an important family of algorithms widely used in data compression and channel coding. I then evaluate the performance of commodity CPU servers and GPU servers. The study shows that a datacenter with GPU servers can meet the LTE standard throughput with 4× to 16× fewer machines than with CPU servers. A further energy and cost analysis shows that GPU servers save on average 13× more energy and 6× more cost. Thus, I propose that the C-RAN datacenter be built on GPUs as a server platform. Next, I study resource management techniques to handle the temporal and spatial traffic imbalance in a C-RAN datacenter. I propose a “hill-climbing” power management scheme that combines powering off GPUs with DVFS to match the temporal C-RAN traffic pattern.
Under a practical traffic model, this technique saves 40% of the BBU energy in a GPU-based C-RAN datacenter. For spatial traffic imbalance, I propose three workload distribution techniques to improve load balance and throughput. Among the three, pipelining packets yields the largest throughput improvement: 10% and 16% for balanced and unbalanced loads, respectively.
PhD, Computer Science and Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/120825/1/qizheng_1.pd
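The hill-climbing power management described above can be pictured as a greedy search over the pair (active GPU count, DVFS level). The sketch below is a minimal illustration of that idea; the power and throughput models and all constants are our own assumptions, not figures or code from the dissertation.

```python
FREQ_LEVELS = [0.5, 0.75, 1.0]   # normalized DVFS frequency steps (assumed)
GPU_COUNT = 16                   # GPUs available to the BBU pool (assumed)

def power(n_gpus, freq):
    """Toy power model: per-GPU idle cost plus a cubic dynamic term."""
    return n_gpus * (50 + 200 * freq ** 3)

def throughput(n_gpus, freq):
    """Toy throughput model: linear in active GPUs and clock frequency."""
    return n_gpus * freq

def hill_climb(demand):
    """Greedily power off GPUs or step down DVFS while demand is still met."""
    n, f = GPU_COUNT, len(FREQ_LEVELS) - 1
    improved = True
    while improved:
        improved = False
        for dn, df in ((-1, 0), (0, -1)):   # candidate moves: -1 GPU or -1 step
            n2, f2 = n + dn, f + df
            if n2 < 1 or f2 < 0:
                continue
            if (throughput(n2, FREQ_LEVELS[f2]) >= demand
                    and power(n2, FREQ_LEVELS[f2]) < power(n, FREQ_LEVELS[f])):
                n, f, improved = n2, f2, True
                break
    return n, FREQ_LEVELS[f]
```

Being a greedy local search, the result tracks the traffic demand but is not guaranteed to be the global power optimum.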

    Performance Comparison of Dual Connectivity and Hard Handover for LTE-5G Tight Integration in mmWave Cellular Networks

    Get PDF
MmWave communications are expected to play a major role in the fifth generation of mobile networks. They offer a potential multi-gigabit throughput and an ultra-low radio latency, but at the same time suffer from high isotropic pathloss and a coverage area much smaller than that of LTE macrocells. To address these issues, highly directional beamforming and a very high-density deployment of mmWave base stations have been proposed. This Thesis aims to improve the reliability and performance of the 5G network by studying its tight and seamless integration with the current LTE cellular network. In particular, the LTE base stations can provide a coverage layer for 5G mobile terminals, because they operate on microwave frequencies, which are less sensitive to blockage and have a lower pathloss. This document is a copy of the Master's Thesis carried out by Mr. Michele Polese under the supervision of Dr. Marco Mezzavilla and Prof. Michele Zorzi. It will propose an LTE-5G tight integration architecture, based on mobile terminals' dual connectivity to LTE and 5G radio access networks, and will evaluate the new network procedures needed to support it. Moreover, this new architecture will be implemented in the ns-3 simulator, and a thorough simulation campaign will be conducted to evaluate its performance with respect to the baseline of handover between LTE and 5G.

    Real-Time Localization Using Software Defined Radio

    Get PDF
Service providers make use of cost-effective wireless solutions to identify, localize, and possibly track users through their carried mobile devices (MDs) to support added services, such as geo-advertisement, security, and management. Indoor and outdoor hotspot areas play a significant role for such services. However, GPS does not work in many of these areas. To solve this problem, service providers leverage available indoor radio technologies, such as WiFi, GSM, and LTE, to identify and localize users. We focus our research on passive services provided by third parties, which are responsible for (i) data acquisition and (ii) processing, and on network-based services, where (i) and (ii) are done inside the serving network. To better understand the parameters that affect indoor localization, we investigate several factors that affect indoor signal propagation for both Bluetooth and WiFi technologies. For GSM-based passive services, we first developed a data acquisition module: a GSM receiver that can overhear GSM uplink messages transmitted by MDs while remaining invisible. A set of optimizations was made to the receiver components to support wideband capturing of the GSM spectrum while operating in real time. Processing the wide GSM spectrum is made possible by a proposed distributed processing approach over an IP network. Then, to overcome the lack of information about tracked devices' radio settings, we developed two novel localization algorithms that rely on proximity-based solutions to estimate devices' locations in real environments. Given the challenging effects of the indoor environment on radio signals, such as NLOS reception and multipath propagation, we developed an original algorithm to detect and remove contaminated radio signals before they are fed to the localization algorithm. To improve the localization algorithm, we extended our work with a hybrid approach that uses both WiFi and GSM interfaces to localize users.
For network-based services, we used a software implementation of an LTE base station to develop our algorithms, which characterize the indoor environment before applying the localization algorithm. Experiments were conducted without any special hardware, prior knowledge of the indoor layout, or offline calibration of the system.
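To give a flavor of proximity-based localization, the sketch below estimates a device's position as an RSSI-weighted centroid of the fixed receivers that overheard it. This is a generic textbook technique under a simple log-distance path-loss assumption, not the dissertation's actual algorithms; all names and parameters are illustrative.

```python
def rssi_to_weight(rssi_dbm, path_loss_exponent=2.0):
    """Map RSSI (dBm) to a proximity weight via a log-distance model:
    a stronger signal implies a smaller estimated distance, hence a
    larger weight. The unit-power-at-1m assumption is illustrative."""
    est_distance = 10 ** (-rssi_dbm / (10 * path_loss_exponent))
    return 1.0 / est_distance

def weighted_centroid(observations):
    """observations: list of ((x, y), rssi_dbm) pairs, one per receiver
    at a known position. Returns the weighted-centroid (x, y) estimate."""
    wx = wy = wsum = 0.0
    for (x, y), rssi in observations:
        w = rssi_to_weight(rssi)
        wx, wy, wsum = wx + w * x, wy + w * y, wsum + w
    return wx / wsum, wy / wsum
```

For example, a device heard strongly by a receiver at the origin and weakly by two distant receivers is placed near the origin; NLOS-contaminated readings would first have to be filtered out, as the abstract describes.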

    Radio resource allocation for overlay D2D-based vehicular communications in future wireless networks

    Get PDF
Next-generation cellular networks are envisioned to enable widespread Device-to-Device (D2D) communication, i.e., direct communication between cellular devices. For many applications in the D2D domain, deterministic communication latency and high reliability are of central importance. The proximity service provided by D2D communication is a promising feature that can fulfil the reliability and latency requirements of emerging vertical applications. One of the prominent vertical applications is vehicular communication, in which vehicles disseminate safety-critical messages directly through D2D communication, helping to reduce traffic accidents and, with them, road fatalities.
Radio resource allocation techniques in D2D communication have recently gained much attention in industry and academia, as a means of allocating valuable radio resources more efficiently. In addition to resource allocation techniques, energy sustainability is highly important and is usually considered in conjunction with the resource allocation approach. This dissertation is dedicated to studying different avenues of radio resource allocation and energy efficiency techniques in Long Term Evolution (LTE) and New Radio (NR) Vehicle-to-Everything (V2X) communications. In the following, we briefly describe the core ideas of this study. D2D applications are mostly characterized by relatively small traffic payloads, and in LTE, due to the coarse granularity of the subframe, the radio resources cannot be utilized efficiently. Particularly in the case of semi-persistent scheduling, where a radio resource is scheduled for a longer time in overlay D2D, the radio resources are underutilized for such applications. To address this problem, a hierarchical radio resource management scheme, a so-called sub-granting scheme, is proposed, by which nearby cellular users, so-called beneficiary users, are allowed to reuse unused radio resources indicated by sub-granting signaling. The proposed scheme is evaluated and compared with shortening Transmission Time Interval (TTI) schemes in terms of cell throughput. Then, the beneficiary user selection problem is investigated and cast as a maximization problem of uplink cell throughput subject to reliability and latency requirements. A heuristic centralized algorithm, Dedicated Sub-Granting Radio Resource (DSGRR), is proposed to address the original beneficiary user selection problem.
The simulation results and analysis show the superiority of the proposed DSGRR algorithm over random beneficiary user selection in terms of cell throughput in a scenario with stationary users. Further, the beneficiary user selection problem is investigated in a dynamic scenario where all users are moving. We evaluate the sub-granting signaling overhead due to mobility in the DSGRR, and then a distributed heuristic algorithm, Open Sub-Granting Radio Resource (OSGRR), is proposed and compared with the DSGRR algorithm and with the no-sub-granting case. Simulation results show improved cell throughput for the OSGRR compared with the other algorithms. Besides, the overhead incurred by the OSGRR is less than that of the DSGRR, while the achieved cell throughput remains close to the maximum achievable uplink cell throughput. Also, joint resource allocation and energy efficiency in autonomous resource selection in NR, i.e., Mode 2, is examined. The autonomous resource selection is formulated as a ratio of sum-rate to energy consumption. The objective is to minimize the power consumption of battery-powered users subject to reliability and latency requirements. A heuristic algorithm, Density of Traffic-based Resource Allocation (DeTRA), is proposed to solve the problem. The proposed algorithm splits the resource pool based on the traffic density per traffic type. Random selection is then mandated to be performed on the dedicated resource pool upon the arrival of aperiodic traffic. The simulation results show that the proposed algorithm achieves the same packet reception ratio (PRR) as the sensing-based algorithm. In addition, per-user power consumption is reduced, and consequently energy efficiency is improved, by applying the DeTRA algorithm. The research in this study leverages radio resource allocation techniques in LTE-based D2D communications so that radio resources are utilized more efficiently.
In addition, the conducted research paves the way for further study of how power-saving users can optimally select radio resources with minimum energy consumption in NR V2X communications.
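The pool-splitting idea behind DeTRA as summarized above can be sketched in a few lines: divide the resource pool among traffic types in proportion to their traffic density, then confine the random autonomous selection of each packet to its type's dedicated sub-pool. Pool sizes, density values, and the two traffic-type names are assumed for illustration; this is not the dissertation's implementation.

```python
import random

def split_pool(n_resources, densities):
    """Split n_resources among traffic types proportionally to their
    densities, e.g. {"periodic": 0.7, "aperiodic": 0.3}. Returns a
    dict mapping each type to its list of resource indices."""
    total = sum(densities.values())
    pools, start = {}, 0
    for ttype, density in densities.items():
        size = round(n_resources * density / total)
        pools[ttype] = list(range(start, start + size))
        start += size
    return pools

def select_resource(pools, ttype, rng=random):
    """Random autonomous selection, restricted to the type's sub-pool,
    so aperiodic arrivals cannot collide with periodic reservations."""
    return rng.choice(pools[ttype])
```

Restricting aperiodic arrivals to their own sub-pool is what lets the scheme match a sensing-based algorithm's reception ratio without the sensing cost, at least in this simplified picture.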

    Towards Scalable Design of Future Wireless Networks

    Full text link
    Wireless operators face an ever-growing challenge to meet the throughput and processing requirements of billions of devices that are getting connected. In current wireless networks, such as LTE and WiFi, these requirements are addressed by provisioning more resources: spectrum, transmitters, and baseband processors. However, this simple add-on approach to scale system performance is expensive and often results in resource underutilization. What are, then, the ways to efficiently scale the throughput and operational efficiency of these wireless networks? To answer this question, this thesis explores several potential designs: utilizing unlicensed spectrum to augment the bandwidth of a licensed network; coordinating transmitters to increase system throughput; and finally, centralizing wireless processing to reduce computing costs. First, we propose a solution that allows LTE, a licensed wireless standard, to co-exist with WiFi in the unlicensed spectrum. The proposed solution bridges the incompatibility between the fixed access of LTE, and the random access of WiFi, through channel reservation. It achieves a fair LTE-WiFi co-existence despite the transmission gaps and unequal frame durations. Second, we consider a system where different MIMO transmitters coordinate to transmit data of multiple users. We present an adaptive design of the channel feedback protocol that mitigates interference resulting from the imperfect channel information. Finally, we consider a Cloud-RAN architecture where a datacenter or a cloud resource processes wireless frames. We introduce a tree-based design for real-time transport of baseband samples and provide its end-to-end schedulability and capacity analysis. We also present a processing framework that combines real-time scheduling with fine-grained parallelism. The framework reduces processing times by migrating parallelizable tasks to idle compute resources, and thus, decreases the processing deadline-misses at no additional cost. 
We implement and evaluate the above solutions using software-radio platforms and off-the-shelf radios, and confirm their applicability in real-world settings.
PhD, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/133358/1/gkchai_1.pd
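The processing framework's core idea, cutting deadline misses by splitting parallelizable tasks across idle compute resources, can be illustrated with a toy earliest-deadline-first scheduler. The task model, numbers, and even splitting below are invented simplifications (migration and splitting overheads are ignored), not the thesis's framework.

```python
def run(tasks, n_workers, allow_split=False):
    """tasks: list of (duration, deadline) pairs. Schedules in EDF order
    and returns the number of deadline misses."""
    free = [0.0] * n_workers              # time at which each worker frees up
    missed = 0
    for duration, deadline in sorted(tasks, key=lambda t: t[1]):  # EDF order
        free.sort()                       # earliest-available workers first
        best_k, best_finish = 1, free[0] + duration
        if allow_split:
            # Splitting across k workers can only start once the k-th
            # worker is free, but divides the work evenly among them.
            for k in range(2, n_workers + 1):
                finish = free[k - 1] + duration / k
                if finish < best_finish:
                    best_k, best_finish = k, finish
        for i in range(best_k):
            free[i] = best_finish
        missed += best_finish > deadline
    return missed
```

With two workers and two tasks of duration 4 and deadline 3, serial execution misses both deadlines while splitting saves one: exactly the effect the framework exploits, here in caricature.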

    Uplink data measurement and analysis for 5G eCPRI radio unit

    Get PDF
Abstract. The new 5G mobile network generation aims to enhance the performance of the cellular network in almost every possible aspect, offering higher data rates, lower latencies, and a massive number of network connections. Arguably the most important change from LTE is the new set of RU-BBU split options for 5G promoted by 3GPP and other organizations. Another big conceptual shift introduced with 5G is the open RAN concept, pushed forward by organizations such as the O-RAN Alliance. O-RAN aims to standardize the interfaces between different RAN elements in a way that promotes vendor interoperability and lowers the entry barrier for new equipment suppliers. Moreover, the 7-2x split option standardized by O-RAN has emerged as the most important of the low-layer split options. As the fronthaul interface, O-RAN has selected the packet-based eCPRI protocol, which has been designed to be more flexible and dynamic in terms of transport network and data rates than its predecessor, CPRI. Because the interface is new, tools to analyse data from it are lacking. In this thesis, a new Python-based data analysis tool was created to validate the quality of UL eCPRI data from any 5G radio unit based on the O-RAN 7-2x functional split. Its main goal is to provide concrete KPIs from captured data, including timing offset, signal power level, and error vector magnitude. The tool produces visual and text-based outputs that can be used in both manual and automated testing. The tool has enhanced eCPRI UL datapath testing in radio unit integration teams by providing actual quality metrics and enabling test automation.
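Two of the KPIs named above, signal power level and error vector magnitude, can be computed from captured IQ samples along the following lines. This is a generic sketch against an ideal QPSK constellation, with EVM normalized by measured signal power as a simplification; the actual tool's internals and reference handling are not described in the abstract.

```python
import cmath
import math

# Ideal unit-energy QPSK constellation points (illustrative reference).
QPSK = [cmath.rect(1.0, math.pi / 4 + k * math.pi / 2) for k in range(4)]

def power_db(samples):
    """Mean power of complex IQ samples, in dB relative to unit amplitude."""
    mean_sq = sum(abs(s) ** 2 for s in samples) / len(samples)
    return 10 * math.log10(mean_sq)

def evm_percent(samples, reference=QPSK):
    """RMS EVM in percent: error vector to the nearest ideal symbol,
    normalized here by the measured signal power (a simplification)."""
    err = sum(min(abs(s - r) ** 2 for r in reference) for s in samples)
    sig = sum(abs(s) ** 2 for s in samples)
    return 100 * math.sqrt(err / sig)
```

A clean capture of ideal symbols yields 0 dB mean power and 0% EVM; distortion in the UL datapath shows up directly as a rising EVM figure.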
