
    Bayesian Learning Strategies in Wireless Networks

    Get PDF
    This thesis collects the research works I performed as a Ph.D. candidate, where the common thread running through all the works is Bayesian reasoning with applications in wireless networks. The pivotal element of Bayesian reasoning is inference: reasoning about what we do not know, given what we know. When we make inferences about the nature of the world, we learn new features of the environment within which the agent gains experience; this is what allows us to benefit from the gathered information and adapt to new conditions. As we leverage the gathered information, our belief about the environment should change to reflect our improved knowledge. This thesis focuses on the probabilistic aspects of information processing with applications to the following topics: machine learning based network analysis using millimeter-wave narrow-band energy traces; Bayesian forecasting and anomaly detection in vehicular monitoring networks; online power management strategies for energy harvesting mobile networks; and beam training and data transmission optimization in millimeter-wave vehicular networks. In these research works, we deal with pattern recognition in real-world data via supervised/unsupervised learning methods (classification, forecasting and anomaly detection, and multi-step-ahead prediction via kernel methods). Finally, the mathematical framework of Markov Decision Processes (MDPs), which also serves as the basis for reinforcement learning, is introduced; Partially Observable MDPs (POMDPs) use the notion of belief to make decisions about the state of the world in millimeter-wave vehicular networks. The goal of this thesis is to investigate the considerable potential of inference from insightful perspectives, detailing the mathematical framework and how Bayesian reasoning conveniently adapts to various research domains in wireless networks.
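
    A minimal sketch of the belief notion used in the POMDP framework above: a Bayesian belief update for a generic two-state model. The transition and observation probabilities below are invented placeholders, not values from the thesis.

```python
import numpy as np

# Illustrative Bayes filter for a two-state POMDP under a fixed action.
# The matrices are placeholder values, not taken from the thesis.
T = np.array([[0.9, 0.1],   # P(s' | s): rows = current state, cols = next state
              [0.2, 0.8]])
O = np.array([[0.7, 0.3],   # P(z | s'): rows = next state, cols = observation
              [0.2, 0.8]])

def belief_update(belief, obs):
    """Predict with the transition model, then correct with the observation likelihood."""
    predicted = belief @ T                      # prior over the next state
    unnormalized = predicted * O[:, obs]        # weight by how likely the observation is
    return unnormalized / unnormalized.sum()    # renormalize into a proper distribution

b = np.array([0.5, 0.5])        # start from an uninformative belief
for z in [0, 0, 1]:             # a hypothetical observation sequence
    b = belief_update(b, z)
print(b)                        # belief about the hidden state after the observations
```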

    Contextual Beamforming: Exploiting Location and AI for Enhanced Wireless Telecommunication Performance

    Full text link
    The pervasive nature of wireless telecommunication has made it the foundation for mainstream technologies like automation, smart vehicles, virtual reality, and unmanned aerial vehicles. As these technologies experience widespread adoption in our daily lives, ensuring the reliable performance of cellular networks in mobile scenarios has become a paramount challenge. Beamforming, an integral component of modern mobile networks, enables spatial selectivity and improves network quality. However, many beamforming techniques are iterative, introducing unwanted latency into the system. In recent times, there has been growing interest in leveraging mobile users' location information to expedite beamforming processes. This paper explores the concept of contextual beamforming, discussing its advantages, disadvantages, and implications. Notably, the study presents an impressive 53% improvement in signal-to-noise ratio (SNR) when implementing the maximum ratio transmission (MRT) adaptive beamforming algorithm compared to scenarios without beamforming, and it further elucidates how MRT contributes to contextual beamforming. The importance of localization in implementing contextual beamforming is also examined. Additionally, the paper delves into the use of artificial intelligence schemes, including machine learning and deep learning, to implement contextual beamforming techniques that leverage user location information. Based on the comprehensive review, the results suggest that the combination of MRT and zero-forcing (ZF) techniques, alongside deep neural networks (DNNs) employing Bayesian Optimization (BO), represents the most promising approach for contextual beamforming. Furthermore, the study discusses the future potential of programmable switches, such as Tofino, in enabling location-aware beamforming.
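
    As a rough illustration of how MRT raises SNR relative to no beamforming, the sketch below forms MRT weights for a randomly drawn channel; the antenna count, channel model, and power normalization are assumptions made for the example, not the paper's measurement setup.

```python
import numpy as np

# Maximum ratio transmission (MRT) toward a single user, compared with a
# single-antenna baseline. Channel and parameters are illustrative only.
rng = np.random.default_rng(0)
N = 64                                                                    # assumed number of transmit antennas
h = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)  # Rayleigh channel

w_mrt = np.conj(h) / np.linalg.norm(h)        # MRT: align with the channel, keep unit transmit power

P, noise = 1.0, 1.0                           # normalized transmit and noise power
snr_mrt = P * np.abs(h @ w_mrt) ** 2 / noise  # equals P * ||h||^2 / noise
snr_none = P * np.abs(h[0]) ** 2 / noise      # single antenna, no beamforming

print(f"MRT gain over no beamforming: {10 * np.log10(snr_mrt / snr_none):.1f} dB")
```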

    Max-Min Fair Resource Allocation in Millimetre-Wave Backhauls

    Get PDF
    5G mobile networks are expected to provide pervasive high-speed wireless connectivity to support increasingly resource-intensive user applications. Network hyper-densification therefore becomes necessary, though connecting tens of thousands of base stations to the Internet is non-trivial, especially in urban scenarios where optical fibre is difficult and costly to deploy. The millimetre-wave (mm-wave) spectrum is a promising candidate for inexpensive multi-Gbps wireless backhauling, but exploiting this band for effective multi-hop data communications is challenging. In particular, resource allocation and scheduling of very narrow transmission/reception beams requires overcoming terminal deafness and link blockage problems, while managing fairness issues that arise when flows encounter dissimilar competition and traverse different numbers of links with heterogeneous quality. In this paper, we propose WiHaul, an airtime allocation and scheduling mechanism that overcomes these challenges specific to multi-hop mm-wave networks, guarantees max-min fairness among traffic flows, and ensures the overall available backhaul resources are fully utilised. We evaluate the proposed WiHaul scheme over a broad range of practical network conditions, and demonstrate individual throughput gains of up to 5 times and a fivefold improvement in measurable fairness over recent mm-wave scheduling solutions.
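
    For intuition on the max-min fairness objective that WiHaul targets, the sketch below runs the textbook progressive-filling procedure on a toy topology; the links, capacities, and flow routes are invented, and the actual WiHaul scheduler additionally handles beam scheduling, deafness, and blockage.

```python
# Progressive filling for max-min fair rate allocation across shared links.
# Topology and capacities are invented for illustration.
links = {"A": 1.0, "B": 2.0}                           # link -> normalized airtime capacity
flows = {"f1": ["A"], "f2": ["A", "B"], "f3": ["B"]}   # flow -> links it traverses

rates = {f: 0.0 for f in flows}
frozen = set()

while len(frozen) < len(flows):
    # Largest equal increment every still-active flow can receive on each link.
    increments = []
    for link, cap in links.items():
        active = [f for f in flows if link in flows[f] and f not in frozen]
        if active:
            used = sum(rates[f] for f in flows if link in flows[f])
            increments.append(((cap - used) / len(active), link))
    delta, bottleneck = min(increments)
    for f in flows:
        if f not in frozen:
            rates[f] += delta
    # Flows crossing the saturated bottleneck link cannot grow any further.
    frozen |= {f for f in flows if bottleneck in flows[f]}

print(rates)   # {'f1': 0.5, 'f2': 0.5, 'f3': 1.5} for the capacities above
```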

    Technologies for injection molded antennas for mass production

    Get PDF
    Thesis presented as a compendium of publications. The deployment of 5G antenna infrastructure and the mandatory adoption of anti-collision radars in cars will require a large number of antennas operating at millimeter and sub-millimeter wavelengths. These antennas are usually arrays, and the possibility of manufacturing the antenna array, including the feeding network and the radiating elements, as a single plastic piece reduces the need for large printed circuit boards (PCBs) on expensive dielectric substrates, which makes it an interesting manufacturing technology. In this regard, waveguide-based antennas can be assembled using plastic technology with a proper metallization procedure. They scale better in terms of efficiency than microstrip line (ML) antennas: as the number of elements in the array increases, the gain is not degraded by losses in the substrate. In this thesis, the industrial challenges of this technology are addressed. A detailed tolerance study including the plastic manufacturing errors, typically ±0.1 mm, is carried out in order to check the feasibility of plastic antennas for mass production. The antennas will need to be integrated with the radar chipsets, so a transition between the chip and the waveguide antennas is presented. These transitions can act as a direct chip-to-waveguide launcher, potentially reducing the need for large substrates and hence the cost of the antenna. The need to apply a metal coating to achieve the desired performance is also explored; conventional techniques such as copper electrodeposition are used, the main drawback being that copper deposits poorly on right-angle surfaces. Eventually, these antennas will have to be integrated into the aesthetics of a car, usually behind a plastic radome (with its own manufacturing errors) that must be designed and optimized properly to introduce minimal distortion to the radar. Optimization with commercial electromagnetic software such as CST is not feasible due to the large computation time required, so an ad-hoc ray-tracing based simulator has been developed to assess radome-induced errors in radar performance. All these industrial constraints are taken into account from the design stage, where time, price, fabrication tolerances, and radiation requirements must be traded off simultaneously, dramatically increasing the design complexity.
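
    A Monte Carlo tolerance check in the spirit of the study described above might look like the sketch below: a waveguide broad-wall dimension is perturbed by the quoted ±0.1 mm and the spread of the TE10 cutoff frequency is observed. The nominal dimension is assumed for illustration, and the thesis relies on full-wave simulation rather than this closed-form check.

```python
import numpy as np

# Monte Carlo spread of the TE10 cutoff frequency f_c = c / (2a) when the
# broad-wall dimension a carries a +/- 0.1 mm manufacturing error.
# The nominal dimension below is illustrative, not taken from the thesis.
c = 299_792_458.0                      # speed of light [m/s]
a_nominal = 3.1e-3                     # broad-wall dimension [m] (assumed, roughly E-band)
tolerance = 0.1e-3                     # typical plastic manufacturing error: +/- 0.1 mm

rng = np.random.default_rng(1)
a_samples = a_nominal + rng.uniform(-tolerance, tolerance, size=10_000)
f_cutoff = c / (2 * a_samples) / 1e9   # TE10 cutoff frequency in GHz per sample

print(f"nominal f_c = {c / (2 * a_nominal) / 1e9:.2f} GHz")
print(f"spread under +/- 0.1 mm: {f_cutoff.min():.2f} to {f_cutoff.max():.2f} GHz")
```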

    Analysis and performance improvement of consumer-grade millimeter wave wireless networks

    Get PDF
    Millimeter-wave (mmWave) networks are one of the key components of next-generation cellular networks and WLANs (Wireless Local Area Networks). mmWave networks are capable of providing multi-gigabit-per-second rates over highly directional, low-interference, high-spatial-reuse links. In 2013, the first 60 GHz wireless solutions for WLANs appeared on the market: wireless docking stations using the WiGig protocol. Today, in 2019, 60 GHz communications have gained importance with the IEEE 802.11ad amendment and various products on the market, including routers, laptops, and wireless Ethernet solutions. More importantly, mmWave networks are going to be used in next-generation cellular networks, where smartphones will use the 28 GHz band, and 60 GHz communications have been proposed for backbone links due to their higher directionality and unlicensed use. This thesis fits into this frame of constant development of the mmWave bands to meet the latency and throughput needs of future communications. We first characterize the cost-effective design of COTS (commercial off-the-shelf) 60 GHz devices and then address their two main weaknesses: their limited link distance and their non-ideal spatial reuse. It is critical to take the cost-effective design of COTS devices into consideration when designing networking mechanisms. This is why we present a first-of-its-kind analysis of COTS 60 GHz devices, studying the D5000 WiGig docking station and the TP-Link Talon IEEE 802.11ad router. We include static measurements, such as the synthesized beam patterns of these devices, and an analysis of the area-wide coverage they can provide. We perform a spatial reuse analysis and study the performance of these devices under user mobility, showing how robust the link can be under user movement. We also study the feasibility of flying mmWave links, mounting a 60 GHz COTS device on a drone and performing different measurement campaigns. In this first analysis, we see that these 60 GHz devices leave a large performance gap in achievable communication range as well as very low spatial reuse; however, they are still suitable for low-density WLANs and for next-generation aerial micro cell stations. Seeing that these COTS devices are not as directional as the literature suggests, we analyze how the channels are not as frequency-stable as expected due to the large number of reflected signals. Ideally, frequency-selective techniques could be used on these frequency-selective channels to extend the range of these 60 GHz devices. To validate this, we measure real-world 60 GHz indoor channels with a bandwidth of 2 GHz and study their behavior with respect to techniques such as bit loading, subcarrier switch-off, and water-filling. To this end, we consider an Orthogonal Frequency-Division Multiplexing (OFDM) channel as defined in the IEEE 802.11ad standard and show that these techniques are indeed highly beneficial in mmWave networks, allowing a range extension of up to 50%, equivalent to power savings of up to 7 dB. To increase the very limited spatial reuse of these wireless networks, we propose a centralized system that allows the network to carry out the beam training process not only to maximize received power but also to take other stations into account and minimize interference. The system is designed to work with unmodified clients. We implement and validate our system on commercial off-the-shelf IEEE 802.11ad hardware, achieving an average throughput gain of 24.67% for TCP traffic, and up to a twofold throughput gain in specific cases.
    PhD Programme in Multimedia and Communications, Universidad Carlos III de Madrid and Universidad Rey Juan Carlos. Chair: Andrés García Saavedra. Secretary: Matilde Pilar Sánchez Fernández. Committee member: Ljiljana Simi
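
    To make the water-filling idea concrete, the sketch below allocates power across OFDM subcarriers by bisecting on the water level; the subcarrier count and channel gains are synthetic placeholders rather than the measured 60 GHz channels from the thesis.

```python
import numpy as np

# Water-filling power allocation over OFDM subcarriers. Subcarriers whose
# inverse CNR exceeds the water level receive zero power (switch-off).
rng = np.random.default_rng(2)
n_sc = 64                                         # assumed number of subcarriers
gain = np.abs(rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc)) ** 2
noise = 1.0
inv_cnr = noise / gain                            # inverse carrier-to-noise ratio per subcarrier
P_total = float(n_sc)                             # total power budget (1 unit per subcarrier on average)

# Bisection on the water level mu so that sum(max(mu - inv_cnr, 0)) == P_total.
lo, hi = 0.0, inv_cnr.max() + P_total
for _ in range(100):
    mu = (lo + hi) / 2
    if np.maximum(mu - inv_cnr, 0.0).sum() > P_total:
        hi = mu
    else:
        lo = mu

power = np.maximum(mu - inv_cnr, 0.0)             # per-subcarrier power (zero = switched off)
rate = np.log2(1.0 + power * gain / noise).sum()  # achievable sum rate in bit/s/Hz
print(f"active subcarriers: {(power > 0).sum()}/{n_sc}, sum rate: {rate:.1f} bit/s/Hz")
```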

    Hybrid Satellite-Terrestrial Communication Networks for the Maritime Internet of Things: Key Technologies, Opportunities, and Challenges

    Get PDF
    With the rapid development of marine activities, there has been an increasing number of maritime mobile terminals, as well as a growing demand for high-speed and ultra-reliable maritime communications to keep them connected. Traditionally, the maritime Internet of Things (IoT) is enabled by maritime satellites. However, satellites are seriously restricted by their high latency and relatively low data rates. As an alternative, shore- and island-based base stations (BSs) can be built to extend the coverage of terrestrial networks using fourth-generation (4G), fifth-generation (5G), and beyond-5G services. Unmanned aerial vehicles can also be exploited to serve as aerial maritime BSs. Despite all these approaches, there are still open issues for an efficient maritime communication network (MCN). For example, due to the complicated electromagnetic propagation environment, the limited geometrically available BS sites, and the rigorous service demands of mission-critical applications, conventional communication and networking theories and methods should be tailored to maritime scenarios. Towards this end, we provide a survey on the demand for maritime communications, the state-of-the-art MCNs, and key technologies for enhancing transmission efficiency, extending network coverage, and provisioning maritime-specific services. Future challenges in developing an environment-aware, service-driven, and integrated satellite-air-ground MCN that is smart enough to utilize external auxiliary information, e.g., sea state and atmospheric conditions, are also discussed.

    LiDAR aided simulation pipeline for wireless communication in vehicular traffic scenarios

    Get PDF
    Integrated Sensing and Communication (ISAC) is a modern technology under development for Sixth Generation (6G) systems. This thesis focuses on creating a simulation pipeline for dynamic vehicular traffic scenarios and a novel approach to reducing wireless communication overhead with a Light Detection and Ranging (LiDAR) based system. The simulation pipeline can be used to generate data sets for numerous problems. Additionally, the developed error model for vehicle detection algorithms can be used to characterize LiDAR performance with respect to different parameters such as LiDAR height, range, and laser point density. LiDAR behavior in a traffic environment is presented as part of the results of this study. A periodic beam index map is developed by capturing the antenna azimuth and elevation angles that yield the maximum Reference Signal Received Power (RSRP) for a simulated receiver grid on the road, and by classifying areas with a Support Vector Machine (SVM) so as to reduce the number of Synchronization Signal Blocks (SSBs) that need to be sent in Vehicle-to-Infrastructure (V2I) communication. This approach effectively reduces the wireless communication overhead in V2I communication.
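
    The beam-index-map step can be sketched as a position-to-beam classification problem, as below; the receiver positions, the labelling rule, and the SVM hyperparameters are illustrative assumptions, not the ray-traced RSRP data used in the thesis.

```python
import numpy as np
from sklearn.svm import SVC

# Classify (x, y) receiver positions on the road into the beam index with the
# highest RSRP, so only a few SSBs need to be swept for a vehicle at a known
# position. Labels below come from a hypothetical geometric rule, not ray tracing.
rng = np.random.default_rng(3)
positions = rng.uniform(-100.0, 100.0, size=(500, 2))       # (x, y) points on the road [m]

# Hypothetical labelling: the best beam follows the azimuth toward a base
# station at the origin, quantized into 8 sectors.
azimuth = np.arctan2(positions[:, 1], positions[:, 0])
best_beam = ((azimuth + np.pi) / (2 * np.pi) * 8).astype(int) % 8

clf = SVC(kernel="rbf", C=10.0).fit(positions, best_beam)   # train the position-to-beam classifier
print(clf.predict([[50.0, 20.0]]))                          # predicted beam index for a new vehicle position
```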

    The First Multichroic Receiver and Results from ACTPol.

    Full text link
    The Cosmic Microwave Background (CMB) is a unique and powerful tool for the study of cosmology and fundamental physics. The next frontier of CMB research is to extract the wealth of cosmological information available from its polarization. Accurate measurement of this polarization signal will enable us to probe inflation; provide an alternative means to measure the neutrino mass sum and the number of neutrino species; improve our understanding of dark energy; explore the reionization history of our Universe; probe large-scale structure through gravitational lensing; and enable a multitude of other astrophysical studies. The polarized signatures of the early universe are extremely weak, dominated by foregrounds, and their measurement is susceptible to instrumental effects. Extracting the information contained in these faint signals requires instruments with high sensitivity, excellent control over systematic errors, and careful data analysis. The Atacama Cosmology Telescope Polarimeter (ACTPol) is a state-of-the-art experiment that measures CMB polarization on fine angular scales from the Atacama desert in Chile. In this thesis, I present an overview of the project and then describe my work on it, including the development of a new polarization-sensitive dichroic camera for ACTPol designed to increase the sensitivity of CMB telescopes and enable high-precision measurements of CMB polarization; the development of novel metamaterial antireflection coatings for silicon lenses; diffraction from panel gaps; calibration of detector pass-bands; and a detailed analysis of the polarization properties of extragalactic point sources discovered in the ACTPol data. I conclude with a discussion of the science of ACTPol and the impact of my technical work on future CMB experiments.
    Ph.D., Physics, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/135767/1/dattar_1.pd

    A Modern Method to Improve of Detecting and Categorizing Mechanism for Micro Seismic Events Data Using Boost Learning System

    Get PDF
    Various natural disasters such as floods, fires, and earthquakes have affected human life. The detection and classification of large and small earthquakes caused by natural or abnormal events has always been important to Earth scientists. One of the most important research challenges in this field is the lack of an effective method for identifying and categorizing various types of seismic events at less important and important levels. Based on the latest findings of international data mining institutions such as Rexer, KDnuggets, and Gartner, as well as recent authoritative articles, SVM, KNN, C4.5, and MLP are among the most important, popular, and leading classifiers in the data mining world. Therefore, in the present study, a boost learning system consisting of support vector machine algorithms with linear regression, an MLP neural network, a C4.5 decision tree, and KNN nearest-neighbour classification has been utilized in combined form to detect and categorize micro-seismic events. In general, the steps of the proposed method are: 1) performing artificial seismic tests; 2) data gathering and analysis; 3) conducting pre-processing and separating training and testing samples; 4) generating the relevant models from the training samples and detecting and clustering the test samples; and 5) extracting the cluster with the most candidates using boost learning. After the simulations, the accuracy of the proposed boost method with respect to the best answer was about 6.1% higher than that of the other methods, with a recall error rate of 0.082%. The detection and classification accuracy with respect to the best answer also improved over the other methods by up to 2.31% and 6.34%, respectively.
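
    A combined classifier in the spirit of the described boost learning system can be sketched as a soft-voting ensemble of SVM, KNN, decision tree, and MLP learners, as below; synthetic features stand in for the micro-seismic data, and the aggregation rule is an assumption, not the paper's exact method.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Ensemble of the four classifier families named above; soft voting averages the
# predicted class probabilities. Synthetic data replaces the seismic features.
X, y = make_classification(n_samples=1000, n_features=20, n_classes=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("svm", SVC(kernel="linear", probability=True)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(max_depth=8)),            # C4.5-style tree (CART in scikit-learn)
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print(f"ensemble accuracy on held-out data: {ensemble.score(X_test, y_test):.3f}")
```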