
    Design and Performance Analysis of Next Generation Heterogeneous Cellular Networks for the Internet of Things

    The Internet of Things (IoT) is a system of interconnected computing devices, objects, and mechanical and digital machines, together with the communications between these devices/objects and other Internet-enabled systems. Scalable, reliable, and energy-efficient IoT connectivity will bring huge benefits to society, especially in transportation, connected self-driving vehicles, healthcare, education, smart cities, and smart industries. The objective of this dissertation is to model and analyze the performance of large-scale heterogeneous two-tier IoT cellular networks and to offer design insights that maximize their performance. Using stochastic geometry, we develop realistic yet tractable models to study the performance of such networks. In particular, we propose solutions to the following research problems:
    - We propose a novel analytical model to estimate the mean uplink device data rate utility function under two spectrum allocation schemes, full spectrum reuse (FSR) and orthogonal spectrum partition (OSP), for uplink two-hop IoT networks. We develop constrained gradient ascent optimization algorithms to obtain the optimal aggregator association bias (for the FSR scheme) and the jointly optimal spectrum partition ratio and aggregator association bias (for the OSP scheme).
    - We study the performance of two-tier IoT cellular networks in which one tier operates in the traditional sub-6 GHz spectrum and the other in the millimeter-wave (mm-wave) spectrum. In particular, we characterize the meta distributions of the downlink signal-to-interference ratio (SIR, sub-6 GHz spectrum), the signal-to-noise ratio (SNR, mm-wave spectrum), and the data rate of a typical device in such a hybrid spectrum network. These meta distributions of the SIR/SNR and data rate are obtained by substituting the cumulative moments of the conditional success probability (CSP) of a user device into the Gil-Pelaez inversion theorem.
    - We propose to split the control plane (C-plane) and user plane (U-plane) as a potential solution to harvest densification gains in heterogeneous two-tier networks while minimizing the handover rate and network control overhead. We develop a tractable mobility-aware model for a two-tier downlink cellular network with high-density small cells and a C-plane/U-plane split architecture. The developed model is then used to quantify the effect of mobility on the foreseen densification gain with and without C-plane/U-plane splitting.
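The meta-distribution analysis above rests on the Gil-Pelaez inversion theorem, which recovers a distribution function from the imaginary part of its characteristic function. As an illustrative sketch (not the dissertation's specific derivation), the following Python snippet inverts a characteristic function numerically; the standard normal is used purely as a sanity check:

```python
import numpy as np
from math import erf, sqrt

def gil_pelaez_cdf(phi, x, t_max=50.0, n=20000):
    """Gil-Pelaez inversion: F(x) = 1/2 - (1/pi) * int_0^inf Im(e^{-itx} phi(t)) / t dt."""
    t = np.linspace(1e-6, t_max, n)                  # skip the singularity at t = 0
    integrand = np.imag(np.exp(-1j * t * x) * phi(t)) / t
    dt = t[1] - t[0]
    # trapezoidal rule over the sampled integrand
    integral = (integrand.sum() - 0.5 * (integrand[0] + integrand[-1])) * dt
    return 0.5 - integral / np.pi

# Sanity check: the standard normal has characteristic function exp(-t^2/2)
phi_normal = lambda t: np.exp(-t ** 2 / 2)
approx = gil_pelaez_cdf(phi_normal, 1.0)
exact = 0.5 * (1 + erf(1 / sqrt(2)))                 # Phi(1) ~ 0.8413
```

In the dissertation's setting, `phi` would be built from the cumulative moments of the CSP rather than a textbook characteristic function; the inversion step itself is the same.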

    Proactive Received Power Prediction Using Machine Learning and Depth Images for mmWave Networks

    This study demonstrates the feasibility of proactive received power prediction that leverages spatiotemporal visual sensing information to enable reliable millimeter-wave (mmWave) networks. Since the received power on a mmWave link can attenuate aperiodically due to human blockage, a long-term series of future received power values cannot be predicted by analyzing the received signals before the blockage occurs. We propose a novel mechanism that predicts a time series of the received power from the next moment up to several hundred milliseconds ahead. The key idea is to leverage camera imagery and machine learning (ML). Time-sequential images capture the spatial geometry and mobility of the obstacles that shape mmWave signal propagation. ML is used to build the prediction model from a dataset of sequential images, each labeled with the received power measured several hundred milliseconds after the image was obtained. Simulation and experimental evaluations using IEEE 802.11ad devices and a depth camera show that the proposed mechanism, employing a convolutional LSTM, predicted a time series of the received power up to 500 ms ahead at an inference time of less than 3 ms, with a root-mean-square error of 3.5 dB.
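The labeling scheme described above, pairing each image sequence with the received power measured some horizon after the last frame, can be sketched as follows. The frame rate, sequence length, and 500 ms horizon are illustrative assumptions, and the data here are synthetic:

```python
import numpy as np

def build_dataset(frames, rx_power_dbm, fps=30, seq_len=8, horizon_ms=500):
    """frames: (T, H, W) depth images; rx_power_dbm: (T,) received power
    sampled at the same instants as the frames. Each training input is a
    window of seq_len frames; its label is the power horizon_ms after the
    window's last frame."""
    horizon = int(round(horizon_ms / 1000 * fps))    # label offset in frames
    X, y = [], []
    for t in range(len(frames) - seq_len - horizon + 1):
        X.append(frames[t : t + seq_len])            # input image sequence
        y.append(rx_power_dbm[t + seq_len - 1 + horizon])  # future power label
    return np.stack(X), np.array(y)

# Toy data: 100 synthetic 16x16 depth frames with a synthetic power trace.
frames = np.random.rand(100, 16, 16).astype(np.float32)
power = -60 + 3.5 * np.sin(np.arange(100) / 10)
X, y = build_dataset(frames, power)
```

A convolutional LSTM (as in the paper) would then be trained on `(X, y)`; the dataset construction is the part the abstract makes explicit, so only that step is sketched here.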

    Implementation and Evaluation in System Generator of a Cooperative System for Future 5G Systems

    With the arrival of 5G, a proliferation of services is expected in fields such as healthcare, utility applications, industrial automation, and 4K streaming that previous-generation networks cannot provide. Additionally, the total number of wireless communication devices will grow to such an extent that the already scarce available frequency bandwidth will not be enough to meet the intended objectives. Cisco's 2018 Annual Internet Report predicts that by 2023 there will be nearly 30 billion devices capable of wireless communication. Due to this exponential expansion of both services and devices, the challenges for network data capacity and efficient radio resource use will be greater than ever, so the need for solutions is urgent. Both wireless capacity and spectral efficiency are related to cell size and the users' proximity to the access point; shortening the distance between transmitter and receiver improves both aspects of the network. This concept motivates the deployment of heterogeneous networks (HetNets), which are composed of many different small cells (SCs) overlaid on the coverage area of a conventional macro cell, shortening the distance between cell users and their access point transceivers and granting better coverage and higher data rates. However, the potential of HetNets does not come without challenges, as these networks suffer considerably from inter-cell interference. Although several interference management algorithms that allow coexistence between cells have been proposed in recent years, most of them were evaluated in software simulations rather than implemented on real-time platforms. Therefore, this master's thesis aims to take the first step toward the implementation and evaluation of an interference mitigation technique in hardware.
    Specifically, a downlink scenario is assumed, composed of a macro-cell base station, a macro-cell primary user, and a small-cell user, with the aim of implementing an algorithm that eliminates the downlink interference that the base station may cause to the secondary user. The study was carried out using the System Generator DSP tool, which generates hardware code from schematics created in it. The tool also offers a wide range of blocks that aid the creation and, fundamentally, the simulation and study of the system to be implemented before it is translated into hardware. The results obtained in this work are a faithful representation of the behavior of the implemented system and can be used in a future FPGA application.
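As a generic illustration of the downlink interference cancellation idea, and not the thesis's System Generator implementation, the following Python sketch shows how subtracting a known macro-cell component from the received signal improves detection of the small-cell user's symbols. All channel gains and noise levels are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# BPSK symbols: the macro BS signal is interference at the small-cell user;
# the small-cell signal is the desired one. Flat channel gains assumed known.
s_macro = rng.choice([-1.0, 1.0], size=n)
s_small = rng.choice([-1.0, 1.0], size=n)
h_macro, h_small = 0.8, 1.0
noise = 0.3 * rng.standard_normal(n)
received = h_macro * s_macro + h_small * s_small + noise

# Without cancellation: hard-decide the small-cell symbol from the raw signal.
ber_raw = np.mean(np.sign(received) != s_small)
# With cancellation: subtract the (assumed perfectly reconstructed) macro term.
cleaned = received - h_macro * s_macro
ber_clean = np.mean(np.sign(cleaned) != s_small)
```

In hardware, the same subtraction would be realized with System Generator blocks operating on sampled signals; the perfect-knowledge assumption here idealizes the reconstruction step.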

    Performance Analysis and Learning Algorithms in Advanced Wireless Networks

    Over the past decade, wireless data traffic has grown exponentially, with multimedia becoming the dominant traffic type, and this growth is expected to continue in the near future. It has led to an increasing demand for high-rate wireless communications. Key solutions for addressing this demand include extreme network densification with more small cells; the utilization of high-frequency bands, such as the millimeter-wave (mmWave) and terahertz (THz) bands, where more bandwidth is available; and unmanned aerial vehicle (UAV)-enabled cellular networks. With this motivation, several types of advanced wireless networks are considered in this thesis. In particular, mmWave cellular networks, networks with hybrid THz, mmWave, and microwave transmissions, and UAV-enabled networks are studied, and performance metrics such as the signal-to-interference-plus-noise ratio (SINR) coverage, energy coverage, and area spectral efficiency are analyzed. In addition, UAV path planning in cellular networks is investigated, and deep reinforcement learning (DRL) based algorithms are proposed to find collision-free UAV trajectories that accomplish different missions. In the first part of this thesis, mmWave cellular networks are considered. First, K-tier heterogeneous mmWave cellular networks with user-centric small-cell deployments are studied. In particular, a heterogeneous network model is considered in which user equipments (UEs) are distributed according to Poisson cluster processes (PCPs). Distinguishing features of mmWave communications, including directional beamforming and a detailed path loss model, are taken into account. General expressions for the association probabilities of the different tiers' base stations (BSs) are determined. Using tools from stochastic geometry, the Laplace transform of the interference is characterized, and general expressions for the SINR coverage probability and area spectral efficiency are derived.
    Second, a distributed multi-agent learning-based beamforming algorithm for mmWave multiple-input multiple-output (MIMO) networks is proposed to maximize the sum rate of all UEs. Following the analysis of mmWave cellular networks, a three-tier heterogeneous network is considered in which access points (APs), small-cell BSs (SBSs), and macrocell BSs (MBSs) transmit in the THz, mmWave, and microwave frequency bands, respectively. Using tools from stochastic geometry, the complementary cumulative distribution function (CCDF) of the received signal power, the Laplace transform of the aggregate interference, and the SINR coverage probability are determined. Next, the system-level performance of UAV-enabled cellular networks is studied. More specifically, in the first part, UAV-assisted mmWave cellular networks are addressed, in which the UE locations are modeled using PCPs. In the downlink phase, simultaneous wireless information and power transfer (SWIPT) is considered, and the association probability, the energy coverage, and a successful transmission probability that jointly captures the energy and SINR coverages are derived. In the uplink phase, a scenario is considered in which each UAV receives information from its own cluster-member UEs, and the Laplace transforms of the interference components and the uplink SINR coverage are characterized. In the second part, cellular-connected UAV networks are investigated, in which the UAVs are aerial UEs served by ground base stations (GBSs); a 3D antenna radiation pattern combining the vertical and horizontal patterns is taken into account. In the final part of this thesis, deep reinforcement learning based algorithms are proposed for UAV path planning in cellular networks. In the first part, multi-UAV non-cooperative scenarios are considered, in which multiple UAVs must fly from initial locations to destinations while satisfying collision avoidance, wireless connectivity, and kinematic constraints.
The goal is to find trajectories for the cellular-connected UAVs that minimize their mission completion time. The multi-UAV trajectory optimization problem is formulated as a sequential decision-making problem, and a decentralized DRL approach is proposed to solve it. Moreover, multi-UAV trajectory design in cellular networks with a dynamic jammer is studied, and a learning-based algorithm is proposed. Subsequently, a UAV trajectory optimization problem is considered to maximize the data collected from multiple Internet of Things (IoT) nodes under realistic constraints. The problem is translated into a Markov decision process (MDP), and a dueling double deep Q-network (D3QN) is proposed to learn the decision-making policy.
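A minimal Monte Carlo sketch of the kind of stochastic-geometry coverage analysis described above: base stations drawn from a Poisson point process, nearest-BS association, and Rayleigh fading, checked against the well-known closed-form SIR coverage expression for path-loss exponent 4 in the interference-limited case. The density, disk radius, and trial count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def sir_coverage(theta_db, alpha=4.0, lam=1.0, radius=15.0, trials=3000):
    """Monte Carlo SIR coverage probability for a downlink PPP network with
    nearest-BS association and Rayleigh fading; noise is neglected."""
    theta = 10 ** (theta_db / 10)
    hits = 0
    for _ in range(trials):
        n_bs = rng.poisson(lam * np.pi * radius ** 2)   # BSs in a finite disk
        r = radius * np.sqrt(rng.random(n_bs))          # distances to the typical user
        h = rng.exponential(size=n_bs)                  # Rayleigh fading powers
        p = h * r ** (-alpha)
        k = np.argmin(r)                                # serving (nearest) BS
        hits += p[k] / (p.sum() - p[k]) > theta
    return hits / trials

# Closed form for alpha = 4: Pc = 1 / (1 + sqrt(theta) * (pi/2 - arctan(1/sqrt(theta)))).
# At theta = 1 (0 dB) this gives 1 / (1 + pi/4).
analytic = 1 / (1 + (np.pi / 2 - np.arctan(1.0)))
mc = sir_coverage(0.0)
```

The finite disk truncates far interference, which is negligible here because path loss with exponent 4 decays quickly; the simulated coverage should match the analytic value to within Monte Carlo noise.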

    Designing problem-specific operators for solving the Cell Switch-Off problem in ultra-dense 5G networks with hybrid MOEAs

    The massive deployment of base stations is one of the key pillars of the fifth generation (5G) of mobile communications. However, this network densification entails high energy consumption that must be addressed to enhance the sustainability of the industry. This work tackles the problem from a multi-objective optimization perspective, in which both energy efficiency and quality-of-service criteria are taken into account. To do so, several new problem-specific operators have been designed to engineer hybrid multi-objective evolutionary metaheuristics (MOEAs) that bring expert domain knowledge into the algorithms' search. These hybrid approaches improve upon the canonical versions of the algorithms, clearly demonstrating the contribution of our approach. Furthermore, this paper tests the hypothesis that hybridization with several of those problem-specific operators simultaneously can enhance the search of MOEAs endowed with only a single one.
    This work has been partially funded by the Spanish Ministry of Science and Innovation via grant PID2020-112545RB-C54, by the European Union NextGenerationEU/PRTR under grants TED2021-131699B-I00 and TED2021-129938B-I00 (MCIN/AEI/10.13039/501100011033, FEDER), and by the Andalusian PAIDI program with grants A-TIC-608-UGR20, P18.RT.4830, and PYC20-RE-012-UGR. The authors also thank the Supercomputing and Bioinformatics Center of the Universidad de Málaga for providing its services and the Picasso supercomputer facilities used to perform the experiments (http://www.scbi.uma.es/). Funding for open access charge: Universidad de Málaga / CBUA.
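The multi-objective view taken here (energy efficiency versus quality of service) rests on Pareto dominance. The sketch below, with entirely hypothetical objective values, extracts the non-dominated set from a population of candidate on/off configurations, which is the core comparison any MOEA for the cell switch-off problem performs:

```python
import numpy as np

rng = np.random.default_rng(2)

def pareto_front(points):
    """Indices of the non-dominated points when minimizing both objectives."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and (q[0] < p[0] or q[1] < p[1])
            for j, q in enumerate(points)
            if j != i
        )
        if not dominated:
            front.append(i)
    return front

# Toy cell switch-off instance (hypothetical objectives): each bit vector
# switches 10 base stations on/off; objective 1 is energy (active BS count)
# and objective 2 is a synthetic coverage-loss term that grows as BSs turn off.
configs = rng.integers(0, 2, size=(50, 10))
energy = configs.sum(axis=1).astype(float)
coverage_loss = (10 - energy) + 0.5 * rng.random(50)
points = np.column_stack([energy, coverage_loss])
front = pareto_front(points)
```

A real MOEA such as NSGA-II would apply this non-dominated sorting each generation, with the paper's problem-specific operators generating the candidate configurations.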