22 research outputs found

    FBMC-based random access signal design and detection for LEO base stations

    Get PDF
    The integration of non-terrestrial networks into the 5G ecosystem is mainly driven by the possibility of provisioning service in remote areas. In this context, the advent of flying base stations in low Earth orbit (LEO) will enable anywhere, anytime connectivity. To materialize this vision, it is of utmost importance to improve radio protocols with the aim of allowing direct satellite access. Bearing this in mind, we present a new random access signal, based on the filter bank multicarrier (FBMC) waveform, together with a computationally efficient detection scheme. The proposed solution outperforms the standardized access scheme based on single-carrier frequency division multiplexing (SC-FDM) by reducing out-of-band (OOB) emissions and the missed detection probability in the presence of the very high carrier frequency offset (CFO) inherent to LEO satellite systems. The improvement stems from the fine frequency resolution of the detector and the use of pulse shaping techniques. Interestingly, the FBMC-based random access signal achieves a high level of commonality with 5G New Radio, as the preamble generation method and the time-frequency allocation pattern can be kept unchanged. Concerning practical implementation aspects, the complexity of the detector is similar for both SC-FDM and FBMC. This paper is part of the R+D+i project (PID2020-115323RB-C31) funded by MCIN/AEI/10.13039/501100011033. This work is supported by a grant from the Spanish Ministry of Economic Affairs and Digital Transformation and the European Union - NextGenerationEU (UNICO-5G I+D/AROMA3D-Space (TSI-063000-2021-70)). Peer reviewed. Postprint (author's final draft).
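    The core idea behind a CFO-robust detector with fine frequency resolution can be illustrated with a small sketch: correlate the received signal against replicas of the known preamble shifted over a grid of CFO hypotheses, then pick the hypothesis and lag with the largest normalized correlation. This is a generic illustration under simplifying assumptions (time-domain correlation, flat channel), not the paper's FBMC detector; all names and parameters are hypothetical.

```python
import numpy as np

def detect_preamble(rx, preamble, cfo_grid_hz, fs):
    """Generic CFO-hypothesis preamble detection (illustrative sketch).

    Correlates rx against replicas of the known preamble, each shifted
    by one CFO hypothesis; returns (metric, cfo_hat, lag_hat).
    """
    n = np.arange(len(preamble))
    best = (0.0, None, None)
    for cfo in cfo_grid_hz:
        # Frequency-shifted replica for this CFO hypothesis
        replica = preamble * np.exp(2j * np.pi * cfo * n / fs)
        corr = np.abs(np.correlate(rx, replica, mode="valid"))
        lag = int(np.argmax(corr))
        metric = corr[lag] / (np.linalg.norm(rx) * np.linalg.norm(replica) + 1e-12)
        if metric > best[0]:
            best = (metric, cfo, lag)
    return best
```

A denser CFO grid trades complexity for finer frequency resolution, which is the knob the abstract alludes to.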

    LTE Synchronization Algorithms

    Get PDF
    The Third Generation Partnership Project (3GPP) Long Term Evolution (LTE) provides high spectral efficiency, high peak data rates and frequency flexibility with low latency and low cost. The use of modulation techniques such as OFDMA and SC-FDMA enables LTE to achieve these requirements. However, it is well known that OFDM systems like LTE are very sensitive to carrier frequency offsets and symbol synchronization errors. In order to transfer data correctly, the User Equipment (UE) must perform both uplink and downlink synchronization with the base station (eNodeB). In this thesis, the Primary Synchronization Signal (PSS) and Secondary Synchronization Signal (SSS) are used to detect the cell ID of the best serving base station and to achieve downlink synchronization, by estimating the frame timing and carrier frequency offset, whereas the Physical Random Access Channel (PRACH) preamble is used to obtain uplink synchronization. A non-coherent detection approach is followed for the detection of both the PSS and SSS signals.
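    The non-coherent PSS search described above can be sketched as a magnitude correlation against the three standardized Zadoff-Chu roots; the root/lag pair with the largest correlation magnitude gives part of the cell ID and the symbol timing. This is a simplified time-domain sketch (the real PSS is mapped onto subcarriers with the DC element punctured, which is omitted here); function names are illustrative.

```python
import numpy as np

ZC_LEN = 63  # LTE PSS is built from a length-63 Zadoff-Chu sequence

def zadoff_chu(root, length=ZC_LEN):
    """Odd-length Zadoff-Chu sequence for the given root index."""
    n = np.arange(length)
    return np.exp(-1j * np.pi * root * n * (n + 1) / length)

def detect_pss(rx):
    """Non-coherent PSS detection: pick the root and timing whose
    correlation magnitude is largest (no phase/channel knowledge used)."""
    best = (0.0, None, None)
    for root in (25, 29, 34):  # the three standardized PSS roots
        ref = zadoff_chu(root)
        corr = np.abs(np.correlate(rx, ref, mode="valid"))
        lag = int(np.argmax(corr))
        if corr[lag] > best[0]:
            best = (corr[lag], root, lag)
    return best[1], best[2]
```

Because only the correlation magnitude is used, the detector needs no carrier phase reference, which is what "non-coherent" refers to.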

    Physical Layer Techniques for Wireless Communication Systems

    Get PDF
    The increasing diffusion of mobile devices requiring, anywhere and at any time, reliable connections able to support the most common applications has driven, in recent years, the deployment of telecommunication networks based on technologies capable of responding effectively to an ever-increasing market demand that is still far from saturation. Multicarrier transmission techniques, employed in standards for local networks (Wi-Fi) and metropolitan networks (WiMAX) and a hot research topic for many years, have been definitively adopted beginning with the fourth generation of cellular systems (LTE). If, on one hand, the adoption of multicarrier signaling has brought significant advantages in counteracting the detrimental effects of particularly harsh propagation channels, on the other hand it has imposed very strict requirements on the recovery of the carrier frequency offset (CFO), due to the impact of residual CFO on correct signal detection. The main focus of the thesis falls in this area, investigating aspects of synchronization procedures for systems based on multicarrier signaling. Particular reference is made to a network entry procedure for LTE networks and to CFO recovery for OFDM, filtered multitone modulation and direct conversion receivers. Other contributions pertaining to physical layer issues for communication systems, over both radio and acoustic carriers, conclude the thesis.
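    As a concrete example of the kind of CFO recovery discussed, the classic cyclic-prefix correlation estimator for OFDM exploits the fact that the CP repeats the tail of the symbol: the phase accumulated between the two copies reveals the fractional CFO. A minimal numpy sketch of this textbook method (a generic illustration, not necessarily the thesis' specific algorithm):

```python
import numpy as np

def cfo_estimate_cp(rx, n_fft, n_cp):
    """Fractional CFO estimate from cyclic-prefix correlation.

    The CP samples are a copy of the last n_cp samples of the symbol,
    so rx[n] * conj(rx[n + n_fft]) accumulates a phase of -2*pi*eps,
    where eps is the CFO in units of the subcarrier spacing.
    """
    acc = np.sum(rx[:n_cp] * np.conj(rx[n_fft:n_fft + n_cp]))
    return -np.angle(acc) / (2 * np.pi)
```

The estimator is unambiguous only for |eps| < 0.5 subcarrier spacings; the integer part of the CFO must be resolved separately.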

    NB-IoT via non terrestrial networks

    Get PDF
    Massive Internet of Things is expected to play a crucial role in Beyond 5G (B5G) wireless communication systems, offering seamless connectivity among heterogeneous devices without human intervention. However, given the exponential proliferation of smart devices and IoT networks, relying solely on terrestrial networks may not fully meet the demanding IoT requirements in terms of bandwidth and connectivity, especially in areas where terrestrial infrastructure is not economically viable. To unleash the full potential of 5G and B5G networks and enable seamless connectivity everywhere, the 3GPP envisions the integration of Non-Terrestrial Networks (NTNs) with terrestrial ones starting from Release 17. However, this integration requires modifications to the 5G standard to ensure reliable communications despite typical satellite channel impairments. In this framework, this thesis proposes techniques at the Physical and Medium Access Control layers that require minimal adaptations of the current NB-IoT standard for use via NTN. First, the satellite impairments are evaluated and a detailed link budget analysis is provided. Then, analyses at the link and system levels are conducted. At the link level, a novel algorithm leveraging time-frequency analysis is proposed to detect orthogonal preambles and estimate the signals' arrival time. In addition, the effects of collisions on the detection probability and Bit Error Rate are investigated, and Non-Orthogonal Multiple Access approaches are proposed for the random access and data phases. The system-level analysis evaluates the performance of random access under congestion. Various access parameters are tested in different satellite scenarios, and the performance is measured in terms of access probability and the time required to complete the procedure.
Finally, a heuristic algorithm is proposed to jointly design the access and data phases, determining the number of satellite passages, the Random Access Periodicity, and the number of uplink repetitions that maximize the system's spectral efficiency.
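    The flavor of time-frequency arrival-time estimation can be conveyed with a crude sketch: compute a short-time Fourier transform of the received signal and declare the arrival at the first frame whose energy in the preamble's frequency bin exceeds a fraction of the peak. This is a toy illustration only, not the thesis' novel algorithm; the window, hop, threshold and bin parameters are all assumptions.

```python
import numpy as np

def stft_mag(x, win=32, hop=16):
    """Magnitude STFT: Hann-windowed frames, one rFFT per frame."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

def toa_estimate(rx, tone_bin, win=32, hop=16, thresh=0.5):
    """Crude time-frequency ToA: index of the first STFT frame whose
    energy in the preamble's bin exceeds thresh * peak, mapped back
    to a sample index."""
    s = stft_mag(rx, win, hop)[:, tone_bin]
    idx = int(np.argmax(s >= thresh * s.max()))
    return idx * hop
```

The time resolution is limited by the hop size; real detectors refine the coarse STFT estimate with a correlation stage.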

    Modelação comportamental da camada física NB-IoT - Uplink (Behavioral modeling of the NB-IoT physical layer - Uplink)

    Get PDF
    Master's in Electronics and Telecommunications Engineering. The Internet of Things (IoT) refers to a wireless network of interconnected sensors/actuators with data-collecting technologies. Low Power Wide Area Networks (LPWAN) have become popular due to the rapid growth of the IoT market. Narrowband-IoT (NB-IoT), developed by the 3rd Generation Partnership Project (3GPP), is one of these protocols.
    The main objective of this thesis is the implementation of an open-source uplink behavioral simulator based on MATLAB. Its focus is primarily on Layer 1 (physical layer) functionalities, namely turbo coding, Single-Carrier Frequency-Division Multiple Access (SC-FDMA) modulation, channel modeling, SC-FDMA demodulation, channel estimation, equalization and turbo decoding. Channel estimation is performed using known pilot symbols. The channel models used are based on the official 3GPP release specifications. The Bit Error Rate (BER) is calculated in order to evaluate the performance of the turbo encoder and the Zero Forcing (ZF) equalizer, and to compare Binary Phase-Shift Keying (BPSK) and Quadrature Phase-Shift Keying (QPSK) implementations. Furthermore, the MATLAB-generated signal is transmitted using a radio-frequency (RF) front-end consisting of a Universal Software Radio Peripheral (USRP). Afterwards, the signal is received, demodulated and decoded. Finally, the constellation is obtained, the BER is calculated and the results are analyzed.
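    Under a flat-fading-per-subcarrier assumption, the pilot-based LS estimation and ZF equalization steps mentioned above each reduce to a single element-wise division. A minimal sketch (illustrative, not the simulator's actual MATLAB code):

```python
import numpy as np

def ls_estimate(rx_pilots, tx_pilots):
    """Least-squares channel estimate at pilot positions: H_hat = Y / X."""
    return rx_pilots / tx_pilots

def zf_equalize(rx_data, h_est):
    """Zero-forcing equalization: divide out the channel estimate."""
    return rx_data / h_est
```

ZF inverts the channel exactly but amplifies noise on weak subcarriers, which is why the thesis evaluates its BER performance rather than assuming it is optimal.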

    Enabling Technologies for Ultra-Reliable and Low Latency Communications: From PHY and MAC Layer Perspectives

    Full text link
    Future 5th generation networks are expected to enable three key services: enhanced mobile broadband, massive machine type communications, and ultra-reliable and low latency communications (URLLC). As per the 3rd Generation Partnership Project URLLC requirements, the reliability of a single transmission of a 32-byte packet is expected to be at least 99.999% and the latency at most 1 ms. This unprecedented level of reliability and latency will enable various new applications, such as smart grids, industrial automation and intelligent transport systems. In this survey we present potential future URLLC applications and summarize the corresponding reliability and latency requirements. We provide a comprehensive discussion of physical (PHY) and medium access control (MAC) layer techniques that enable URLLC, addressing both licensed and unlicensed bands, and evaluate the relevant PHY and MAC techniques for their ability to improve reliability and reduce latency. We identify that enabling Long Term Evolution to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band, and provide numerical evaluations. Lastly, the paper discusses potential future research directions and challenges in achieving the URLLC requirements.

    Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques

    Get PDF
    The Internet of Things (IoT) is a novel paradigm which is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, thus allowing for the development of new intelligent services available anytime, anywhere, to anyone and anything. Such a vision is also becoming known under the name Machine-to-Machine (M2M), where the absence of human interaction in the system dynamics is further emphasized. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace at which such a framework is being brought to market, the new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, many challenges face current wireless technology. In our research, we address a variety of candidate technologies for enabling such a massive framework. Mainly, we focus on underlay cognitive radio networks as the unlicensed candidate for the IoT. On the other hand, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we survey the new features and the new user equipment categories added to the physical layer of LTE-A. In particular, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion.
    The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve spectral efficiency, all relays are allowed to transmit simultaneously to the destination over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix, and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, as the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme to select the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom. We provide analytical expressions for the outage probability of the system under both random clustering and the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in data rate. In the second part of this thesis, we look into the current efforts by the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we present the new features and the new user equipment categories added to the physical layer of LTE-A. We study some of the challenges facing LTE-A when dealing with Machine Type Communications (MTC). In particular, the MTC Physical Downlink Control Channel (MPDCCH) is among the newly introduced features in LTE-A that carries the downlink control information (DCI) for MTC devices.
    Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that depends, in essence, on Least Squares (LS) estimates of the pilot signals and linear interpolation, for the low-Doppler channels associated with MTC applications.
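    The LS-plus-linear-interpolation idea can be sketched in a few lines: estimate the channel at the pilot subcarriers by LS, then linearly interpolate the real and imaginary parts across the remaining subcarriers, which is adequate when the channel varies slowly (low Doppler). The MPDCCH resource mapping and any two-dimensional time-frequency interpolation are omitted; names are illustrative.

```python
import numpy as np

def ls_interp_channel(rx, tx_pilots, pilot_idx, n_sc):
    """LS channel estimate at pilot subcarriers, linearly interpolated
    to all n_sc subcarriers (real and imaginary parts separately)."""
    h_p = rx[pilot_idx] / tx_pilots          # LS at pilot positions
    k = np.arange(n_sc)
    h_re = np.interp(k, pilot_idx, h_p.real)
    h_im = np.interp(k, pilot_idx, h_p.imag)
    return h_re + 1j * h_im
```

Linear interpolation is exact only when the channel response is linear between pilots; denser pilots or spline interpolation reduce the residual error on more frequency-selective channels.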

    Radio frequency dataset collection system development for location and device fingerprinting

    Get PDF
    Radio-frequency (RF) fingerprinting is a process that uses the minute inconsistencies among manufactured radio transmitters to identify wireless devices. Coupled with location fingerprinting, a machine learning technique that locates devices based on their radio signals, it can uniquely identify and locate both trusted and rogue wireless devices transmitting over the air. This has wide-ranging applications in the Internet of Things, security, and networking fields. To contribute to this effort, this research first builds a software-defined radio (SDR) testbed to collect an RF dataset over LTE and WiFi channels. The developed testbed consists of hardware, namely receivers with multiple antennas, and software that performs signal preprocessing. Several features that can be used for RF device fingerprinting and location fingerprinting, including the received signal strength indicator (RSSI) and channel state information (CSI), are also extracted from the signals. With the developed dataset, several data-driven machine learning algorithms have been implemented and tested for fingerprinting performance evaluation. Overall, experimental results show promising performance, with radio fingerprinting accuracy above 90% and device localization within 1.10 meters.
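    As a toy illustration of the classification stage at the end of such a pipeline, a nearest-centroid classifier over per-device feature vectors (e.g. RSSI across antennas) captures the basic idea of matching an observation to a stored fingerprint; the actual study uses richer features such as CSI and more capable machine learning models. All names here are illustrative.

```python
import numpy as np

def fit_centroids(features, labels):
    """Learn one 'fingerprint' per class: the mean feature vector."""
    classes = np.unique(labels)
    centroids = np.array([features[labels == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict(features, classes, centroids):
    """Assign each sample to the class with the nearest centroid."""
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]
```

Nearest-centroid works when per-device feature clusters are well separated; overlapping clusters are what motivates the heavier data-driven models the abstract mentions.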