31 research outputs found

    Cellular, Wide-Area, and Non-Terrestrial IoT: A Survey on 5G Advances and the Road Towards 6G

    Full text link
    The next wave of wireless technologies is proliferating in connecting things both among themselves and to humans. In the era of the Internet of Things (IoT), billions of sensors, machines, vehicles, drones, and robots will be connected, making the world around us smarter. The IoT will encompass devices that must wirelessly communicate a diverse set of data gathered from the environment for myriad new applications. The ultimate goal is to extract insights from these data and develop solutions that improve quality of life and generate new revenue. Providing large-scale, long-lasting, reliable, and near real-time connectivity is the major challenge in enabling a smart connected world. This paper provides a comprehensive survey of existing and emerging communication solutions for serving IoT applications in the context of cellular, wide-area, and non-terrestrial networks. Specifically, wireless technology enhancements for providing IoT access in fifth-generation (5G) and beyond cellular networks, and communication networks over the unlicensed spectrum, are presented. Aligned with the main key performance indicators of 5G and beyond-5G networks, we investigate solutions and standards that enable energy efficiency, reliability, low latency, and scalability (connection density) of current and future IoT networks. The solutions include grant-free access and channel coding for short-packet communications, non-orthogonal multiple access, and on-device intelligence. Further, a vision of new paradigm shifts in communication networks in the 2030s is provided, and the integration of the associated new technologies, such as artificial intelligence, non-terrestrial networks, and new spectra, is elaborated. Finally, future research directions toward beyond-5G IoT networks are pointed out.
    Comment: Submitted for review to IEEE CS&
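    As background for the channel coding for short-packet communications mentioned above, the finite-blocklength normal approximation (a standard result due to Polyanskiy, Poor, and Verdú, not a formula taken from this abstract) is commonly used to relate the achievable coding rate to the blocklength n and the target block error probability ε over an AWGN channel with SNR γ; the penalty term scaling with 1/√n is what makes classical capacity-based design optimistic for short IoT and URLLC packets:

```latex
% Normal approximation for the maximal coding rate at blocklength n and
% block error probability \epsilon over an AWGN channel with SNR \gamma.
R(n,\epsilon) \approx C(\gamma) - \sqrt{\frac{V(\gamma)}{n}}\, Q^{-1}(\epsilon)
                + \frac{\log_2 n}{2n},
\qquad
C(\gamma) = \log_2(1+\gamma),
\qquad
V(\gamma) = \frac{\gamma(\gamma+2)}{(1+\gamma)^2}\,(\log_2 e)^2 .
```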

    Enabling Technologies for Ultra-Reliable and Low Latency Communications: From PHY and MAC Layer Perspectives

    Full text link
    Future fifth-generation (5G) networks are expected to enable three key services: enhanced mobile broadband, massive machine-type communications, and ultra-reliable and low latency communications (URLLC). As per the 3rd Generation Partnership Project (3GPP) URLLC requirements, the reliability of a single transmission of a 32-byte packet is expected to be at least 99.999% and the latency at most 1 ms. This unprecedented level of reliability and latency will enable various new applications, such as smart grids, industrial automation, and intelligent transport systems. In this survey, we present potential future URLLC applications and summarize the corresponding reliability and latency requirements. We provide a comprehensive discussion of physical (PHY) and medium access control (MAC) layer techniques that enable URLLC, addressing both licensed and unlicensed bands. The paper evaluates the relevant PHY and MAC techniques for their ability to improve reliability and reduce latency. We identify that enabling Long-Term Evolution (LTE) to coexist in the unlicensed spectrum is also a potential enabler of URLLC in the unlicensed band, and we provide numerical evaluations. Lastly, the paper discusses potential future research directions and challenges in achieving the URLLC requirements.
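    As a rough illustration of how the 99.999% reliability and 1 ms latency targets quoted above interact, the sketch below computes the residual error probability after k independent transmission attempts and checks the accumulated airtime against the latency budget. Only the 1e-5 error target and the 1 ms bound come from the 3GPP requirement cited in the abstract; the per-attempt error rate, per-attempt time, and the independence assumption are illustrative placeholders.

```python
# Toy reliability/latency budget check for URLLC-style targets.
# Only TARGET_ERROR and LATENCY_BUDGET_MS come from the quoted 3GPP numbers;
# the per-attempt figures below are assumed for illustration.

TARGET_ERROR = 1e-5          # 99.999% reliability for a 32-byte packet
LATENCY_BUDGET_MS = 1.0      # end-to-end latency bound

per_attempt_error = 1e-2     # assumed block error rate of a single attempt
per_attempt_time_ms = 0.125  # assumed airtime + processing per attempt

# Residual error after k independent attempts is per_attempt_error ** k.
k = 1
while per_attempt_error ** k > TARGET_ERROR:
    k += 1

total_latency_ms = k * per_attempt_time_ms
print(f"attempts needed: {k}, latency used: {total_latency_ms:.3f} ms")
print("meets 1 ms budget" if total_latency_ms <= LATENCY_BUDGET_MS
      else "violates 1 ms budget")
```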

    Real-Time Waveform Prototyping

    Get PDF
    Fifth-generation mobile networks are characterized by diverse requirements and deployment scenarios. Three distinct use cases are particularly relevant: 1) industrial applications demand real-time radio transmission with extremely low outage rates; 2) Internet-of-Things applications require connecting a large number of distributed sensors; and 3) data rates for applications such as video delivery have risen massively. These partly conflicting expectations lead researchers and engineers to consider new concepts and technologies for future wireless communication systems. The goal is to identify promising candidate technologies among a multitude of new ideas and to decide which are suitable for implementation in future products. The challenges of meeting these requirements, however, lie beyond what a single processing layer in a wireless network can offer. Therefore, several research domains have to exploit research ideas jointly. This thesis therefore describes a platform that serves as a basis for future experimental research on wireless networks under real-world conditions. The following three aspects are presented in detail. First, an overview is given of state-of-the-art prototype and testbed solutions, which attract great interest and demand as well as funding opportunities; however, the development effort is considerable and depends strongly on the chosen properties of the platform, and the selection process is complex due to the number of available options and their respective (hidden) implications. A guideline is therefore presented, based on various examples, with the aim of weighing expectations against the effort required for the prototype. Second, a flexible yet real-time-capable signal processor is introduced that runs on a software-defined radio platform. The processor allows important physical layer parameters to be reconfigured at run-time in order to generate a multitude of modern waveforms. Four parameter sets, 'LLC', 'WiFi', 'eMBB', and 'IoT', are presented to reflect the requirements of different wireless applications, and these are then used to evaluate the implementation presented in this work. Third, the introduction of a generic test infrastructure enables the remote involvement of external partners. The testbed can be flexibly tailored to the requirements of diverse wireless technologies and experiments. Using the test infrastructure, the performance of the presented transceiver is evaluated with respect to latency, achievable throughput, and packet error rates.
A public demonstration of a Tactile Internet prototype, using robotic arms in a multi-user environment, was carried out successfully and presented on several occasions.
The demand to achieve higher data rates for the Enhanced Mobile Broadband scenario and novel fifth-generation use cases such as Ultra-Reliable Low-Latency and Massive Machine-Type Communications drives researchers and engineers to consider new concepts and technologies for future wireless communication systems. The goal is to identify promising candidate technologies among a vast number of new ideas and to decide which are suitable for implementation in future products. However, the challenges of meeting those demands are beyond the capabilities a single processing layer in a wireless network can offer. Therefore, several research domains have to collaboratively exploit research ideas. This thesis presents a platform that provides a base for future applied research on wireless networks: firstly, by giving an overview of state-of-the-art prototypes and testbed solutions; secondly, by introducing a flexible yet real-time physical layer signal processor running on a software-defined radio platform, which enables important parameters of the physical layer to be reconfigured during run-time in order to create a multitude of modern waveforms;
and thirdly, by introducing a generic test infrastructure, which can be tailored to prototype diverse wireless technologies and which is remotely accessible in order to invite new ideas from third parties. Using the test infrastructure, the performance of the flexible transceiver is evaluated with regard to latency, achievable throughput, and packet error rates.
Contents:
List of figures; List of tables; Abbreviations; Notations
1 Introduction: 1.1 Wireless applications; 1.2 Motivation; 1.3 Software-Defined Radio; 1.4 State of the art; 1.5 Testbed; 1.6 Summary
2 Background: 2.1 System Model; 2.2 PHY Layer Structure; 2.3 Generalized Frequency Division Multiplexing; 2.4 Wireless Standards (2.4.1 IEEE 802.15.4; 2.4.2 802.11 WLAN; 2.4.3 LTE; 2.4.4 Low Latency Industrial Wireless Communications; 2.4.5 Summary)
3 Wireless Prototyping: 3.1 Testbed Examples (3.1.1 PHY-focused Testbeds; 3.1.2 MAC-focused Testbeds; 3.1.3 Network-focused Testbeds; 3.1.4 Generic Testbeds); 3.2 Considerations; 3.3 Use Cases and Scenarios; 3.4 Requirements; 3.5 Methodology; 3.6 Hardware Platform (3.6.1 Host; 3.6.2 FPGA; 3.6.3 Hybrid; 3.6.4 ASIC); 3.7 Software Platform (3.7.1 Testbed Management Frameworks; 3.7.2 Development Frameworks; 3.7.3 Software Implementations); 3.8 Deployment; 3.9 Discussion; 3.10 Conclusion
4 Flexible Transceiver: 4.1 Signal Processing Modules (4.1.1 MAC Interface; 4.1.2 Encoding and Mapping; 4.1.3 Modem; 4.1.4 Post-modem Processing; 4.1.5 Synchronization; 4.1.6 Channel Estimation and Equalization; 4.1.7 Demapping; 4.1.8 Flexible Configuration); 4.2 Analysis (4.2.1 Numerical Precision; 4.2.2 Spectral Analysis; 4.2.3 Latency; 4.2.4 Resource Consumption); 4.3 Discussion (4.3.1 Extension to MIMO); 4.4 Summary
5 Testbed: 5.1 Infrastructure; 5.2 Automation; 5.3 Software Defined Radio Platform; 5.4 Radio Frequency Front-end (5.4.1 Sub-6 GHz front-end; 5.4.2 26 GHz mmWave front-end); 5.5 Performance Evaluation; 5.6 Summary
6 Experiments: 6.1 Single Link (6.1.1 Infrastructure; 6.1.2 Single Link Experiments; 6.1.3 End-to-End); 6.2 Multi-User; 6.3 26 GHz mmWave Experimentation; 6.4 Summary
7 Key Lessons: 7.1 Limitations Experienced During Development; 7.2 Prototyping Future; 7.3 Open Points; 7.4 Workflow; 7.5 Summary
8 Conclusions: 8.1 Future Work (8.1.1 Prototyping Workflow; 8.1.2 Flexible Transceiver Core; 8.1.3 Experimental Data-sets; 8.1.4 Evolved Access Point Prototype for Industrial Networks; 8.1.5 Testbed Standardization)
A Additional Resources: A.1 Fourier Transform Blocks; A.2 Resource Consumption; A.3 Channel Sounding using Chirp Sequences (A.3.1 SNR Estimation; A.3.2 Channel Estimation); A.4 Hardware Part List
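    To make the run-time reconfiguration of physical layer parameters described above concrete, the sketch below models switchable parameter profiles named after the four sets mentioned in the abstract ('LLC', 'WiFi', 'eMBB', 'IoT'). The field names and all numeric values are hypothetical placeholders, not the configuration used in the thesis.

```python
# Minimal sketch of run-time switchable PHY parameter profiles, loosely
# inspired by the four parameter sets named in the abstract. All numbers
# are placeholders, not values taken from the thesis.
from dataclasses import dataclass

@dataclass(frozen=True)
class PhyProfile:
    name: str
    fft_size: int            # IFFT/FFT length (subcarriers per symbol)
    active_subcarriers: int  # occupied subcarriers
    cp_length: int           # cyclic prefix length in samples
    modulation: str          # e.g. "QPSK", "16QAM"

PROFILES = {
    "LLC":  PhyProfile("LLC",   128,   96,  16, "QPSK"),   # short, robust symbols
    "WiFi": PhyProfile("WiFi",   64,   52,  16, "16QAM"),  # WLAN-like numerology
    "eMBB": PhyProfile("eMBB", 2048, 1200, 144, "64QAM"),  # throughput-oriented
    "IoT":  PhyProfile("IoT",   128,   48,  32, "BPSK"),   # narrowband, low power
}

class FlexibleTransceiver:
    """Holds the active profile; switching is a run-time configuration swap."""

    def __init__(self, profile_name: str = "WiFi") -> None:
        self.profile = PROFILES[profile_name]

    def reconfigure(self, profile_name: str) -> None:
        # A real SDR design would re-parameterize its DSP chain here;
        # this sketch only swaps the configuration object.
        self.profile = PROFILES[profile_name]

trx = FlexibleTransceiver()
trx.reconfigure("LLC")
print(trx.profile)
```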

    Machine Learning for Unmanned Aerial System (UAS) Networking

    Get PDF
    Fueled by the advancement of 5G new radio (5G NR), rapid development has occurred in many fields. Compared with conventional approaches, beamforming and network slicing enable 5G NR to achieve a tenfold reduction in latency together with roughly tenfold gains in connection density and experienced throughput relative to 4G Long Term Evolution (4G LTE). These advantages pave the way for the evolution of cyber-physical systems (CPS) on a large scale. Reduced consumption, advances in control engineering, and the simplification of Unmanned Aircraft Systems (UAS) have made large-scale UAS networking deployment feasible. UAS networks can carry out multiple complex missions simultaneously. However, conventional approaches still struggle to balance large-scale management with efficient networking. Building on 5G NR and machine learning, the contributions of this dissertation can be summarized as follows. I proposed a novel Optimized Ad-hoc On-demand Distance Vector (OAODV) routing protocol that improves the throughput of intra-UAS networking while reducing system overhead. To improve security, I proposed a blockchain scheme that mitigates malicious base stations in cellular-connected UAS networking, together with a proof-of-traffic (PoT) mechanism that improves the efficiency of blockchain for large-scale UAS networking. Inspired by the biological cell paradigm, I proposed cell-wall routing protocols for heterogeneous UAS networking. With 5G NR, interconnections between UAS networks strengthen their throughput and elasticity. With machine learning, routing and scheduling for intra- and inter-UAS networking enhance the throughput of large-scale UAS networking. Inter-UAS networking can achieve globally max-min throughput through edge coloring, and I leveraged upper and lower bounds to accelerate the edge-coloring optimization. This dissertation paves the way for UAS networking at the integration of CPS and machine learning. UAS networking can achieve outstanding performance in a decentralized architecture. Concurrently, this dissertation gives insights into UAS networking on a large scale. These results are fundamental to integrating UAS into the National Airspace System (NAS) and are critical to both manned and unmanned aviation. The dissertation provides novel approaches for promoting UAS networking on a large scale; the proposed approaches extend the state of the art of UAS networking in a decentralized architecture, and all of these contributions support the establishment of UAS networking with CPS.
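    Since the abstract above mentions achieving max-min throughput for inter-UAS links via edge coloring with upper and lower bounds, the following generic greedy edge-coloring sketch (not the dissertation's algorithm) illustrates the basic idea: each color can be read as a conflict-free transmission slot, the maximum node degree Δ is a lower bound on the number of colors, and Δ + 1 is an upper bound for an optimal coloring of a simple graph (Vizing's theorem); the greedy heuristic may use more colors than the optimum.

```python
# Generic greedy edge coloring: give each link the smallest color not already
# used by an adjacent link. Colors can be read as conflict-free transmission
# slots for inter-UAS links. This is a textbook heuristic, not the
# dissertation's bounded optimization.
from collections import defaultdict

def greedy_edge_coloring(edges):
    colors_at_node = defaultdict(set)  # node -> colors used on incident edges
    coloring = {}
    for u, v in edges:
        c = 0
        while c in colors_at_node[u] or c in colors_at_node[v]:
            c += 1
        coloring[(u, v)] = c
        colors_at_node[u].add(c)
        colors_at_node[v].add(c)
    return coloring

# Hypothetical inter-UAS link graph (node names are arbitrary).
links = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("B", "D")]
coloring = greedy_edge_coloring(links)
print(coloring)
print("colors (slots) used:", 1 + max(coloring.values()))
```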

    Non-Orthogonal Signal and System Design for Wireless Communications

    Get PDF
    The thesis presents research on non-orthogonal multi-carrier signals, in which: (i) a new signal format termed truncated orthogonal frequency division multiplexing (TOFDM) is proposed to improve data rates in wireless communication systems, such as those used in mobile/cellular systems and wireless local area networks (LANs), and (ii) a new design and experimental implementation of a real-time spectrally efficient frequency division multiplexing (SEFDM) system are reported. This research proposes a modified version of the orthogonal frequency division multiplexing (OFDM) format, obtained by truncating OFDM symbols in the time domain. In TOFDM, subcarriers are no longer orthogonally packed in the frequency domain because only part of the time samples are transmitted, leading to improved spectral efficiency. In this work, (i) analytical expressions are derived for the newly proposed TOFDM signal, followed by (ii) interference analysis, (iii) system design for uncoded and coded schemes, (iv) experimental implementation, and (v) performance evaluation of the newly proposed signal and system, with comparisons to conventional OFDM systems. Results indicate that signals can be recovered despite truncated symbol transmission. Based on the TOFDM principle, a new receiving technique, termed partial symbol recovery (PSR), is designed and implemented in software-defined radio (SDR), which allows two users with overlapping data to operate efficiently in wireless communication systems subject to collisions. The PSR technique is based on the recovery of collision-free partial OFDM symbols, followed by the reconstruction of complete symbols to progressively recover the frames of two users suffering collisions. The system is evaluated in a 12-node testbed using SDR platforms. The thesis also proposes a channel estimation and equalization technique for non-orthogonal signals in 5G scenarios, using an orthogonal demodulator and zero padding. Finally, the implementation of complete SEFDM systems in real time is investigated and described in detail.
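    As a numerical illustration of the truncation idea described above, the sketch below builds one OFDM symbol with an IFFT, transmits only the first M < N time samples, and measures how much energy leaks between subcarriers at an FFT-style correlator, i.e., how truncation destroys orthogonality. The parameter values are arbitrary examples, not those of the TOFDM systems reported in the thesis.

```python
# Illustration of the TOFDM idea: truncating an OFDM symbol in the time
# domain breaks subcarrier orthogonality. Parameters are arbitrary examples.
import numpy as np

N = 64   # subcarriers / IFFT length
M = 48   # time samples actually transmitted (M < N)
rng = np.random.default_rng(0)

# Random QPSK symbols on all subcarriers.
data = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)

tx = np.fft.ifft(data, N)   # full OFDM symbol in the time domain
tx_trunc = tx[:M]           # truncated symbol: only the first M samples

# Correlate the truncated signal against each subcarrier over the M samples.
n = np.arange(M)
rx = np.array([np.sum(tx_trunc * np.exp(-2j * np.pi * k * n / N))
               for k in range(N)])

# With the full symbol (M = N) each correlator output equals its own data
# symbol and the cross-terms vanish; after truncation the difference below
# is non-zero, i.e., inter-carrier interference appears.
leakage = np.abs(rx - data * M / N)
print("max inter-carrier leakage after truncation:", leakage.max())
```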

    Smart Sensor Technologies for IoT

    Get PDF
    Recent developments in wireless networks and devices have led to novel services that will utilize wireless communication on a new level. Much effort and many resources have been dedicated to establishing new communication networks that will support machine-to-machine communication and the Internet of Things (IoT). In these systems, various smart and sensory devices are deployed and connected, enabling large amounts of data to be streamed. Smart services represent new trends in mobile services, i.e., a completely new spectrum of context-aware, personalized, and intelligent services and applications. A variety of existing services utilize information about the position of the user or mobile device. The position of mobile devices is often obtained using the Global Navigation Satellite System (GNSS) chips that are integrated into all modern mobile devices (smartphones). However, GNSS is not always a reliable source of position estimates due to multipath propagation and signal blockage. Moreover, integrating GNSS chips into all devices might have a negative impact on the battery life of future IoT applications. Therefore, alternative solutions for position estimation should be investigated and implemented in IoT applications. This Special Issue, “Smart Sensor Technologies for IoT”, aims to report on some of the recent research efforts on this increasingly important topic. The twelve accepted papers in this issue cover various aspects of Smart Sensor Technologies for IoT.

    Sensor Signal and Information Processing II

    Get PDF
    In the current age of information explosion, newly invented technological sensors and software are tightly integrated into our everyday lives. Many sensor processing algorithms have incorporated some form of computational intelligence as part of their core framework for problem solving. These algorithms have the capacity to generalize, discover knowledge for themselves, and learn new information whenever unseen data are captured. The primary aim of sensor processing is to develop techniques to interpret, understand, and act on the information contained in the data. The interest of this book is in developing intelligent signal processing in order to pave the way for smart sensors. This involves the mathematical advancement of nonlinear signal processing theory and its applications, which extend far beyond traditional techniques. It bridges the boundary between theory and application, developing novel theoretically inspired methodologies targeting both longstanding and emergent signal processing applications. The topics range from phishing detection to the integration of terrestrial laser scanning, and from fault diagnosis to bio-inspired filtering. The book will appeal to established practitioners, along with researchers and students in the emerging field of smart sensor processing.