
    Towards low-cost gigabit wireless systems at 60 GHz

    The worldwide availability of a large amount of license-free spectrum in the 60 GHz band provides ample room for gigabit-per-second (Gb/s) wireless applications. A commercial (read: low-cost) 60-GHz transceiver will, however, provide limited system performance due to the stringent link budget and substantial RF imperfections. The work presented in this thesis is intended to support the design of low-cost 60-GHz transceivers for Gb/s transmission over short distances (a few meters). Typical applications are the transfer of high-definition streaming video and high-speed download. The presented work comprises research into the characteristics of typical 60-GHz channels, the evaluation of transmission quality, and the development of suitable baseband algorithms. It can be summarized as follows. In the first part, the characteristics of wave propagation at 60 GHz are charted by means of channel measurements and ray-tracing simulations for both narrow-beam and omnidirectional configurations. Both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions are considered. This study reveals that narrow-beam antennas can boost the received power by tens of dB compared with omnidirectional configurations. Meanwhile, the time-domain dispersion of the channel is reduced to the order of nanoseconds, which considerably facilitates Gb/s data transmission over 60-GHz channels. Besides the measurements and simulations, the influence of antenna radiation patterns is analyzed theoretically. It is shown to what extent the signal-to-noise ratio, Rician K-factor and channel dispersion are improved by narrow-beam antennas, and to what extent these parameters are degraded by beam pointing errors. From both the experimental and the analytical work it can be concluded that the stringent link budget can be addressed effectively by beam-steering techniques.
The second part treats wideband transmission methods and the relevant baseband algorithms. The considered schemes include orthogonal frequency division multiplexing (OFDM), multi-carrier code division multiple access (MC-CDMA) and single carrier with frequency-domain equalization (SC-FDE), which are promising candidates for Gb/s wireless transmission. In particular, optimal linear equalization in the frequency domain and associated implementation issues such as synchronization and channel estimation are examined. Bit error rate (BER) expressions are derived to evaluate the transmission performance. Besides the linear equalization techniques, a low-complexity inter-symbol interference cancellation technique is proposed that achieves much better performance for code-spreading systems such as MC-CDMA and SC-FDE. Both theoretical analysis and simulations demonstrate that the proposed scheme offers great advantages as regards both complexity and performance, which makes it particularly suitable for 60-GHz applications in multipath environments. The third part treats the influence of quantization and RF imperfections on the considered transmission methods in the context of 60-GHz radios. First, expressions for the BER are derived and the influence on BER performance of nonlinear distortions caused by digital-to-analog converters, analog-to-digital converters and power amplifiers is examined. Next, the BER performance under phase noise and IQ imbalance is evaluated both for the case that digital compensation techniques are applied in the receiver and for the case that they are not. Finally, a baseline design of a low-cost Gb/s 60-GHz transceiver is presented. It is shown that, by applying beam-steering in combination with SC-FDE without advanced channel coding, a data rate on the order of 2 Gb/s can be achieved over a distance of 10 meters in a typical NLOS indoor scenario.
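The link-budget benefit of narrow-beam antennas described in this abstract can be illustrated with the free-space Friis equation. The sketch below is not from the thesis; the transmit power and antenna gains are assumed values chosen only to show the order of magnitude of the directional gain:

```python
import math

def friis_rx_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                       freq_hz, distance_m):
    """Received power (dBm) from the Friis free-space equation."""
    wavelength = 3e8 / freq_hz
    fspl_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db

# 60 GHz link over 10 m: omnidirectional (0 dBi) antennas versus
# assumed 15 dBi narrow-beam antennas at both ends
omni = friis_rx_power_dbm(10, 0, 0, 60e9, 10)
beam = friis_rx_power_dbm(10, 15, 15, 60e9, 10)
gain_db = beam - omni   # 30 dB: "tens of dB" from directional antennas
```

Real 60-GHz links also suffer oxygen absorption and implementation losses, so this free-space figure is an upper bound, but it shows why beam-steering can rescue the link budget.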

    Visible Light Communication (VLC)

    Visible light communication (VLC) using light-emitting diodes (LEDs) or laser diodes (LDs) has been envisioned as one of the key enabling technologies for 6G and Internet of Things (IoT) systems, owing to its appealing advantages, including abundant and unregulated spectrum resources, no electromagnetic interference (EMI) radiation, and high security. However, despite its many advantages, VLC faces several technical challenges, such as the limited bandwidth and severe nonlinearity of opto-electronic devices, link blockage and user mobility. Therefore, significant efforts are needed from the global VLC community to develop VLC technology further. This Special Issue, “Visible Light Communication (VLC)”, provides an opportunity for global researchers to share their new ideas and cutting-edge techniques to address the above-mentioned challenges. The 16 papers published in this Special Issue represent the fascinating progress of VLC in various contexts, including general indoor and underwater scenarios, and the emerging application of machine learning/artificial intelligence (ML/AI) techniques in VLC.
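As a small illustration of the indoor VLC links this Special Issue covers (not taken from any of its papers), the line-of-sight DC channel gain of a generalized Lambertian LED can be sketched as follows; the half-power angle, detector area, and geometry are assumed values, and optical filter and concentrator gains are taken as unity:

```python
import math

def lambertian_gain(half_power_angle_deg, area_m2, distance_m,
                    irradiance_deg, incidence_deg):
    """LOS DC gain of a VLC channel with a generalized Lambertian LED.
    Filter and concentrator gains are assumed to be 1 for simplicity."""
    # Lambertian order from the LED's half-power semi-angle
    m = -math.log(2) / math.log(math.cos(math.radians(half_power_angle_deg)))
    phi = math.radians(irradiance_deg)   # angle of irradiance at the LED
    psi = math.radians(incidence_deg)    # angle of incidence at the detector
    return ((m + 1) * area_m2 / (2 * math.pi * distance_m ** 2)
            * math.cos(phi) ** m * math.cos(psi))

# assumed link: 60-degree half-power angle (Lambertian order m = 1),
# 1 cm^2 photodetector, 2 m distance, LED pointing straight at the detector
g = lambertian_gain(60, 1e-4, 2.0, 0, 0)
```

The gain multiplies the transmitted optical power to give the received optical power, which then sets the photocurrent via the detector responsivity.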

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users are able to enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making, because of the complex heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radios (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios of future wireless networks. Comment: 46 pages, 22 figures
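As a minimal, hedged sketch of one of the paradigms this survey reviews, reinforcement learning, applied to a wireless task such as cognitive-radio channel selection: the epsilon-greedy action-value loop below learns which of several channels is least often busy. The channel busy probabilities are invented for the example and the method is a single-state (bandit) simplification, not the article's own algorithm:

```python
import random

def channel_selection(busy_prob, trials=5000, epsilon=0.1, seed=0):
    """Epsilon-greedy action-value learning (single-state reinforcement
    learning): estimate each channel's idle rate and exploit the best one.
    Reward is 1 when the chosen channel is idle, 0 when it is busy."""
    rng = random.Random(seed)
    q = [0.0] * len(busy_prob)       # estimated reward per channel
    count = [0] * len(busy_prob)
    for _ in range(trials):
        if rng.random() < epsilon:                      # explore
            a = rng.randrange(len(q))
        else:                                           # exploit
            a = max(range(len(q)), key=q.__getitem__)
        reward = 0.0 if rng.random() < busy_prob[a] else 1.0
        count[a] += 1
        q[a] += (reward - q[a]) / count[a]   # incremental sample mean
    return q

# invented busy probabilities: channel 2 is idle 80% of the time
q = channel_selection([0.9, 0.5, 0.2])
best = max(range(len(q)), key=q.__getitem__)
```

The same explore/exploit structure underlies the Q-learning variants commonly proposed for spectrum access, only with multiple states and a discounted bootstrap target.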

    Design of a simulation platform to test next generation of terrestrial DVB

    Digital Terrestrial Television Broadcasting (DTTB) is part of our daily routine; nevertheless, new challenges keep arising from users' growing needs in communications and leisure, and the current standard cannot satisfy all the potential requirements. For that reason, this work first reviews the current standard. It then identifies the need to develop a new version of the standard that supports enhanced services, such as broadcasting to moving terminals or High Definition Television (HDTV) transmissions, among others. The main objective of this project is the design and development of a physical-layer simulator of the complete DVB-T standard, covering both the transmission and the reception procedures. The simulator has been developed in Matlab. A detailed description of the simulator, from both a functional and an architectural point of view, is included. The simulator is the basis for testing any modifications that may be included in the future DVB-T2 standard. In fact, several proposed enhancements have already been implemented and their performance evaluated. Specifically, the use of higher-order modulation schemes, with the corresponding modifications in all system blocks, has been included and evaluated. Furthermore, the simulator will allow testing of other enhancements, such as more efficient encoders and interleavers, MIMO technologies, and so on. A complete set of numerical results showing the performance of the different parts of the system is presented, in order to validate the correctness of the implementation and to evaluate both the current standard's performance and the proposed enhancements. This work has been performed within the context of FURIA, a strategic research project funded by the Spanish Ministry of Industry, Tourism and Commerce.
A brief description of this project and its consortium is also included herein, together with an introduction to the current situation of DTTB in Spain (known as TDT in Spanish).
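To make the transmit chain such a simulator implements concrete, here is a toy-scale sketch (in Python, not the project's Matlab code) of the core OFDM modulator steps used by DVB-T: constellation mapping, an IDFT, and cyclic-prefix insertion. The FFT size and prefix length are illustrative only; real DVB-T uses 2k or 8k carriers:

```python
import cmath

def ofdm_symbol(bits, n_fft=8, cp_len=2):
    """Map bit pairs to QPSK, take an IDFT, and prepend a cyclic prefix --
    the core transmit steps of a DVB-T-style OFDM modulator (toy sizes)."""
    assert len(bits) == 2 * n_fft
    # Gray-mapped QPSK constellation, normalized to unit energy
    qpsk = {(0, 0): 1 + 1j, (0, 1): -1 + 1j, (1, 1): -1 - 1j, (1, 0): 1 - 1j}
    symbols = [qpsk[(bits[2 * k], bits[2 * k + 1])] / abs(1 + 1j)
               for k in range(n_fft)]
    # IDFT: time sample n sums the subcarriers rotated by +2*pi*k*n/N
    time = [sum(s * cmath.exp(2j * cmath.pi * k * n / n_fft)
                for k, s in enumerate(symbols)) / n_fft
            for n in range(n_fft)]
    return time[-cp_len:] + time   # cyclic prefix guards against multipath

sym = ofdm_symbol([0, 0] * 8)   # all subcarriers carry the same QPSK point
```

Because every subcarrier carries the same point here, the IDFT collapses to a single impulse, which is a convenient sanity check when validating a modulator block by block.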

    Improving Wifi Sensing And Networking With Channel State Information

    In recent years, WiFi has grown very rapidly owing to its high throughput, high efficiency, and low cost. Multiple-Input Multiple-Output (MIMO) and Orthogonal Frequency-Division Multiplexing (OFDM) are two key technologies for providing high throughput and efficiency in WiFi systems. MIMO-OFDM provides Channel State Information (CSI), which represents the amplitude attenuation and phase shift of each transmit-receive antenna pair at each carrier frequency. CSI helps WiFi achieve the high throughput needed to meet the growing demands of wireless data traffic. Because CSI captures how wireless signals travel through the surrounding environment, it can also be used for wireless sensing. This dissertation presents ways to improve both WiFi sensing and WiFi networking with CSI. More specifically, it proposes deep learning models to improve the performance and capability of WiFi sensing, and presents network protocols that reduce CSI feedback overhead for high-efficiency WiFi networking. On the sensing side, many applications in recent years use CSI as their input. To provide a better understanding of existing WiFi sensing technologies and future trends, this dissertation surveys the signal processing techniques, algorithms, applications, performance results, challenges, and future trends of CSI-based WiFi sensing. CSI is widely used for gesture recognition and sign language recognition, but existing methods for WiFi-based sign language recognition suffer from low accuracy and high costs when there are more than 200 sign gestures. The dissertation presents SignFi, which recognizes sign language using CSI and Convolutional Neural Networks (CNNs); SignFi provides high accuracy and low run-time testing costs for 276 sign gestures in lab and home environments. On the networking side, although CSI enables high throughput for WiFi networks, it also introduces high overhead.
WiFi transmitters need CSI feedback for transmit beamforming and rate adaptation. CSI packets are large, and their size grows quickly with the number of antennas and the channel width. The resulting feedback overhead reduces the performance and efficiency of WiFi systems, especially for mobile and hand-held WiFi devices. This dissertation presents RoFi, which reduces CSI feedback overhead based on the mobility status of WiFi receivers. CSI feedback compression reduces overhead, but WiFi receivers still need to send CSI feedback to the transmitter; the dissertation therefore presents EliMO, which eliminates CSI feedback without sacrificing beamforming gains.
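As a small illustrative sketch (not from the dissertation), the two quantities CSI exposes per antenna pair and subcarrier, amplitude attenuation and phase shift, can be extracted from the complex channel estimates as below, and the multiplicative growth of raw feedback size can be tallied the same way. The CSI values, antenna counts, and 16-bit quantization are invented for the example:

```python
import cmath

def csi_amplitude_phase(csi):
    """Split complex CSI entries into the amplitude attenuation and
    phase shift that sensing applications feed to their classifiers."""
    return [abs(h) for h in csi], [cmath.phase(h) for h in csi]

def csi_feedback_bytes(n_tx, n_rx, n_sub, bits_per_value=16):
    """Raw CSI feedback size: one complex value (real + imaginary part)
    per antenna pair per subcarrier -- it grows multiplicatively."""
    return n_tx * n_rx * n_sub * 2 * bits_per_value // 8

# invented CSI for one transmit-receive antenna pair across four subcarriers
csi = [1 + 0j, 0.5 + 0.5j, 1j, -0.7 + 0j]
amps, phases = csi_amplitude_phase(csi)

# assumed 3x3 MIMO with 114 reported subcarriers, 16-bit I and Q
size = csi_feedback_bytes(3, 3, 114)
```

Multiplying the three dimensions out makes it clear why feedback reduction schemes target the antenna and subcarrier axes rather than the per-value resolution alone.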

    Signal Detection in Ambient Backscatter Systems: Fundamentals, Methods, and Trends

    The Internet of Things (IoT) is a rapidly growing area of wireless technology that aims to connect vast numbers of devices to gather and distribute vital information. Although individual devices have low energy consumption, their cumulative demand results in significant energy usage. Consequently, the concept of ultra-low-power tags gains appeal. Such tags communicate by reflecting, rather than generating, radio frequency (RF) signals, so backscatter tags can be low-cost and battery-free. The RF signals can come from ambient sources such as wireless-fidelity (Wi-Fi), cellular, or television (TV) transmitters, or the system can generate them externally. Backscatter channel characteristics differ from those of conventional point-to-point or cooperative relay channels. Besides the direct and backscattering links, these systems are also affected by a strong interference link between the RF source and the tag, which makes signal detection challenging. This paper provides an overview of the fundamentals, challenges, and ongoing research in signal detection for ambient backscatter communication (AmBC) networks. It delves into various detection methods, discussing their advantages and drawbacks. The paper's emphasis on signal detection sets it apart and positions it as a valuable resource for IoT and wireless communication professionals and researchers. Comment: Accepted for publication in IEEE Access
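One of the simplest detector families such surveys cover is energy detection: the receiver decides the tag's bit from the average received power, since a reflecting tag shifts the power level relative to the ambient carrier alone. The sketch below is illustrative only; the carrier amplitude, reflection strength, noise level, and threshold are all assumed values, not figures from the paper:

```python
import random

def energy_detector(samples, threshold):
    """Decide the tag bit from the mean received-signal energy."""
    energy = sum(abs(x) ** 2 for x in samples) / len(samples)
    return 1 if energy > threshold else 0

rng = random.Random(1)
n = 1000
# assumed direct-link carrier of amplitude 1.0 plus Gaussian noise;
# the tag's reflection is assumed to add 0.3 to the amplitude
ambient = [1.0 + rng.gauss(0, 0.1) for _ in range(n)]          # tag bit 0
reflect = [1.0 + 0.3 + rng.gauss(0, 0.1) for _ in range(n)]    # tag bit 1
thr = 1.0 + (1.3 ** 2 - 1.0) / 2   # midpoint between the expected energies
b0 = energy_detector(ambient, thr)
b1 = energy_detector(reflect, thr)
```

In practice the threshold must be learned or estimated because the direct-link strength is unknown, which is exactly the difficulty that motivates the more sophisticated detectors the paper reviews.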