
    Artificial Intelligence-aided OFDM Receiver: Design and Experimental Results

    Orthogonal frequency division multiplexing (OFDM) is one of the key technologies widely applied in current communication systems. Recently, artificial intelligence (AI)-aided OFDM receivers have been brought to the forefront to break the bottleneck of traditional OFDM systems. In this paper, we investigate two AI-aided OFDM receivers: the data-driven fully connected deep neural network (FC-DNN) receiver and the model-driven ComNet receiver. We first study their performance under different channel models through simulation and then establish a real-time video transmission system using a 5G rapid prototyping (RaPro) system for over-the-air (OTA) testing. To address the performance gap between simulation and the OTA test, caused by the discrepancy between the channel model used for offline training and real environments, we develop a novel online training strategy, called the SwitchNet receiver. The SwitchNet receiver has a flexible and extendable architecture and can adapt to real channels by training a single parameter online. The OTA test verifies its feasibility and robustness in real environments and indicates its potential for future communication systems. At the end of this paper, we discuss some challenges to inspire future research. Comment: 29 pages, 13 figures, submitted to IEEE Journal on Selected Areas in Communications
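    As background to the receivers studied above, the following is a minimal NumPy sketch of the conventional OFDM chain (IFFT plus cyclic prefix at the transmitter; FFT plus one-tap equalization at the receiver) whose estimation and detection stages the AI-aided receivers replace. It is an illustrative toy link, not the paper's code; the tap values and sizes are arbitrary choices.

```python
import numpy as np

# Toy baseline OFDM link: IFFT + cyclic prefix (CP) at the transmitter,
# FFT + one-tap equalization at the receiver. The CP turns the linear
# multipath convolution into a circular one, so each subcarrier sees a
# single complex gain H[k].
rng = np.random.default_rng(0)
N, CP = 64, 16                       # subcarriers, cyclic-prefix length

bits = rng.integers(0, 2, size=2 * N)
qpsk = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# Transmit: IFFT and cyclic prefix
x = np.fft.ifft(qpsk) * np.sqrt(N)
tx = np.concatenate([x[-CP:], x])

# 3-tap multipath channel (known here; estimated from pilots in practice)
h = np.array([0.8, 0.3 + 0.2j, 0.1])
rx = np.convolve(tx, h)[:N + CP]

# Receive: strip CP, FFT, equalize with the channel frequency response
y = np.fft.fft(rx[CP:]) / np.sqrt(N)
H = np.fft.fft(h, N)
s_hat = y / H                        # zero-forcing one-tap equalizer

bits_hat = np.empty(2 * N, dtype=int)
bits_hat[0::2] = (s_hat.real < 0).astype(int)
bits_hat[1::2] = (s_hat.imag < 0).astype(int)
print(np.array_equal(bits, bits_hat))   # noiseless link: True
```

    In the data-driven FC-DNN approach, the FFT output (or even the raw samples) is fed to a neural network that outputs bit estimates directly; the model-driven ComNet approach keeps this pipeline structure and uses networks to refine the channel estimation and detection blocks.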

    Optimal Precoders for Tracking the AoD and AoA of a mm-Wave Path

    In millimeter-wave channels, most of the received energy is carried by a few paths. Traditional precoders sweep the angle-of-departure (AoD) and angle-of-arrival (AoA) space with directional precoders to identify the directions with largest power. Such precoders are heuristic and lead to sub-optimal AoD/AoA estimation. We derive optimal precoders, minimizing the Cramér-Rao bound (CRB) of the AoD/AoA, assuming a fully digital architecture at the transmitter and spatial filtering of a single path. The precoders are found by solving a suitable convex optimization problem. We demonstrate that the accuracy can be improved by at least a factor of two over traditional precoders, and show that there is an optimal number of distinct precoders beyond which the CRB does not improve. Comment: Resubmission to IEEE Trans. on Signal Processing. 12 pages and 9 figures
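    For context, a sketch of the heuristic directional sweep that the paper improves on: correlate the received snapshot of a uniform linear array against a grid of steering vectors and pick the direction with largest power. The array size, angle, and noise level are illustrative assumptions, and this is the baseline estimator, not the paper's CRB-optimal precoder design.

```python
import numpy as np

# Toy AoA estimation by directional sweep over a half-wavelength-spaced
# uniform linear array (ULA), single path from angle theta_true.
rng = np.random.default_rng(1)
M = 16                               # number of antennas
theta_true = np.deg2rad(25.0)

def steering(theta, m=M):
    """Unit-norm ULA steering vector for angle theta (radians)."""
    return np.exp(1j * np.pi * np.arange(m) * np.sin(theta)) / np.sqrt(m)

# Received snapshot: single path plus small complex Gaussian noise
y = steering(theta_true) + 0.01 * (rng.standard_normal(M)
                                   + 1j * rng.standard_normal(M))

# Sweep candidate directions and pick the one with the largest power
grid = np.deg2rad(np.linspace(-90, 90, 721))
power = np.abs(np.array([steering(t).conj() @ y for t in grid])) ** 2
theta_hat = grid[np.argmax(power)]
print(np.rad2deg(theta_hat))         # close to 25 degrees
```

    The grid sweep's resolution is limited by the beamwidth of the directional vectors; the paper's CRB-minimizing precoders shape the probing beams to extract more angle information per measurement than this uniform sweep.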

    Performance enhancement for LTE and beyond systems

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy. Wireless communication systems have undergone fast development in recent years. Building on GSM/EDGE and UMTS/HSPA, the 3rd Generation Partnership Project (3GPP) specified the Long Term Evolution (LTE) standard to cope with rapidly increasing demands, including capacity, coverage, and data rate. To achieve this goal, several key techniques have been adopted by LTE, such as Multiple-Input and Multiple-Output (MIMO), Orthogonal Frequency-Division Multiplexing (OFDM), and the heterogeneous network (HetNet). However, there are some inherent drawbacks to these techniques. A direct conversion architecture is adopted to provide a simple, low-cost transmitter solution, but the problem of I/Q imbalance arises due to the imperfection of circuit components; the orthogonality of OFDM is vulnerable to carrier frequency offset (CFO) and sampling frequency offset (SFO); and the doubly selective channel can severely deteriorate receiver performance. In addition, the deployment of HetNet, which permits the co-existence of macro and pico cells, incurs inter-cell interference for cell-edge users. The impact of these factors results in significant degradation of system performance. This dissertation aims to investigate techniques which can mitigate the above problems. First, I/Q imbalance in the wideband transmitter is studied and a self-IQ-demodulation based compensation scheme for frequency-dependent (FD) I/Q imbalance is proposed. This scheme combats the FD I/Q imbalance by using the internal diode of the transmitter and a specially designed test signal, without any external calibration instruments or internal low-IF feedback path. The instrument test results show that the proposed scheme can enhance signal quality by 10 dB in terms of image rejection ratio (IRR).
In addition to I/Q imbalance, the system suffers from CFO, SFO and the frequency-time selective channel. To mitigate these, a hybrid optimum OFDM receiver with a decision feedback equalizer (DFE) is proposed to cope with the CFO, SFO and doubly selective channel. The algorithm first estimates the CFO and channel frequency response (CFR) in a coarse estimation stage, with the help of hybrid classical timing and frequency synchronization algorithms. Afterwards, a pilot-aided polynomial interpolation channel estimation, combined with a low-complexity DFE scheme based on the minimum mean squared error (MMSE) criterion, is developed to alleviate the impact of the residual SFO, CFO, and Doppler effect. A subspace-based signal-to-noise ratio (SNR) estimation algorithm is proposed to estimate the SNR in the doubly selective channel. This provides prior knowledge for the MMSE-DFE and automatic modulation and coding (AMC). Simulation results show that the proposed estimation algorithm significantly improves system performance. To speed up the algorithm verification process, an FPGA-based co-simulation is developed. Inter-cell interference caused by the co-existence of macro and pico cells has a big impact on system performance. Although the almost blank subframe (ABS) has been proposed to mitigate this problem, the residual control signal in the ABS still inevitably causes interference. Hence, a cell-specific reference signal (CRS) interference cancellation algorithm, utilizing the information in the ABS, is proposed. First, the timing and carrier frequency offset of the interference signal are compensated by utilizing the cross-correlation properties of the synchronization signal. Afterwards, the reference signal is generated locally and the channel response is estimated by making use of channel statistics. Then, the interference signal is reconstructed based on the previous estimates of the channel, timing and carrier frequency offset.
The interference is mitigated by subtracting the estimate of the interference signal and by LLR puncturing. The block error rate (BLER) performance of the signal is notably improved by this algorithm, according to the simulation results for different channel scenarios. The proposed techniques provide low-cost, low-complexity solutions for LTE and beyond systems. The simulations and measurements show that good overall system performance can be achieved.
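    To illustrate the MMSE criterion the thesis builds its DFE around, here is a per-subcarrier one-tap MMSE equalizer in NumPy. Unlike zero forcing (dividing by H), the MMSE weight conj(H)/(|H|^2 + 1/SNR) regularizes deeply faded subcarriers instead of amplifying their noise. The SNR value and Rayleigh channel are illustrative assumptions; the thesis combines this criterion with decision feedback (MMSE-DFE), which is not shown here.

```python
import numpy as np

# One-tap MMSE equalization per subcarrier for an OFDM symbol.
rng = np.random.default_rng(2)
N, snr_db = 64, 10.0
snr = 10 ** (snr_db / 10)

# QPSK symbols, Rayleigh per-subcarrier gains, complex Gaussian noise
s = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N) / np.sqrt(2)
H = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
n = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2 * snr)
y = H * s + n

# MMSE weight: shrinks toward zero where |H| is small, instead of
# blowing up like the zero-forcing weight 1/H does.
w_mmse = H.conj() / (np.abs(H) ** 2 + 1 / snr)
s_hat = w_mmse * y

mse_mmse = np.mean(np.abs(s_hat - s) ** 2)
mse_zf = np.mean(np.abs(y / H - s) ** 2)
print(mse_mmse, mse_zf)   # MMSE is typically lower when taps fade deeply
```

    The same weight appears inside the MMSE-DFE: the feedforward filter follows this criterion while the feedback path cancels interference from already-decided symbols.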

    A Channel Ranking And Selection Scheme Based On Channel Occupancy And SNR For Cognitive Radio Systems

    Wireless networks and information traffic have grown exponentially over the last decade. Consequently, demand for radio spectrum bandwidth has increased. Recent studies have shown that, with the current fixed spectrum allocation (FSA), radio frequency band utilization ranges from 15% to 85%. Therefore, there are spectrum holes that are not utilized all the time by the licensed users, and thus the radio spectrum is inefficiently exploited. To solve the problem of scarcity and inefficient utilization of spectrum resources, dynamic spectrum access has been proposed as a solution to enable sharing and using available frequency channels. With dynamic spectrum allocation (DSA), unlicensed users can access and use licensed, available channels when primary users are not transmitting. Cognitive radio technology is one of the next-generation technologies that will allow efficient utilization of spectrum resources by enabling DSA. However, dynamic spectrum allocation by a cognitive radio system comes with the challenge of accurately detecting and selecting the best channel based on the channel's availability and quality of service. Therefore, the spectrum sensing and analysis processes of a cognitive radio system are essential to making accurate decisions. Different spectrum sensing techniques and channel selection schemes have been proposed. However, these techniques only consider the spectrum occupancy rate for selecting the best channel, which can lead to erroneous decisions. Other communication parameters, such as the signal-to-noise ratio (SNR), should also be taken into account. Therefore, the spectrum decision-making process of a cognitive radio system must use techniques that consider both spectrum occupancy and channel quality metrics to rank channels and select the best option. This thesis aims to develop a utility function based on spectrum occupancy and SNR measurements to model and rank the sensed channels.
An evolutionary algorithm-based SNR estimation technique was developed, which enables adaptively varying key parameters of the existing eigenvalue-based blind SNR estimation technique. The performance of the improved technique was compared to that of the existing technique; results show the evolutionary algorithm-based estimation performing better. The utility-based channel ranking technique was developed by first defining a channel utility function that takes into account SNR and spectrum occupancy. Different mathematical functions were investigated to appropriately model the utility of SNR and spectrum occupancy rate. A ranking table is provided with the utility values of the sensed channels and compared with the usual occupancy-rate-based channel ranking. According to the results, utility-based channel ranking provides a better scope for making an informed decision by considering both channel occupancy rate and SNR. In addition, the efficiency of several noise cancellation techniques was investigated. These techniques can be employed to remove the impact of noise on the received or sensed signals during the spectrum sensing process of a cognitive radio system. Performance evaluation of these techniques was done using simulations, and the results show that the evolutionary algorithm-based noise cancellation techniques, particle swarm optimization and the genetic algorithm, perform better than the regular gradient-descent-based technique, the least-mean-square algorithm.
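    The idea of a joint occupancy/SNR utility can be sketched as follows. The weights and the logistic SNR mapping here are illustrative choices, not the thesis's exact utility function; the point is that a channel ranking based on both metrics can differ from one based on occupancy alone.

```python
import numpy as np

# Toy utility-based channel ranking combining occupancy rate and SNR.
def utility(occupancy, snr_db, w_occ=0.5, w_snr=0.5):
    """Score in [0, 1]: high for mostly idle channels with good SNR."""
    u_occ = 1.0 - occupancy                              # idle fraction
    u_snr = 1.0 / (1.0 + np.exp(-(snr_db - 10) / 3.0))   # logistic map
    return w_occ * u_occ + w_snr * u_snr

channels = {                  # channel id: (occupancy rate, SNR in dB)
    "ch1": (0.20, 15.0),
    "ch2": (0.10, 5.0),
    "ch3": (0.60, 20.0),
}
ranking = sorted(channels, key=lambda c: utility(*channels[c]), reverse=True)
print(ranking)                # -> ['ch1', 'ch3', 'ch2']
```

    Note that an occupancy-only ranking would place ch2 first because it is the most idle, while the joint utility demotes it for its poor SNR; this is the kind of erroneous decision the thesis argues occupancy-only schemes can make.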