
    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver aligns the frequency and phase of its local carrier oscillator with those of the received signal. In this paper, we survey the literature of the last 5 years (2010–2014) and present a comprehensive literature review and classification of the recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems, focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions.
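The carrier-synchronization step described in this abstract can be sketched with the classic repeated-preamble frequency-offset estimator (Moose/Schmidl–Cox style): the phase of the correlation between two identical preamble halves grows linearly with the carrier frequency offset. The sample rate, preamble length, and offset below are illustrative assumptions, not values from the survey.

```python
import cmath
import math

def estimate_cfo(rx, half_len, fs):
    # Correlate the two identical halves of the preamble; the phase of
    # the correlation is proportional to the carrier frequency offset.
    acc = sum(rx[n + half_len] * rx[n].conjugate() for n in range(half_len))
    return cmath.phase(acc) * fs / (2 * math.pi * half_len)

def correct_cfo(rx, f_off, fs):
    # Derotate each received sample by the estimated offset.
    return [s * cmath.exp(-2j * math.pi * f_off * n / fs)
            for n, s in enumerate(rx)]

# Toy scenario (assumed values): repeated 64-sample preamble, 1 MHz
# sample rate, 250 Hz carrier frequency offset.
fs, f_cfo, N = 1_000_000, 250.0, 64
half = [cmath.exp(2j * math.pi * 0.1 * n) for n in range(N)]
preamble = half + half
rx = [s * cmath.exp(2j * math.pi * f_cfo * n / fs)
      for n, s in enumerate(preamble)]

f_hat = estimate_cfo(rx, N, fs)          # close to the true 250 Hz
corrected = correct_cfo(rx, f_hat, fs)
```

Note that this estimator is only unambiguous for offsets below fs/(2N); larger offsets require a coarse acquisition stage first.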

    Modeling and Digital Mitigation of Transmitter Imperfections in Radio Communication Systems

    To satisfy the continuously growing demands for higher data rates, modern radio communication systems employ larger bandwidths and more complex waveforms. Furthermore, radio devices are expected to support a rich mixture of standards such as cellular networks, wireless local area networks, wireless personal area networks, positioning and navigation systems, etc. In general, a "smart" device should be flexible enough to support all these requirements while being portable, cheap, and energy efficient. These seemingly conflicting expectations impose stringent radio frequency (RF) design challenges which, in turn, call for their proper understanding as well as for developing cost-effective solutions to address them. The direct-conversion transceiver architecture is an appealing analog front-end for flexible and multi-standard radio systems. However, it is sensitive to various circuit impairments, and modern communication systems based on multi-carrier waveforms such as Orthogonal Frequency Division Multiplexing (OFDM) and Orthogonal Frequency Division Multiple Access (OFDMA) are particularly vulnerable to RF front-end non-idealities. This thesis addresses the modeling and digital mitigation of selected transmitter (TX) RF impairments in radio communication devices. The contributions can be divided into two areas. First, new modeling and digital mitigation techniques are proposed for two essential front-end impairments in direct-conversion architecture-based OFDM and OFDMA systems, namely in-phase and quadrature (I/Q) imbalance and carrier frequency offset (CFO). Both joint and decoupled estimation and compensation schemes for frequency-selective TX I/Q imbalance and channel distortions are proposed for OFDM systems, to be adopted on the receiver side.
Then, in the context of uplink OFDMA and Single Carrier FDMA (SC-FDMA), which are the air interface technologies of the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) and LTE-Advanced systems, joint estimation and equalization techniques for RF impairments and channel distortions are proposed. Here, the challenging multi-user uplink scenario with unequal received power levels is investigated, where I/Q imbalance causes inter-user interference. A joint mirror subcarrier processing-based minimum mean-square error (MMSE) equalizer with an arbitrary number of receiver antennas is formulated to effectively handle mirror sub-band users of different power levels. Furthermore, the joint channel and impairment filter responses are efficiently approximated with polynomial-based basis function models, and the parameters of the basis functions are estimated with reference signals conforming to the LTE uplink sub-frame structure. The resulting receiver concept adopting the proposed techniques enables improved link performance without modifying the design of RF transceivers. Second, digital baseband mitigation solutions are developed for the TX leakage signal-induced self-interference in frequency division duplex (FDD) transceivers. In FDD transceivers, a duplexer is used to connect the TX and receiver (RX) chains to a common antenna while also providing isolation to the receiver chain against the powerful transmit signal. In general, the continuous miniaturization of hardware and the adoption of larger bandwidths through carrier aggregation-type noncontiguous allocations complicate achieving sufficient TX-RX isolation. Here, two different effects of the transmitter leakage signal are investigated.
The first is TX out-of-band (OOB) emissions and TX spurious emissions at the own receiver band, due to transmitter nonlinearity; the second is nonlinearity of the down-converter in the RX, which generates second-order intermodulation distortion (IMD2) due to the TX in-band leakage signal. This work shows that the transmitter leakage signal-induced interference depends on an equivalent leakage channel that models the TX path non-idealities, the duplexer filter responses, and the RX path non-idealities. The work proposes algorithms that operate in the digital baseband of the transceiver to estimate the TX-RX non-idealities and the duplexer filter responses, and to subsequently regenerate and cancel the self-interference, thereby potentially relaxing the TX-RX isolation requirements as well as increasing transceiver flexibility. Overall, this thesis provides useful signal models for understanding the implications of different RF non-idealities and proposes compensation solutions to cope with certain RF impairments. This is complemented with extensive computer simulations and practical RF measurements to validate their application in real-world radio transceivers.
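The mirror-frequency interference caused by I/Q imbalance, central to the abstract above, is commonly captured with a frequency-flat two-coefficient model in which the ideal signal x is mapped onto K1·x + K2·conj(x); the conjugate term is the image that falls on the mirror subcarrier. A minimal sketch, with the 1 dB / 5° mismatch values chosen purely for illustration:

```python
import cmath
import math

def tx_iq_imbalance(x, gain_mismatch, phase_mismatch_rad):
    # Frequency-flat I/Q imbalance model: the ideal baseband signal x
    # becomes K1*x + K2*conj(x). The conj(x) term is the mirror-
    # frequency image that interferes with the mirror sub-band user.
    g = gain_mismatch * cmath.exp(1j * phase_mismatch_rad)
    k1 = (1 + g) / 2
    k2 = (1 - g) / 2
    return [k1 * s + k2 * s.conjugate() for s in x], k1, k2

# Illustrative (assumed) front-end mismatch: 1 dB gain, 5 degrees phase.
_, k1, k2 = tx_iq_imbalance([1 + 0j], 10 ** (1 / 20), math.radians(5))

# Image rejection ratio: power of the wanted term over the image term.
irr_db = 10 * math.log10(abs(k1) ** 2 / abs(k2) ** 2)
```

For this mismatch the image rejection ratio comes out around 23 dB, in the range where the mirror sub-band interference described above becomes significant for high-order constellations.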

    Long Term Evolution-Advanced and Future Machine-to-Machine Communication

    Long Term Evolution (LTE) has adopted Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA) as the downlink and uplink transmission schemes, respectively. Quality of Service (QoS) provisioning is one of the primary objectives of wireless network operators. In LTE-Advanced (LTE-A), several new features such as Carrier Aggregation (CA) and Relay Nodes (RNs) have been introduced by the 3rd Generation Partnership Project (3GPP). These features have been designed to deal with the ever-increasing demands for higher data rates and spectral efficiency. The RN is a low-power, low-cost device designed for extending coverage and enhancing spectral efficiency, especially at the cell edge. Wireless networks are facing a new challenge emerging on the horizon: the expected surge of Machine-to-Machine (M2M) traffic in cellular and mobile networks. The costs and sizes of M2M devices with integrated sensors, network interfaces and enhanced power capabilities have decreased significantly in recent years. Therefore, it is anticipated that M2M devices might outnumber conventional mobile devices in the near future. 3GPP standards like LTE-A have primarily been developed for broadband data services with mobility support. However, M2M applications are mostly based on narrowband traffic, and these standards may not achieve overall spectrum and cost efficiency if they are used to serve M2M applications. The main goal of this thesis is to take advantage of the low cost, low power and small size of RNs for integrating M2M traffic into LTE-A networks. A new RN design is presented for aggregating and multiplexing M2M traffic at the RN before transmission over the air interface (Un interface) to the base station, called the eNodeB. The data packets of the M2M devices are sent to the RN over the Uu interface.
Packets from different devices are aggregated at the Packet Data Convergence Protocol (PDCP) layer of the Donor eNodeB (DeNB) into a single large IP packet instead of several small IP packets. Therefore, the amount of overhead data can be significantly reduced. The proposed concept has been developed in an LTE-A network simulator to illustrate the benefits of M2M traffic aggregation and multiplexing at the RN. The potential gains of RNs, such as coverage enhancement, multiplexing gain and end-to-end delay performance, are illustrated with the help of simulation results. The results indicate that the proposed concept improves the performance of the LTE-A network with M2M traffic. The adverse impact of M2M traffic on regular LTE-A traffic such as voice and file transfer is minimized. Furthermore, the cell edge throughput and QoS performance are enhanced. Moreover, the results are validated with the help of an analytical model.
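The overhead saving from bundling many small M2M packets into one larger IP packet can be sized with a back-of-the-envelope calculation. The 40-byte per-packet header and the packet counts below are assumptions for illustration (a real stack also adds small per-packet multiplexing headers inside the bundle), not figures from the thesis:

```python
def air_interface_bytes(n_packets, payload_bytes, header_bytes=40):
    # Bytes sent without aggregation (one header per small packet)
    # versus with aggregation (one shared header for the whole bundle).
    # The 40-byte header is an assumed stand-in for an IP/transport
    # header, roughly an IPv6 header without extension headers.
    separate = n_packets * (header_bytes + payload_bytes)
    aggregated = header_bytes + n_packets * payload_bytes
    return separate, aggregated

# Assumed scenario: 50 M2M reports of 20 bytes each in one bundle.
sep, agg = air_interface_bytes(n_packets=50, payload_bytes=20)
saving = 1 - agg / sep   # fraction of air-interface bytes saved
```

With these assumed numbers roughly two thirds of the transmitted bytes are headers that aggregation removes, which is why the gain grows with the number of small-payload devices behind the RN.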

    Scheduling and Link Adaptation for Uplink SC-FDMA Systems - A LTE Case Study


    Lightly synchronized Multipacket Reception in Machine-Type Communications Networks

    Machine Type Communication (MTC) applications were designed to monitor and control elements of our surroundings and environment. MTC applications have a different set of requirements compared to traditional communication devices, with Machine to Machine (M2M) data being mostly short, asynchronous and bursty, and sometimes requiring end-to-end delays below 1 ms. With the growth of MTC, the new generation of mobile communications has to be able to offer different types of services with very different requirements: the same network has to be capable of supplying a connection to the user who just wants to download a video or use social media, while at the same time supporting MTC with completely different requirements, without deteriorating either experience. The challenges associated with the implementation of MTC require disruptive changes at the Physical (PHY) and Medium Access Control (MAC) layers that lead to better use of the available spectrum. The orthogonality and synchronization requirements of the PHY layer of the current Long Term Evolution Advanced (LTE-A) radio access network (based on OFDM and Single Carrier Frequency Domain Equalization (SC-FDE)) are obstacles for the new 5th Generation (5G) architecture. Generalized Frequency Division Multiplexing (GFDM) and other modulation techniques were proposed as candidates for the 5G PHY layer; however, they also suffer from visible degradation when the transmitter and receiver are not synchronized, leading to poor performance when collisions occur in an asynchronous MAC layer. This dissertation addresses the requirements of M2M traffic at the MAC layer, applying multipacket reception (MPR) techniques to handle the bursty nature of the traffic, and synchronization tones and optimized back-off approaches to reduce the delay. It proposes a new MAC protocol and analyses its performance analytically, considering an SC-FDE modulation.
The models are validated using a system-level cross-layer simulator developed in MATLAB, which implements the MAC protocol and applies PHY layer performance models. The results show that the MAC's latency depends mainly on the number of users and the load of each user, and can be controlled using these two parameters.
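The benefit of multipacket reception at the MAC layer can be illustrated with a simple slotted random-access model: if the PHY can decode up to K simultaneous packets, a slot is lost only when more than K devices collide. This is a generic textbook-style model with assumed parameters, not the dissertation's protocol:

```python
import math

def mpr_throughput(n_users, p_tx, k_mpr):
    # Expected successfully decoded packets per slot when each of
    # n_users transmits independently with probability p_tx and the
    # receiver decodes a slot only if at most k_mpr packets overlap.
    total = 0.0
    for k in range(1, k_mpr + 1):
        p_k = math.comb(n_users, k) * p_tx**k * (1 - p_tx)**(n_users - k)
        total += k * p_k
    return total

# Assumed load: 30 devices, each transmitting in 10% of the slots.
s_single = mpr_throughput(30, 0.1, 1)   # classic single-packet reception
s_mpr4 = mpr_throughput(30, 0.1, 4)     # PHY decodes up to 4 packets
```

Under this assumed load, K = 4 lifts the expected throughput by more than an order of magnitude over single-packet reception, which is why MPR is attractive for bursty, uncoordinated M2M arrivals.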

    Multi-Cell Uplink Radio Resource Management. A LTE Case Study


    Spectrally and Energy Efficient Wireless Communications: Signal and System Design, Mathematical Modelling and Optimisation

    This thesis explores engineering studies and designs aiming to meet the requirements of enhanced capacity and energy efficiency for next generation communication networks. The challenges of spectrum scarcity and energy constraints are addressed, and new technologies are proposed, analytically investigated and examined. The thesis commences by reviewing studies on spectrally and energy-efficient techniques, with a special focus on non-orthogonal multicarrier modulation, particularly spectrally efficient frequency division multiplexing (SEFDM). Rigorous theoretical and mathematical modelling studies of SEFDM are presented. Moreover, to address the potential application of SEFDM under 5th generation new radio (5G NR) heterogeneous numerologies, simulation-based studies of SEFDM coexisting with orthogonal frequency division multiplexing (OFDM) are conducted. New signal formats and a corresponding transceiver structure are designed, using a Hilbert transform filter pair for shaping pulses. Detailed modelling and numerical investigations show that the proposed signal doubles spectral efficiency without performance degradation, with studies of two signal formats: uncoded narrow-band internet of things (NB-IoT) signals and unframed turbo-coded multi-carrier signals. The thesis also considers using constellation shaping techniques and SEFDM for capacity enhancement in 5G systems. Probabilistic shaping for SEFDM is proposed and modelled to show both transmission energy reduction and bandwidth saving, with advantageous flexibility for data rate adaptation. Expanding on constellation shaping to improve performance further, a comparative study of multidimensional modulation techniques is carried out. A four-dimensional signal with better noise immunity is investigated, for which metaheuristic optimisation algorithms are studied, developed and applied to optimise the bit-to-symbol mapping.
Finally, a specially designed machine learning technique for signal and system design in physical layer communications is proposed, utilising autoencoder-based end-to-end learning. Multidimensional signal modulation with multidimensional constellation shaping is proposed and optimised using machine learning techniques, demonstrating significant improvements in spectral and energy efficiency.
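The core trade-off of SEFDM described above can be sketched by computing the inner product between two subcarriers when the spacing is compressed by a factor alpha < 1: orthogonality (zero cross-talk at alpha = 1) is traded for bandwidth, at the cost of self-interference. The subcarrier count and compression factor are illustrative assumptions:

```python
import cmath
import math

def subcarrier_crosstalk(k, l, n_samples, alpha):
    # Normalised inner product of subcarriers k and l over one symbol
    # when the subcarrier spacing is compressed by the factor alpha.
    acc = sum(cmath.exp(2j * math.pi * alpha * (k - l) * n / n_samples)
              for n in range(n_samples))
    return abs(acc) / n_samples

# alpha = 1 reproduces orthogonal OFDM; alpha = 0.8 packs the same
# subcarriers into 80% of the bandwidth at the cost of self-interference.
c_ofdm = subcarrier_crosstalk(0, 1, 64, 1.0)
c_sefdm = subcarrier_crosstalk(0, 1, 64, 0.8)
```

The non-zero cross-talk at alpha = 0.8 is exactly the self-created interference that the detection and shaping techniques surveyed in the thesis must overcome.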

    An Uplink UE Group-Based Scheduling Technique for 5G mMTC Systems Over LEO Satellite

    Narrowband Internet of Things (NB-IoT) is one of the most promising IoT technologies for supporting the massive machine-type communication (mMTC) scenarios of the fifth generation of mobile communications (5G). While the aim of this technology is to provide global coverage to low-cost IoT devices distributed all over the globe, the vital role of satellites in complementing and extending the terrestrial IoT network in remote or under-served areas has been recognized. In the context of global IoT networks, low Earth orbit (LEO) satellites would be beneficial due to their smaller propagation signal loss, which is of utmost importance for low-complexity, low-power and cheap IoT devices to close the link budget. However, while this lessens the problems of large delay and signal loss in geostationary (GEO) orbit, it comes at the cost of increased Doppler effects. In this paper, we propose an uplink scheduling technique for a LEO satellite-based mMTC NB-IoT system that is able to mitigate the differential Doppler down to a value tolerable by the IoT devices. The performance of the proposed strategy is validated through numerical simulations, and the achievable data rates of the considered scenario are shown in order to emphasize the limitations of such systems arising from the presence of a satellite channel.
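The scale of the LEO Doppler problem can be sized with a simple circular-orbit calculation; the carrier frequency, altitude, and the cos(elevation) radial-velocity approximation below are illustrative assumptions, not the paper's model (differential Doppler would be the difference between two such shifts for users at different elevations):

```python
import math

def leo_doppler_hz(f_carrier_hz, altitude_m, elevation_rad):
    # Circular-orbit speed from Earth's gravitational parameter, then
    # a simplified geometry in which the radial velocity component is
    # v_sat * cos(elevation) -- an approximation for illustration.
    mu = 3.986004418e14          # m^3/s^2, Earth's GM
    r_earth = 6_371_000.0        # m, mean Earth radius
    c = 299_792_458.0            # m/s, speed of light
    v_sat = math.sqrt(mu / (r_earth + altitude_m))
    return f_carrier_hz * (v_sat / c) * math.cos(elevation_rad)

# Assumed scenario: ~2 GHz NB-IoT carrier, 600 km orbit, 10 deg elevation.
f_d = leo_doppler_hz(2.0e9, 600e3, math.radians(10))
```

Even this rough estimate lands in the tens of kilohertz, orders of magnitude beyond what low-cost NB-IoT oscillators tolerate, which motivates the group-based scheduling proposed in the paper.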