6 research outputs found

    Theoretical Analysis and Performance Comparison of Multi-Carrier Waveforms for 5G Wireless Applications

    5G wireless technology is a new wireless communication system that must meet several complementary needs: high data rates for mobile services, low energy consumption and long range for connected objects, low latency to ensure real-time communication for critical applications, and high spectral efficiency to improve overall system capacity. The waveforms and their associated signal processing present a real implementation challenge for each generation of wireless communication networks. This paper presents the diverse candidate waveforms for 5G systems, including Constant Envelope OFDM (CE-OFDM), Filter-Bank Multi-Carrier (FBMC), Universal Filtered Multi-Carrier (UFMC) and Filtered OFDM (F-OFDM). In this work, simulations are carried out to compare the performance of OFDM, CE-OFDM, F-OFDM, UFMC and FBMC in terms of power spectral density (PSD) and bit error rate (BER). It is demonstrated that CE-OFDM constitutes a more energy-efficient solution than the OFDM signal, while F-OFDM, UFMC and FBMC constitute more efficient solutions in terms of power spectral density, spectral efficiency and bit error rate. In fact, CE-OFDM reduces the peak-to-average power ratio (PAPR) associated with the OFDM system, FBMC improves the out-of-band (OOB) characteristics by filtering each subcarrier and resisting inter-carrier interference (ICI), and UFMC offers a higher spectral efficiency compared to OFDM.
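The PAPR advantage of CE-OFDM claimed above can be illustrated with a minimal sketch: an OFDM symbol is generated as the IFFT of random QPSK data, then re-used as the phase of a constant-envelope carrier. The subcarrier count and modulation index are assumed values for illustration, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# OFDM: IFFT of random QPSK symbols on 256 subcarriers (assumed size)
n = 256
qpsk = (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)
ofdm = np.fft.ifft(qpsk) * np.sqrt(n)   # scaled to unit average power

# CE-OFDM: the real OFDM waveform phase-modulates a constant-envelope carrier
h = 0.5  # modulation index (illustrative)
ce = np.exp(1j * 2 * np.pi * h * np.real(ofdm))

print(f"OFDM PAPR:    {papr_db(ofdm):.1f} dB")
print(f"CE-OFDM PAPR: {papr_db(ce):.1f} dB")  # 0 dB by construction
```

The constant envelope means the power amplifier can operate near saturation without clipping, which is the energy-efficiency argument in the abstract.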

    THROUGHPUT OPTIMIZATION AND ENERGY EFFICIENCY OF THE DOWNLINK IN THE LTE SYSTEM

    Nowadays, smartphones are very popular, and more and more people access the Internet with them. This demands higher data rates from the mobile network operators: every year, the number of users and the amount of information increase dramatically. Wireless technology should ensure data rates high enough to compete with wired technology, while retaining the main advantage of a wireless system: user mobility. The 4G LTE system made very high peak data rates possible. The purpose of this thesis was to investigate the improvement of downlink system performance based on different antenna configurations and different scheduling algorithms. Moreover, the fairness between users under the different schedulers has been analyzed and evaluated. Furthermore, the energy efficiency of the scheduling algorithms in the downlink of LTE systems has been considered. The important parts of the LTE system are described in the theoretical part of this thesis.
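Scheduler fairness of the kind evaluated in this thesis is commonly quantified with Jain's fairness index. The sketch below compares a perfectly fair allocation with a channel-opportunistic one; the per-user throughput figures are hypothetical, not results from the thesis.

```python
def jain_index(throughputs):
    """Jain's fairness index: 1.0 = perfectly fair, 1/n = one user gets everything."""
    s = sum(throughputs)
    sq = sum(t * t for t in throughputs)
    return s * s / (len(throughputs) * sq)

# Hypothetical per-user downlink throughputs (Mbit/s) under two schedulers
round_robin = [5.0, 5.0, 5.0, 5.0]   # equal shares for all users
max_ci      = [18.0, 1.5, 0.4, 0.1]  # favours the user with the best channel

print(jain_index(round_robin))  # 1.0
print(jain_index(max_ci))       # well below 1: high cell throughput, poor fairness
```

The usual trade-off is visible here: a max-C/I style scheduler maximises aggregate throughput at the cost of fairness, while round robin does the opposite; proportional-fair schedulers sit in between.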

    Technologies to improve the performance of wireless sensor networks in high-traffic applications

    The expansion of wireless sensor networks into advanced areas, including structural health monitoring, multimedia surveillance, and health care monitoring applications, has resulted in new and complex problems. Traditional sensor systems are designed and optimised for extremely low traffic loads. However, it has been observed that network performance drops rapidly under the higher traffic loads common in advanced applications. In this thesis, we examine the system characteristics and new system requirements of these advanced sensor network applications. Based on this analysis, we propose an improved architecture for wireless sensor systems that increases network performance while maintaining compatibility with the essential WSN requirements: low power, low cost, and distributed scalability. We propose a modified architecture deriving from the IEEE 802.15.4 standard, which is shown to significantly increase network performance in applications generating increased data loads. This is achieved by introducing the possibility of independently allocating the sub-carriers in a distributed manner. As a result, the overall efficiency of the channel contention mechanism is increased to deliver higher throughput with lower energy consumption. Additionally, we develop the concept of increasing data transmission efficiency by adapting the spreading code length to the wireless environment. Such a modification not only delivers higher throughput but also maintains a reliable wireless link in a harsh RF environment. Finally, we propose the use of the battery recovery effect to increase the power efficiency of the system under heavy traffic load conditions. These three innovations minimise the contention window period while maximising the capacity of the available channel, which is shown to increase network performance in terms of energy efficiency, throughput and latency.
The proposed system is shown to be backwards compatible and able to satisfy both traditional and advanced applications, and is particularly suitable for deployment in harsh RF environments. Experiments and analytic techniques have been described and developed to produce performance metrics for all the proposed techniques.
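The idea of adapting the spreading code length to the channel can be sketched as a simple rule: each doubling of the spreading factor adds about 3 dB of processing gain but halves the bit rate, so the shortest code that still meets a link-quality target maximises throughput. The chip rate, target, and factor set below are assumed for illustration, not values from the thesis.

```python
import math

CHIP_RATE = 2_000_000  # chips/s, an 802.15.4-like PHY rate (assumed figure)

def adapt_spreading_factor(snr_db, target_ebn0_db=10.0,
                           factors=(2, 4, 8, 16, 32, 64)):
    """Pick the shortest spreading code whose processing gain meets the target.

    A spreading factor sf adds 10*log10(sf) dB of processing gain but
    divides the bit rate by sf, so the first sufficient factor wins.
    Returns (spreading factor, resulting bit rate in bit/s).
    """
    for sf in factors:
        gain_db = 10 * math.log10(sf)
        if snr_db + gain_db >= target_ebn0_db:
            return sf, CHIP_RATE // sf
    return factors[-1], CHIP_RATE // factors[-1]  # harshest case: longest code

print(adapt_spreading_factor(8.0))   # clean link: short code, high bit rate
print(adapt_spreading_factor(-5.0))  # harsh link: long code, low bit rate
```

This captures the throughput/robustness trade-off the abstract describes: the link stays reliable in a harsh RF environment by paying with data rate only when necessary.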

    Modeling and Digital Mitigation of Transmitter Imperfections in Radio Communication Systems

    To satisfy the continuously growing demands for higher data rates, modern radio communication systems employ larger bandwidths and more complex waveforms. Furthermore, radio devices are expected to support a rich mixture of standards such as cellular networks, wireless local-area networks, wireless personal-area networks, positioning and navigation systems, etc. In general, a "smart" device should be flexible enough to support all these requirements while being portable, cheap, and energy efficient. These seemingly conflicting expectations impose stringent radio frequency (RF) design challenges which, in turn, call for their proper understanding as well as for cost-effective solutions to address them. The direct-conversion transceiver architecture is an appealing analog front-end for flexible and multi-standard radio systems. However, it is sensitive to various circuit impairments, and modern communication systems based on multi-carrier waveforms such as Orthogonal Frequency Division Multiplexing (OFDM) and Orthogonal Frequency Division Multiple Access (OFDMA) are particularly vulnerable to RF front-end non-idealities. This thesis addresses the modeling and digital mitigation of selected transmitter (TX) RF impairments in radio communication devices. The contributions can be divided into two areas. First, new modeling and digital mitigation techniques are proposed for two essential front-end impairments in direct-conversion architecture-based OFDM and OFDMA systems, namely in-phase and quadrature (I/Q) imbalance and carrier frequency offset (CFO). Both joint and de-coupled estimation and compensation schemes for frequency-selective TX I/Q imbalance and channel distortions are proposed for OFDM systems, to be adopted on the receiver side.
Then, in the context of uplink OFDMA and Single-Carrier FDMA (SC-FDMA), which are the air interface technologies of the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) and LTE-Advanced systems, joint estimation and equalization techniques for RF impairments and channel distortions are proposed. Here, the challenging multi-user uplink scenario with unequal received power levels is investigated, where I/Q imbalance causes inter-user interference. A joint mirror-subcarrier-processing-based minimum mean-square error (MMSE) equalizer with an arbitrary number of receiver antennas is formulated to effectively handle mirror sub-band users of different power levels. Furthermore, the joint channel and impairment filter responses are efficiently approximated with polynomial-based basis function models, and the parameters of the basis functions are estimated with reference signals conforming to the LTE uplink sub-frame structure. The resulting receiver concept adopting the proposed techniques enables improved link performance without modifying the design of RF transceivers. Second, digital baseband mitigation solutions are developed for the TX leakage signal-induced self-interference in frequency division duplex (FDD) transceivers. In FDD transceivers, a duplexer is used to connect the TX and receiver (RX) chains to a common antenna while also providing isolation to the receiver chain against the powerful transmit signal. In general, the continuous miniaturization of hardware and the adoption of larger bandwidths through carrier-aggregation-type noncontiguous allocations complicate achieving sufficient TX-RX isolation. Here, two different effects of the transmitter leakage signal are investigated.
The first is TX out-of-band (OOB) emissions and TX spurious emissions at the device's own receiver band, due to transmitter nonlinearity; the second is nonlinearity of the down-converter in the RX, which generates second-order intermodulation distortion (IMD2) due to the TX in-band leakage signal. This work shows that the transmitter leakage signal-induced interference depends on an equivalent leakage channel that models the TX path non-idealities, the duplexer filter responses, and the RX path non-idealities. The work proposes algorithms that operate in the digital baseband of the transceiver to estimate the TX-RX non-idealities and the duplexer filter responses, and to subsequently regenerate and cancel the self-interference, thereby potentially relaxing the TX-RX isolation requirements as well as increasing transceiver flexibility. Overall, this thesis provides useful signal models to understand the implications of different RF non-idealities and proposes compensation solutions to cope with certain RF impairments. This is complemented with extensive computer simulations and practical RF measurements to validate their application in real-world radio transceivers.
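The frequency-flat version of the I/Q imbalance problem treated in this thesis is commonly written as a widely-linear model y = g1·x + g2·conj(x), where the conj(x) term is the mirror-frequency image. The sketch below shows this model, its image rejection ratio (IRR), and the standard digital compensation that cancels the mirror term; the mismatch values are illustrative, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def irr_db(g1, g2):
    """Image rejection ratio of the widely-linear model y = g1*x + g2*conj(x)."""
    return 10 * np.log10(abs(g1) ** 2 / abs(g2) ** 2)

# Frequency-flat I/Q imbalance (illustrative mismatch values)
g, phi = 1.05, np.deg2rad(3)           # 5 % gain and 3 degree phase mismatch
g1 = (1 + g * np.exp(1j * phi)) / 2    # direct-signal coefficient
g2 = (1 - g * np.exp(1j * phi)) / 2    # mirror-image coefficient

x = (rng.standard_normal(1000) + 1j * rng.standard_normal(1000)) / np.sqrt(2)
y = g1 * x + g2 * np.conj(x)           # impaired baseband signal

# Digital compensation: choosing w = g2 / conj(g1) makes the conj(x)
# coefficient of z = y - w*conj(y) equal to g2 - w*conj(g1) = 0.
w = g2 / np.conj(g1)
z = y - w * np.conj(y)
resid = g2 - w * np.conj(g1)           # analytically zero

print(f"IRR before compensation: {irr_db(g1, g2):.1f} dB")
print(f"residual mirror coefficient: {abs(resid):.2e}")
```

In the thesis the coefficients are frequency selective (filters rather than scalars) and must be estimated from reference signals, but the cancellation principle is the same.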

    Handover Management Strategies in LTE-Advanced Heterogeneous Networks

    Doctoral Degree. University of KwaZulu-Natal, Durban. Meeting the increasing demand for data due to the proliferation of high-specification mobile devices in cellular systems has led to the improvement of the Long Term Evolution (LTE) framework into the LTE-Advanced system. Different aspects such as massive Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency Division Multiple Access (OFDMA), heterogeneous networks and Carrier Aggregation have been considered in LTE-Advanced to improve system performance. Small cells like femtocells and relays play a significant role in increasing the coverage and capacity of mobile cellular networks in the LTE-Advanced (LTE-A) heterogeneous network. However, user equipment (UE) faces more frequent handover problems in heterogeneous systems than in homogeneous systems, due to user mobility and densely populated cells. The objective of this research work is to analyse handover performance in the current LTE/LTE-A network and to propose various handover management strategies to handle the frequent handover problems in LTE-Advanced heterogeneous networks. To achieve this, an event-driven simulator written in C# was developed based on the 3GPP LTE/LTE-A standard to evaluate the proposed strategies. To start with, admission control, which is a major requirement during the handover initiation stage, is discussed, and this research work proposes a channel-borrowing admission control scheme for LTE-A networks. With this scheme in place, resources are better utilized and more calls are accepted than in conventional schemes where channel borrowing is not applied. Also proposed is an enhanced strategy for handover management in two-tier femtocell-macrocell networks.
The proposed strategy takes into consideration the speed of the user and other parameters in order to effectively reduce frequent and unnecessary handovers, as well as the ratio of target femtocells in the system. We also consider scenarios that will dominate future networks, where femtocells are densely deployed to handle very heavy traffic. To address this, a Call Admission Control (CAC)-based handover management strategy is proposed to manage handover in dense femtocell-macrocell integration in the LTE-A network. The handover probability, the handover call dropping probability and the call blocking probability are reduced considerably with the proposed strategy. Finally, handover management for mobile relays in a moving vehicle is considered, using a train as a case study. We propose a group handover strategy where the Mobile Relay Node (MRN) is integrated with a special mobile device called "mdev" to prepare the group information prior to the handover time. This is done to prepare the UE's group information and services for timely handover given the speed of the train. This strategy reduces the number of handovers and the call dropping probability in the moving vehicle. Publications and conferences are listed on pages iv-v.
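A speed-aware handover trigger of the general kind described above can be sketched as an A3-style comparison: the target cell must beat the serving cell by a hysteresis margin, and the margin is raised for fast-moving users so they do not ping-pong through small femtocells. All thresholds below are illustrative assumptions, not values from the thesis.

```python
def should_handover(serving_dbm, target_dbm, speed_kmh,
                    hysteresis_db=3.0, speed_threshold_kmh=60.0):
    """A3-style handover check with a speed-dependent hysteresis margin.

    The target cell's received power must exceed the serving cell's by
    the hysteresis margin; fast UEs get a larger margin so that brief
    femtocell coverage does not trigger unnecessary handovers.
    """
    if speed_kmh > speed_threshold_kmh:
        hysteresis_db += 3.0  # make handover harder for fast-moving UEs
    return target_dbm > serving_dbm + hysteresis_db

print(should_handover(-95.0, -90.0, speed_kmh=30))   # slow UE: 5 dB gain clears 3 dB margin
print(should_handover(-95.0, -90.0, speed_kmh=120))  # fast UE: margin is now 6 dB
```

A real LTE implementation would additionally apply a time-to-trigger before executing the handover; the thesis further combines such triggering with admission control and group handover for mobile relays.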