
    A Robust Maximum Likelihood Scheme for PSS Detection and Integer Frequency Offset Recovery in LTE Systems

    Before establishing a communication link in a cellular network, the user terminal must activate a synchronization procedure called initial cell search in order to acquire specific information about the serving base station. To accomplish this task, the primary synchronization signal (PSS) and secondary synchronization signal (SSS) are periodically transmitted in the downlink of a Long Term Evolution (LTE) network. Since SSS detection can be performed only after successful identification of the primary signal, in this work we present a novel algorithm for joint PSS detection, sector index identification, and integer frequency offset (IFO) recovery in an LTE system. The proposed scheme relies on the maximum likelihood (ML) estimation criterion and exploits a suitable reduced-rank representation of the channel frequency response, which proves robust against multipath distortions and residual timing errors. We show that a number of PSS detection methods that were originally introduced through heuristic reasoning can be derived from our ML framework by simply selecting an appropriate model for the channel gains over the PSS subcarriers. Numerical simulations indicate that the proposed scheme can be effectively applied in the presence of severe multipath propagation, where existing alternatives provide unsatisfactory performance.
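
    As a rough illustration of the search this abstract describes, the sketch below implements one of the heuristic special cases it alludes to: a differential correlation metric evaluated over the three LTE root indices and a set of integer frequency offset hypotheses. The PSS sequence definition follows 3GPP TS 36.211; the function and parameter names (detect_pss, Y, max_ifo) are illustrative assumptions, and this is a simple heuristic baseline, not the paper's reduced-rank ML detector.

```python
import numpy as np

def zadoff_chu_pss(u):
    """Length-62 LTE PSS sequence for root index u (3GPP TS 36.211)."""
    n = np.arange(31)
    d = np.empty(62, dtype=complex)
    d[:31] = np.exp(-1j * np.pi * u * n * (n + 1) / 63)
    d[31:] = np.exp(-1j * np.pi * u * (n + 32) * (n + 33) / 63)
    return d

def detect_pss(Y, max_ifo=5):
    """Joint sector-root and integer-CFO search via differential correlation.

    Y : frequency-domain samples of the candidate PSS OFDM symbol, indexed
        so that Y[max_ifo] is the nominal first PSS subcarrier
        (len(Y) >= 62 + 2 * max_ifo).
    Returns (best_root, best_ifo, metric).
    """
    best = (None, None, -np.inf)
    for u in (25, 29, 34):                       # the three LTE sector roots
        d = zadoff_chu_pss(u)
        for nu in range(-max_ifo, max_ifo + 1):  # IFO hypotheses (subcarriers)
            x = Y[max_ifo + nu : max_ifo + nu + 62] * d.conj()
            # Differential combining of adjacent subcarriers: insensitive to
            # a slowly varying channel response, hence some multipath tolerance.
            metric = np.abs(np.sum(x[1:] * x[:-1].conj()))
            if metric > best[2]:
                best = (u, nu, metric)
    return best
```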

    Low-power Physical-layer Design for LTE-Based Very Narrowband IoT (VNB-IoT) Communication

    With the new-age Internet of Things (IoT) revolution, there is a need to connect a wide range of devices with varying throughput and performance requirements. In this thesis, a wireless system is proposed that targets very low-power, delay-insensitive IoT applications with low throughput requirements. The low-cost receivers for such devices will have very low complexity, consume very little power, and hence will run for several years. Long Term Evolution (LTE) is a standard developed and administered by the 3rd Generation Partnership Project (3GPP) for high-speed wireless communications for mobile devices. As part of Release 13, another standard called narrowband IoT (NB-IoT) was introduced by 3GPP to serve the needs of IoT applications with low throughput requirements. Working along similar lines, this thesis proposes another LTE-based solution called very narrowband IoT (VNB-IoT), which further reduces the complexity and power consumption of the user equipment (UE) while maintaining the base station (BS) architecture as defined in NB-IoT. In the downlink operation, the transmitter of the proposed system uses the NB-IoT resource block with each subcarrier modulated with data symbols intended for a different user. On the receiver side, each UE locks to a particular subcarrier frequency instead of the entire resource block and operates as a single-carrier receiver. On the uplink, the system uses single-tone transmission as specified in the NB-IoT standard. Performance of the proposed system is analyzed in an additive white Gaussian noise (AWGN) channel, followed by an analysis of the inter-carrier interference (ICI). The relationship between the overall filter bandwidth and ICI is established towards the end.
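
    A minimal sketch of the downlink receiver idea described above: because each UE is assigned a single subcarrier, demodulation reduces to evaluating one DFT bin per OFDM symbol, i.e. an O(N) correlation with a complex tone instead of a full FFT, which is what makes the single-carrier receiver cheap. The function and parameter names are mine, and perfect timing and cyclic-prefix removal are assumed for brevity; the thesis's ICI analysis is not reproduced here.

```python
import numpy as np

def single_subcarrier_rx(r, k, N, n_sym):
    """Recover the symbols on subcarrier k only, avoiding a full FFT.

    r     : received baseband samples (CP already removed), length n_sym * N
    k     : subcarrier index assigned to this UE
    N     : OFDM FFT size of the downlink (illustrative assumption)
    n_sym : number of OFDM symbols to demodulate
    """
    # Correlating each symbol with one complex exponential is exactly the
    # evaluation of a single DFT bin: sum_n r[n] * exp(-j 2 pi k n / N).
    tone = np.exp(-2j * np.pi * k * np.arange(N) / N)
    sym = r[:n_sym * N].reshape(n_sym, N)
    return sym @ tone / N
```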

    Observed time difference of arrival based position estimation for LTE systems: simulation framework and performance evaluation

    Precise user equipment (UE) location is paramount for the reliable operation of location-based services provided by mobile network operators and other emerging applications. In this paper, LTE network positioning performance based on the mobile-assisted Observed Time Difference of Arrival (OTDoA) method is considered. The reference signal time difference (RSTD) measurements are estimated by the UE using the dedicated positioning reference signal (PRS) transmitted in the downlink frame, and the reported time measurements are used by the network for location calculation. A simulation framework for position estimation in LTE networks is presented, in which the LTE downlink communication link is implemented. A correlation-based method for time-of-arrival measurement is used in the implementation of OTDoA. The simulation framework provides different configurations and adjustments of the system and network parameters for evaluating the performance of LTE positioning using OTDoA over multipath fading channels. Different simulation scenarios are conducted to identify the influence of various parameters of the LTE system and the positioning procedure setup on positioning accuracy. Simulation results demonstrate that positioning accuracy is strongly affected by the channel fading conditions, as the accuracy of time-of-arrival measurements deteriorates in severe fading environments; however, positioning accuracy can be significantly improved by increasing the number of positioning sequences involved in the estimation process, in either the frequency domain or the time domain.
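
    The two computational steps the abstract describes, correlation-based time-of-arrival measurement and network-side position calculation from the reported RSTD values, can be sketched as follows. The hyperbolic solver shown here (a plain unweighted Gauss-Newton iteration in 2D) is an illustrative stand-in rather than the paper's exact location algorithm, and all function and parameter names are assumptions.

```python
import numpy as np

def toa_samples(rx, prs):
    """Correlation-based time-of-arrival estimate (in samples): the index
    of the cross-correlation peak between the received signal and the
    local PRS replica (np.correlate conjugates its second argument)."""
    corr = np.abs(np.correlate(rx, prs, mode="valid"))
    return int(np.argmax(corr))

def otdoa_position(bs, rstd_m, x0, n_iter=20):
    """Gauss-Newton solve of the hyperbolic RSTD equations in 2D.

    bs     : (M, 2) base-station coordinates, bs[0] is the reference cell
             (M >= 3 is needed to fix a 2D position)
    rstd_m : (M-1,) measured range differences d_i - d_0 in metres,
             i.e. RSTD_i times the speed of light
    x0     : (2,) initial position guess
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        d = np.linalg.norm(bs - x, axis=1)         # distances to all cells
        r = (d[1:] - d[0]) - rstd_m                # hyperbolic residuals
        # Jacobian of (d_i - d_0) with respect to x
        J = (x - bs[1:]) / d[1:, None] - (x - bs[0]) / d[0]
        x -= np.linalg.lstsq(J, r, rcond=None)[0]  # Gauss-Newton step
    return x
```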

    Mobility Analysis and Management for Heterogeneous Networks

    Global mobile data traffic has increased tremendously in the last decade due to the technological advancement of smartphones. Their endless usage and bandwidth-intensive applications will saturate current 4G technologies and have motivated the need for concrete research in order to sustain the mounting data traffic demand. In this regard, network densification has emerged as a promising direction to cope with the capacity demands of future 5G wireless networks. The basic idea is to deploy several low-power radio access nodes called small cells closer to the users within the existing large radio footprint of macrocells, and this constitutes a heterogeneous network (HetNet). However, operators face many challenges with dense HetNet deployment. Mobility management becomes a challenging task due to the triggering of frequent handovers as a user moves across network coverage areas. When only a few users are associated with certain small cells, keeping those cells fully active leads to a significant increase in energy consumption. Intelligently switching them to low-energy-consumption modes or turning them off without seriously degrading user performance is desirable in order to improve energy savings in HetNets. This dynamic power-level switching in the small cells, however, may cause unnecessary handovers, and it becomes important to ensure energy savings without compromising handover performance. Finally, it is important to evaluate mobility management schemes in real network deployments in order to find any problems affecting the quality of service (QoS) of the users. The research presented in this dissertation aims to address these challenges. First, to tackle the mobility management issue, we develop a closed-form analytical model to study handover and ping-pong performance as a function of network parameters in the small cells, and verify its performance using simulations. Secondly, we incorporate a fuzzy-logic-based game-theoretic framework to address and examine energy efficiency improvements in HetNets. In addition, we design fuzzy inference rules for handover decisions, and target base station selection is performed through a fuzzy ranking technique in order to enhance mobility robustness while also considering energy/spectral efficiency. Finally, we evaluate mobility performance by carrying out drive tests in an existing 4G Long Term Evolution (LTE) network deployment using software-defined radios (SDRs). This helps to obtain network quality information in order to find any problems affecting the QoS of the users.
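
    For concreteness, the sketch below simulates the kind of hysteresis plus time-to-trigger handover rule (an A3-style event: neighbour RSRP exceeds serving RSRP by a hysteresis margin for a full time-to-trigger window) whose handover and ping-pong counts such an analytical model characterizes. The discrete-time simplifications, the default thresholds, and the ping-pong window t_pp are illustrative assumptions, not the dissertation's model or the fuzzy framework it develops.

```python
import numpy as np

def simulate_handovers(rsrp, hyst_db=3.0, ttt=4, t_pp=10):
    """Count handovers and ping-pongs along one UE trajectory.

    rsrp    : (T, n_cells) array of RSRP samples along the route (dB)
    hyst_db : A3 hysteresis margin; ttt : time-to-trigger in samples
    t_pp    : a handover back to the previous cell within t_pp samples
              is counted as a ping-pong
    """
    serving = int(np.argmax(rsrp[0]))
    prev, t_last, timer = None, -t_pp, 0
    handovers = pingpongs = 0
    for t in range(1, len(rsrp)):
        target = int(np.argmax(rsrp[t]))
        # A3 entering condition (simplified: the best neighbour may change
        # during the TTT window without resetting the timer)
        if target != serving and rsrp[t, target] > rsrp[t, serving] + hyst_db:
            timer += 1
        else:
            timer = 0
        if timer >= ttt:                 # condition held for the full TTT
            handovers += 1
            if target == prev and t - t_last < t_pp:
                pingpongs += 1           # bounced straight back: ping-pong
            prev, serving, t_last, timer = serving, target, t, 0
    return handovers, pingpongs
```

    Sweeping hyst_db and ttt in such a simulation is the usual way to trade handover rate against ping-pong rate, which is the trade-off the dissertation's closed-form model captures analytically.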