
    Adaptive Equalization Techniques with Imperfect Channel Estimates for Future 5G Systems

    Wireless communication networks have experienced continuous, exponential growth since their inception. The overwhelming demand for high data rates and for supporting a large number of users while mitigating disruptive interference has been a constant research focus, and it has led to the creation of new technologies and efficient techniques. Orthogonal frequency division multiplexing (OFDM) is the most common example of a technology that has come to the fore in the past decade, as it provides a simple and generally ideal platform for wireless data transmission. Its drawbacks of a rather high peak-to-average power ratio (PAPR) and sensitivity to phase noise led to the adoption of alternative techniques, such as single-carrier systems with frequency-domain equalization (SC-FDE) or multi-carrier systems with code division multiple access (MC-CDMA); among these, nonlinear frequency-domain equalizers (FDE) have received special attention due to their improved performance. Of these, the Iterative Block Decision Feedback Equalizer (IB-DFE) has proven especially promising due to its compatibility with space diversity, MIMO systems and CDMA schemes. However, the IB-DFE requires the system to have constant knowledge of the communication channel properties, that is, to have perfect Channel State Information (CSI) at all times, which is both unrealistic and impractical to implement. In this dissertation we design a modified IB-DFE receiver that is able to properly detect signals from SC-FDMA-based transmitters even when the channel state estimates are always in error. The results demonstrate that the proposed equalization scheme is robust to imperfect-CSI (I-CSI) situations, since its performance stays close to that of the perfect-CSI case within just a few iterations.
    Mestrado em Engenharia Eletrónica e Telecomunicações
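
    For context, the conventional IB-DFE frequency-domain structure, as it commonly appears in the literature (a minimal sketch; the dissertation's modified receiver for imperfect CSI is not reproduced here): at iteration i, the equalized frequency-domain samples are formed from the received samples Y_k and the DFT of the previous iteration's hard decisions as

```latex
\tilde{S}_k^{(i)} = F_k^{(i)} Y_k - B_k^{(i)} \hat{S}_k^{(i-1)},
\qquad
F_k^{(i)} = \frac{\kappa\, H_k^{*}}{\alpha + \bigl(1 - (\rho^{(i-1)})^{2}\bigr)\,\lvert H_k\rvert^{2}},
\qquad
B_k^{(i)} = \rho^{(i-1)}\bigl(F_k^{(i)} H_k - 1\bigr)
```

    where H_k is the channel frequency response (replaced by an imperfect estimate under I-CSI), alpha is the noise-to-signal power ratio, rho is the correlation between the transmitted symbols and the previous hard decisions, and kappa normalizes the average of F_k H_k over the block to one.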

    Improving LTE Network Performance after a Migration from CDMA2000 to LTE

    CDMA2000 technology has been widely used in the 450 MHz band. Recently, equipment availability and the improved performance offered by LTE have started driving operators to migrate their networks from CDMA2000 to LTE. The migration may leave the network performance in a suboptimal state. This thesis presents four methods to positively influence LTE network performance after a CDMA2000 to LTE migration, especially in the 450 MHz band. Three of the four presented methods are evaluated in a live network: cyclic prefix length, handover parameter optimization and uplink coordinated multipoint (CoMP) transmission. The objective was to determine the effectiveness of each method. The research methods included field measurements and network KPI collection. The results show that the normal cyclic prefix length is sufficient for LTE450 even though the cell radius may be up to 50 km; only in special cases should the cyclic prefix be extended, and operators should consider solving such problems individually instead of implementing the extended cyclic prefix network-wide. Handover parameter optimization turned out to be an important point of attention after the CDMA2000 to LTE migration. It was observed that if the handover parameters are not addressed, a significant number of unnecessary handovers may occur; in the initial situation, about 50% of the handovers in the network were estimated to be unnecessary. By adjusting the handover parameter values, 47.28% of the handovers per user were removed and no negative effects were detected. Coordinated multipoint transmission has been widely discussed as an effective way to improve LTE network performance, especially at the cell edges. Many challenges must be overcome before it can be applied to the downlink, and implementing it between cells belonging to different eNBs also involves challenges. Thus, only intra-site uplink CoMP transmission was tested. The results show that the performance improvements at the cell edges were significant, as theory predicted.
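
    The thesis does not list its parameter values here, so the following sketch is purely illustrative: it checks a simplified LTE Event A3 entering condition (neighbour RSRP minus hysteresis exceeds serving RSRP plus offset) over a time-to-trigger window, i.e. the kind of parameters a handover optimization adjusts. Function names and numbers are hypothetical.

```python
# Illustrative sketch (not from the thesis): a simplified LTE Event A3 handover
# trigger. The offset, hysteresis and time-to-trigger are typical knobs tuned
# during handover parameter optimization; the measurement series are made up.

def a3_event_triggered(serving_rsrp_dbm, neighbor_rsrp_dbm,
                       a3_offset_db=3.0, hysteresis_db=1.0,
                       time_to_trigger_samples=4):
    """Return True if the neighbour fulfils the (simplified) Event A3 entering
    condition Mn - Hys > Mp + Off for `time_to_trigger_samples` consecutive
    measurement periods."""
    consecutive = 0
    for mp, mn in zip(serving_rsrp_dbm, neighbor_rsrp_dbm):
        if mn - hysteresis_db > mp + a3_offset_db:
            consecutive += 1
            if consecutive >= time_to_trigger_samples:
                return True
        else:
            consecutive = 0
    return False

# Example: a neighbour that is only briefly stronger does not trigger a handover.
serving  = [-95, -95, -96, -96, -95, -95]
neighbor = [-90, -96, -97, -97, -96, -97]
print(a3_event_triggered(serving, neighbor))  # False
```

    Raising the offset or hysteresis, or lengthening the time-to-trigger, suppresses short-lived crossings and is one way unnecessary handovers can be removed.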

    IMPLEMENTATION AND PERFORMANCE ANALYSIS OF LONG TERM EVOLUTION USING SOFTWARE DEFINED RADIO

    The overwhelming changes in the field of communication brought about the need for high data rates, which led to the development of a system known as Long Term Evolution (LTE). LTE makes good use of Orthogonal Frequency Division Multiple Access (OFDMA) in its downlink and Single Carrier Frequency Division Multiple Access (SC-FDMA) in its uplink transmission because of their robust performance. These multiple access techniques, and their implementation in the LTE system, are the major focus of this thesis. GNU Radio is a Software Defined Radio (SDR) platform that comprises C++ signal processing libraries. For user simplicity, it provides a graphical user interface (GUI) known as GNU Radio Companion (GRC) for building signal processing flow graphs. GRC translates a flow graph into a Python program that calls the inbuilt C++ signal processing blocks. By leveraging this feature and the existing modules in GRC, OFDMA and SC-FDMA were implemented. In this study, the existing OFDMA flow graph of GNU Radio was used to study the behavior of the downlink, and a general SC-FDMA system was implemented with some modifications of the existing GNU Radio blocks. With the GNU Radio implementation, the working mechanism of both systems was tested. OFDMA is used in the downlink to achieve high spectral efficiency, and SC-FDMA was introduced in the uplink due to its low PAPR. These multiple access schemes have to meet the requirements of high throughput with low BER and PAPR, low delays and low complexity. In this thesis we focus on evaluating these multiple access techniques in terms of BER and PAPR with modulation schemes such as QPSK, 16-QAM and 64-QAM. The performance analysis is carried out in MATLAB.
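
    As a companion to the abstract, the sketch below (not the thesis's GNU Radio or MATLAB code; FFT size, subcarrier count and modulation are illustrative) shows the kind of PAPR comparison described: plain OFDMA symbols versus DFT-spread, SC-FDMA-style symbols.

```python
# Minimal NumPy sketch of an OFDMA vs. SC-FDMA (DFT-spread) PAPR comparison.
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers, n_fft, n_symbols = 64, 256, 2000

def qpsk(n):
    return (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

def papr_db(x):
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

ofdma_papr, scfdma_papr = [], []
for _ in range(n_symbols):
    data = qpsk(n_subcarriers)

    # OFDMA: map the QPSK symbols directly onto subcarriers, then IFFT.
    grid = np.zeros(n_fft, dtype=complex)
    grid[:n_subcarriers] = data
    ofdma_papr.append(papr_db(np.fft.ifft(grid)))

    # SC-FDMA: DFT-precode first (localized mapping), then IFFT.
    grid = np.zeros(n_fft, dtype=complex)
    grid[:n_subcarriers] = np.fft.fft(data) / np.sqrt(n_subcarriers)
    scfdma_papr.append(papr_db(np.fft.ifft(grid)))

print(f"mean PAPR  OFDMA:   {np.mean(ofdma_papr):.2f} dB")
print(f"mean PAPR  SC-FDMA: {np.mean(scfdma_papr):.2f} dB")
```

    The DFT-spread branch typically shows a PAPR a few dB lower, which is the property that motivates SC-FDMA in the uplink.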

    PROCESS FOR BREAKING DOWN THE LTE SIGNAL TO EXTRACT KEY INFORMATION

    The increasingly important role of Long Term Evolution (LTE) has raised security concerns among service providers and end users and made the security of the network even more indispensable. The main thrust of this thesis is to investigate whether the LTE signal can be broken down in a methodical way to obtain information that would otherwise be private, e.g., the Global Positioning System (GPS) location of the user equipment/base station or the identity (ID) of the user. The study made use of signal simulators and software to analyze the LTE signal and to develop a method to remove noise, break down the LTE signal and extract the desired information. From the simulation results, it was possible to extract key downlink information such as the Downlink Control Information (DCI), the Cell Radio Network Temporary Identifier (C-RNTI) and the Physical Cell Identity (Cell-ID). This information can be modified to cause service disruptions in the network within a reasonable amount of time and with modest computing resources.
    Defence Science and Technology Agency, Singapore. Approved for public release; distribution is unlimited.
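
    As background to the Cell-ID extraction mentioned above, in LTE the physical cell identity is composed of the two indices carried by the primary and secondary synchronization signals; the short sketch below shows that standard 3GPP composition (it is not code from the thesis).

```python
# LTE defines 504 physical cell identities: PCI = 3 * N_ID1 + N_ID2,
# where N_ID2 (0..2) is carried by the PSS and N_ID1 (0..167) by the SSS.

def physical_cell_id(n_id1: int, n_id2: int) -> int:
    assert 0 <= n_id1 <= 167 and 0 <= n_id2 <= 2
    return 3 * n_id1 + n_id2

print(physical_cell_id(n_id1=42, n_id2=1))  # 127
```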

    PAPR in the LTE Uplink: Problem and Improvement

    LTE-Advanced is one of the most competitive and widely adopted families of standards meeting the 4G broadband wireless mobile communications requirements recommended by IMT-Advanced for the terrestrial radio interface. Pre-commercial deployments have shown that LTE-Advanced will ensure the competitiveness of 4G mobile networks by providing a high-data-rate, low-latency and optimized system. Unlike IEEE 802.16m WiMAX, which uses OFDMA in both the downlink and uplink multiple access schemes, LTE and its advanced version use different multiple access transmissions: OFDMA is supported in the downlink and SC-FDMA in the uplink. The idea of using OFDMA in the LTE uplink provoked discord among the members of the 3GPP standardization body because of growing concern over signal peakiness, which degrades the efficiency of the mobile station's battery consumption. The dire consequence of the peak amplitudes generated by the superposition of several subcarriers with identical phases led 3GPP to adopt SC-FDMA as the uplink multiple access method. Thus, in this paper, the effect of pulse shaping on the uplink PAPR performance of distributed FDMA and localized FDMA is examined in depth. The performance improvement is achieved by varying the roll-off factor of the raised-cosine pulse-shaping filter applied after the IFFT.
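
    For reference, the PAPR metric and the roll-off/bandwidth relationship the paper varies can be written as follows (standard definitions, not taken from the paper itself):

```latex
\mathrm{PAPR}(x) \;=\; \frac{\max_n \lvert x[n]\rvert^{2}}{\mathbb{E}\bigl[\lvert x[n]\rvert^{2}\bigr]},
\qquad
\text{raised cosine: } H(f) = 0 \ \text{ for } \ \lvert f\rvert > \frac{1+\beta}{2T}
```

    Increasing the roll-off factor beta widens the occupied band beyond the Nyquist minimum 1/(2T) while relaxing the time-domain tails of the pulse, which is the excess-bandwidth versus PAPR trade-off the paper examines.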

    Timing and Carrier Synchronization in Wireless Communication Systems: A Survey and Classification of Research in the Last 5 Years

    Timing and carrier synchronization is a fundamental requirement for any wireless communication system to work properly. Timing synchronization is the process by which a receiver node determines the correct instants of time at which to sample the incoming signal. Carrier synchronization is the process by which a receiver adapts the frequency and phase of its local carrier oscillator with those of the received signal. In this paper, we survey the literature over the last 5 years (2010–2014) and present a comprehensive literature review and classification of the recent research progress in achieving timing and carrier synchronization in single-input single-output (SISO), multiple-input multiple-output (MIMO), cooperative relaying, and multiuser/multicell interference networks. Considering both single-carrier and multi-carrier communication systems, we survey and categorize the timing and carrier synchronization techniques proposed for the different communication systems focusing on the system model assumptions for synchronization, the synchronization challenges, and the state-of-the-art synchronization solutions and their limitations. Finally, we envision some future research directions
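
    To make the two synchronization problems concrete, a commonly used received-signal model (a generic textbook form, not tied to any particular paper in the survey) is

```latex
r[n] \;=\; e^{\,j\left(2\pi \epsilon n / N + \phi\right)}\, s[n-\tau] \;+\; w[n]
```

    where tau is the timing offset that timing synchronization must estimate, epsilon (in subcarrier spacings for multi-carrier systems) and phi are the carrier frequency and phase offsets that carrier synchronization must estimate and correct, and w[n] is noise.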

    Implementation of a Mobile Communication System for the Uplink

    Mestrado em Engenharia Electrónica e Telecomunicações
    It is clear that mobile Internet is increasingly present in everyday life. Nowadays it is relatively easy to be connected to the Internet whenever you want, no matter where you are (anytime and anywhere), and a growing number of users access interactive services and applications from their handsets. There is therefore a need to adapt the world of telecommunications to this new reality; to do so, new architectures must be implemented that provide higher bandwidth and reduce communication delays, maximizing the use of the available resources of the medium/network and thereby improving the end-user experience. LTE represents one of the most advanced and most relevant technologies for wireless broadband cellular access. OFDM is the base technology behind its modulation, as well as behind the related access schemes, OFDMA and SC-FDMA, used in LTE for downlink and uplink data communication, respectively. The use of multiple antennas at both ends further increases the spectral efficiency of the radio medium, allowing high data transmission rates. This dissertation covers the study, implementation and performance evaluation of the LTE physical layer (layer 1 of the OSI model), with the focus on uplink data communication and its modulation technique, SC-FDMA. A simulation platform based on the LTE UL specifications was implemented, in which different antenna schemes were considered. In particular, for the MIMO scheme, the space-frequency coding technique proposed by Alamouti was used. Several equalizers were also implemented. The simulation results demonstrate both the efficiency of the different modes of operation in terms of error rate and the correct operation of the mapping and equalization processes, which aim to improve the data reception rate.
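
    A minimal sketch of the Alamouti space-frequency pairing referred to above, assuming two transmit antennas and a channel that is approximately constant over two adjacent subcarriers k and k+1 (a generic textbook form, not necessarily the exact mapping adopted in the dissertation): antenna 1 transmits (s_1, -s_2*) and antenna 2 transmits (s_2, s_1*) on the subcarrier pair, and with per-antenna channel gains h_1, h_2 the receiver combines

```latex
y_k = h_1 s_1 + h_2 s_2 + n_k,\qquad
y_{k+1} = -h_1 s_2^{*} + h_2 s_1^{*} + n_{k+1},
\qquad
\hat{s}_1 = h_1^{*} y_k + h_2 y_{k+1}^{*},\qquad
\hat{s}_2 = h_2^{*} y_k - h_1 y_{k+1}^{*}
```

    so that each symbol estimate enjoys the full diversity gain (|h_1|^2 + |h_2|^2) with simple linear processing.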

    Peak-to-Average Power Ratio (PAPR) Reduction Techniques for Orthogonal Frequency Division Multiplexing (OFDM) Transmission

    Wireless communication has experienced incredible growth in the last decade. Two decades ago, the number of mobile subscribers was less than 1% of the world's population; as of 2011, it had increased tremendously to 79.86% of the world's population. Robust, high-rate data transmission in mobile environments faces severe problems due to time-variant channel conditions, multipath fading and shadow fading. Fading is the main limitation of wireless communication channels. Frequency-selective interference and fading, such as multipath fading, are a bandwidth bottleneck in the last mile, which runs from the access point to the user. The last-mile problem in wireless communication networks is caused by the free-space channel environment through which the signal propagates. Orthogonal Frequency Division Multiplexing (OFDM) is a promising modulation and multiplexing technique due to its robustness against multipath fading. Nevertheless, OFDM suffers from a high Peak-to-Average Power Ratio (PAPR), which makes the OFDM signal demanding to amplify efficiently. In this research, PAPR reduction is studied and presented, taking into account the out-of-band radiation and the regeneration of time-domain signal peaks caused by filtering. Our PAPR reduction was 30% for the Discrete Fourier Transform (DFT)-spread Interleaved Frequency Division Multiple Access (IFDMA) scheme utilizing Quadrature Phase Shift Keying (QPSK) with a varying roll-off factor. We show that pulse shaping does not affect the PAPR of Localized Frequency Division Multiple Access (LFDMA) as much as it affects the PAPR of IFDMA. IFDMA therefore exhibits an important trade-off between excess bandwidth and PAPR performance, since the excess bandwidth increases as the roll-off factor increases. In addition, we studied a low-complexity clipping scheme, applicable to IFDMA uplink and OFDM downlink systems, for PAPR reduction. We show that the PAPR performance of the interleaved FDMA scheme is better than that of traditional OFDMA for the uplink transmission system; our PAPR reduction is 53% when IFDMA is used instead of OFDMA in the uplink direction. Furthermore, we also examined an important trade-off between clipping distortion and quantization noise when the clipping scheme is used for OFDM downlink systems. Our results show a significant reduction in the PAPR and in the out-of-band radiation caused by clipping for the OFDM downlink transmission system.
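
    The sketch below illustrates the general idea of an amplitude-clipping PAPR reduction scheme of the kind the abstract studies (the clipping ratio, signal and function names are illustrative assumptions, not the thesis parameters): samples whose magnitude exceeds a threshold keep their phase but have their magnitude limited to that threshold.

```python
# Minimal envelope-clipping sketch for PAPR reduction (illustrative parameters).
import numpy as np

def clip_papr(x: np.ndarray, clipping_ratio_db: float) -> np.ndarray:
    """Clip the envelope of x at clipping_ratio_db above its RMS level."""
    rms = np.sqrt(np.mean(np.abs(x) ** 2))
    threshold = rms * 10 ** (clipping_ratio_db / 20)
    mag = np.abs(x)
    scale = np.minimum(1.0, threshold / np.maximum(mag, 1e-12))
    return x * scale  # phase preserved, magnitude limited

def papr_db(x: np.ndarray) -> float:
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Example on a random OFDM-like time-domain symbol.
rng = np.random.default_rng(1)
symbols = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
x = np.fft.ifft(symbols)
print(f"before: {papr_db(x):.2f} dB, after: {papr_db(clip_papr(x, 4.0)):.2f} dB")
```

    Clipping lowers the PAPR at the cost of in-band distortion and out-of-band radiation, which is why the trade-offs against filtering and quantization noise discussed above matter.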