3 research outputs found

    Coverage Enhancement of PBCH using Reduced Search Viterbi for MTC in LTE-Advanced Networks

    Machine Type Communication (MTC) is becoming an integral part of the Long Term Evolution-Advanced (LTE-A) cellular network. Challenges arise when some MTC devices, due to the nature of their applications, are deployed in low-signal locations. Per 3GPP requirements, MTC devices need up to 20 dB of additional coverage enhancement relative to an LTE category 1 UE. In previously reported work, repetition coding has been proposed as an effective technique to achieve the required coverage enhancement, at the cost of longer decoding time. In low-signal conditions, where many repetitions are required to build up the needed SNR, the decoding delay may be unacceptable. For an LTE-A MTC UE, Physical Broadcast CHannel (PBCH) decoding plays a very important role, and fast, efficient PBCH decoding helps improve device performance. In this paper, we propose to use a well-established technique, Reduced Search (RS) Viterbi decoding, to improve PBCH decoding performance without compromising time-to-decode. The RS Viterbi technique exploits a priori knowledge of transmitted bits to reduce the size and complexity of the trellis, which in turn reduces the probability of choosing an incorrect path, i.e., a decoding error. Simulations show up to 2.2 dB of SNR gain for RS Viterbi decoding over conventional Viterbi decoding, which will contribute to improving the sensitivity of MTC devices for better reachability.
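To make the idea concrete, below is a minimal, illustrative sketch of reduced-search Viterbi decoding: a soft-input decoder for the LTE convolutional code (constraint length 7, rate 1/3, generators 133/171/165 octal) in which a mask of a-priori-known bits prunes the impossible trellis branches. The zero initial state (LTE PBCH actually uses tail-biting), the LLR scaling, and the choice of which payload bits are known are simplifying assumptions for illustration, not the paper's exact scheme.

```python
import numpy as np

# LTE convolutional code: K = 7, rate 1/3, generators 133, 171, 165 (octal).
G = [0o133, 0o171, 0o165]
K = 7
NSTATES = 1 << (K - 1)

def conv_encode(bits):
    """Encode from the all-zero initial state (tail-biting omitted for brevity)."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state              # [b_t, b_{t-1}, ..., b_{t-6}]
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return np.array(out)

def viterbi(llr, n_bits, known=None):
    """Soft-input Viterbi. known[t] in {0,1} prunes branches at step t;
    any other value (e.g. -1) keeps both input hypotheses in the search."""
    metric = np.full(NSTATES, -np.inf)
    metric[0] = 0.0
    prev = np.zeros((n_bits, NSTATES), dtype=int)
    inp = np.zeros((n_bits, NSTATES), dtype=int)
    for t in range(n_bits):
        new = np.full(NSTATES, -np.inf)
        for s in range(NSTATES):
            if metric[s] == -np.inf:
                continue
            for b in (0, 1):
                if known is not None and known[t] in (0, 1) and b != known[t]:
                    continue                      # reduced search: drop the branch
                reg = (b << (K - 1)) | s
                ns = reg >> 1
                sym = np.array([bin(reg & g).count("1") & 1 for g in G])
                m = metric[s] + np.dot(llr[3*t:3*t+3], 1 - 2*sym)  # correlation metric
                if m > new[ns]:
                    new[ns], prev[t, ns], inp[t, ns] = m, s, b
        metric = new
    s = int(np.argmax(metric))                    # best final state (no termination)
    bits = []
    for t in range(n_bits - 1, -1, -1):
        bits.append(inp[t, s])
        s = prev[t, s]
    return np.array(bits[::-1])

# Example: 40-bit payload, last 8 bits assumed known (hypothetical a priori bits).
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 40)
bits[-8:] = 0
llr = (1 - 2*conv_encode(bits)) * 4.0 + rng.normal(0, 1, 120)  # noisy BPSK LLRs
known = np.full(40, -1)
known[-8:] = 0
print("bit errors:", np.sum(viterbi(llr, 40, known) != bits))
```

Pruning a branch removes an entire subtree of candidate paths at once, which both shrinks the search and eliminates error events that would otherwise compete with the correct path.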

    Enabling Technologies for Internet of Things: Licensed and Unlicensed Techniques

    The Internet of Things (IoT) is a novel paradigm that is shaping the evolution of the future Internet. According to the vision underlying the IoT, the next step in increasing the ubiquity of the Internet, after connecting people anytime and everywhere, is to connect inanimate objects. By providing objects with embedded communication capabilities and a common addressing scheme, a highly distributed and ubiquitous network of seamlessly connected heterogeneous devices is formed, which can be fully integrated into the current Internet and mobile networks, allowing for the development of new intelligent services available anytime, anywhere, to anyone and anything. This vision is also becoming known under the name Machine-to-Machine (M2M), which further emphasizes the absence of human interaction in the system dynamics. A massive number of wireless devices will have the ability to connect to the Internet through the IoT framework. With the accelerating pace at which such frameworks are reaching the market, new wireless communications standards are studying and proposing solutions to incorporate the services needed for the IoT. However, with an estimated 30 billion connected devices, current wireless technology faces many challenges. In our research, we address a variety of candidate technologies for enabling such a massive framework: we focus mainly on underlay cognitive radio networks as the unlicensed candidate for the IoT, and we also examine the efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks, surveying the new features and new user equipment categories added to the physical layer of LTE-A. In the first part of this thesis, we study the performance of a dual-hop cognitive radio network sharing the spectrum of a primary network in an underlay fashion. The cognitive network consists of a source, a destination, and multiple nodes employed as amplify-and-forward relays. To improve spectral efficiency, all relays are allowed to transmit to the destination simultaneously over the same frequency band. We present the optimal power allocation that maximizes the received signal-to-noise ratio (SNR) at the destination while satisfying the interference constraints of the primary network. The optimal power allocation is obtained through an eigen-solution of a channel-dependent matrix and is shown to transform the transmission over the non-orthogonal relays into parallel channels. Furthermore, since the secondary destination is equipped with multiple antennas, we propose an antenna selection scheme that selects the antenna with the highest SNR. To this end, we propose a clustering scheme to subgroup the available relays and use antenna selection at the receiver to extract the same diversity order. We show that random clustering causes the system to lose some of the available degrees of freedom, and we provide analytical expressions for the outage probability of the system under both random clustering and the proposed maximum-SNR clustering scheme with antenna selection. In addition, we adapt our design to increase the energy efficiency of the overall network without significant loss in data rate.
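The eigen-solution structure can be illustrated with a toy numerical sketch. The model below (Rayleigh-faded source-relay, relay-destination, and relay-primary gains; a binding interference constraint folded into the denominator so the received SNR becomes a generalized Rayleigh quotient) is an assumption made for illustration and is not the thesis's exact system model; in such formulations the maximizing gain vector is the principal generalized eigenvector of a channel-dependent matrix pair.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
R = 4                                    # number of AF relays (hypothetical)
sigma2, sigma_r2, I_max = 1.0, 0.1, 0.5  # dest. noise, relay noise, interference cap

# Hypothetical Rayleigh-faded gains; none of these values come from the thesis.
f = (rng.normal(size=R) + 1j*rng.normal(size=R)) / np.sqrt(2)  # source -> relay
g = (rng.normal(size=R) + 1j*rng.normal(size=R)) / np.sqrt(2)  # relay  -> destination
q = (rng.normal(size=R) + 1j*rng.normal(size=R)) / np.sqrt(2)  # relay  -> primary user

h = f * g                                # effective end-to-end channel per relay
A = np.outer(h, h.conj())                # numerator:  w^H A w = |h^H w|^2
D = sigma_r2 * np.diag(np.abs(g)**2)     # amplified relay noise seen at destination
Q = np.diag(np.abs(q)**2)                # interference coupling to the primary
# With the interference constraint binding (w^H Q w = I_max), the destination
# noise sigma2 can be folded into the denominator, making the SNR a generalized
# Rayleigh quotient w^H A w / w^H B w:
B = D + (sigma2 / I_max) * Q

_, vecs = eigh(A, B)                     # generalized eigen-decomposition
w = vecs[:, -1]                          # principal eigenvector maximizes the quotient
w *= np.sqrt(I_max / np.real(w.conj() @ Q @ w))   # scale: interference hits I_max

snr = abs(h.conj() @ w)**2 / (sigma2 + np.real(w.conj() @ D @ w))
print(f"received SNR with eigen-based allocation: {10*np.log10(snr):.2f} dB")
```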
In the second part of the thesis, we examine the efforts of the standardization bodies to accommodate the requirements of the IoT in current cellular networks. Specifically, we present the new features and new user equipment categories added to the physical layer of LTE-A, and we study some of the challenges LTE-A faces in dealing with Machine Type Communications (MTC). The MTC Physical Downlink Control CHannel (MPDCCH) is among the newly introduced LTE-A features; it carries the downlink control information (DCI) for MTC devices. Correctly decoding the MPDCCH depends mainly on the channel estimation used to compensate for channel errors during transmission, and the choice of estimation technique affects both the complexity and the performance of the user equipment. We propose and assess the performance of a simple channel estimation technique that relies in essence on Least Squares (LS) estimates of the pilot signals and linear interpolation, targeting the low-Doppler channels associated with MTC applications.
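As described, the estimator has two ingredients: per-pilot least-squares estimates and linear interpolation across the remaining subcarriers. A minimal sketch follows; the 72-subcarrier grid, the pilot spacing of six, and the pilot values are illustrative assumptions rather than the exact LTE-A MPDCCH DM-RS layout.

```python
import numpy as np

def ls_channel_estimate(rx_sym, pilot_idx, pilot_syms):
    """LS channel estimate at pilot subcarriers + linear interpolation.

    rx_sym:     received frequency-domain symbols (complex, length N)
    pilot_idx:  subcarrier indices carrying pilots (ascending)
    pilot_syms: known transmitted pilot symbols at those indices
    """
    h_ls = rx_sym[pilot_idx] / pilot_syms          # per-pilot LS estimate
    k = np.arange(len(rx_sym))
    # np.interp is real-valued, so interpolate I and Q separately;
    # beyond the outermost pilots it holds the edge value.
    return np.interp(k, pilot_idx, h_ls.real) + 1j*np.interp(k, pilot_idx, h_ls.imag)

# Toy check on a 72-subcarrier grid (six PRBs) with a pilot on every 6th subcarrier.
rng = np.random.default_rng(2)
N = 72
pilot_idx = np.arange(0, N, 6)
pilot_syms = np.full(len(pilot_idx), np.exp(1j*np.pi/4))   # hypothetical pilots
h_true = np.fft.fft(rng.normal(size=4) + 1j*rng.normal(size=4), N)  # smooth response
rx = h_true.copy()                       # pretend data subcarriers carry 1+0j
rx[pilot_idx] = h_true[pilot_idx] * pilot_syms
rx += 0.05*(rng.normal(size=N) + 1j*rng.normal(size=N))    # additive noise
h_hat = ls_channel_estimate(rx, pilot_idx, pilot_syms)
print("interpolation MSE:", np.mean(np.abs(h_hat - h_true)**2))
```

For the low-Doppler channels of MTC applications the response varies slowly across time and frequency, which is what makes this low-complexity linear interpolation a reasonable fit.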

    CELLULAR-ENABLED MACHINE TYPE COMMUNICATIONS: RECENT TECHNOLOGIES AND COGNITIVE RADIO APPROACHES

    The scarcity of bandwidth has always been the main obstacle to providing reliable, high-data-rate wireless links, which are in great demand for today's and near-future wireless applications. In addition, recent reports have shown inefficient usage and under-utilization of the available bandwidth. Cognitive radio (CR) has recently emerged as a promising solution for enhancing spectrum utilization, offering unlicensed users the ability to access the licensed spectrum opportunistically. Opportunistic spectrum access, the core concept of the interweave network model, improves overall spectrum utilization; it requires cognitive radio networks (CRNs) to treat spectrum sensing and monitoring as an essential enabling process. Machine-to-machine (M2M) communication, the basic enabler of the Internet of Things (IoT), has emerged as a key element of future networks. Machines are expected to communicate with each other, exchanging information and data without human intervention. The ultimate objective of M2M communications is to construct comprehensive connections among all machines distributed over an extensive coverage area. Due to the radical change in the number of users, the network has to utilize the available resources carefully in order to maintain reasonable quality of service (QoS). One of the most important resources in wireless communications is the frequency spectrum, and for utilizing it in an IoT environment the cognitive radio concept is arguably a viable solution from both cost and performance perspectives. Supporting a massive number of machines thus becomes possible by employing dual-mode base stations that apply the cognitive radio concept in addition to legacy licensed frequency assignment. In this thesis, we present a detailed review of the state of the art in spectrum sensing for CR communications, covering the latest advances in implementing legacy spectrum sensing approaches and the implementation challenges cognitive radios face in spectrum sensing and monitoring. We propose a novel algorithm to address the throughput reduction caused by scheduled spectrum sensing and monitoring. Further, we consider two new architectures that significantly reduce the power consumption the CR requires for wideband sensing; both rely on 1-bit quantization at the receiver side. The performance of both systems is investigated analytically and by simulation, along with their complexity and power consumption. Furthermore, we address the challenges expected from the next-generation M2M network as an integral part of the future IoT, chiefly the design of low-power, low-cost machines with reduced bandwidth; the trade-off between cost, feasibility, and performance is also discussed. Because of the relaxed frequency and spatial diversity, in addition to the extended coverage mode, initial synchronization and cell search pose new challenges for cellular-enabled M2M systems. We study conventional solutions, with their pros and cons, for timing acquisition, cell detection, and frequency offset estimation, and we provide a technique to enhance detection performance in harsh environments for LTE-based machines.
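Among the legacy sensing approaches reviewed, the energy detector is the simplest; the sketch below (an illustration of that legacy baseline, not the thesis's proposed algorithm) shows the constant-false-alarm-rate threshold obtained from a Gaussian approximation of the test statistic under the noise-only hypothesis, assuming the noise variance is known.

```python
import numpy as np
from scipy.stats import norm

def energy_detector(x, noise_var, p_fa=0.01):
    """Classical energy detector with a Gaussian-approximation threshold.

    For complex Gaussian noise, T = mean(|x|^2) has mean noise_var and
    variance noise_var^2 / N under H0, so the threshold below yields an
    (approximate) false-alarm probability p_fa.
    Returns (decision, test_statistic, threshold).
    """
    N = len(x)
    T = np.mean(np.abs(x)**2)
    thr = noise_var * (1 + norm.isf(p_fa) / np.sqrt(N))
    return T > thr, T, thr

# Toy check: H0 (noise only) vs H1 (noise + weak primary-user signal).
rng = np.random.default_rng(3)
N, s2 = 2048, 1.0
noise = np.sqrt(s2/2) * (rng.normal(size=N) + 1j*rng.normal(size=N))
pu = np.sqrt(0.2/2) * (rng.normal(size=N) + 1j*rng.normal(size=N))  # ~ -7 dB SNR
print("H0 declared busy?", energy_detector(noise, s2)[0])
print("H1 declared busy?", energy_detector(noise + pu, s2)[0])
```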
Furthermore, we present a frequency tracking algorithm for cellular M2M systems that exploits the new repetitive structure of the broadcast channel symbols in next-generation Long Term Evolution (LTE) systems. In the direction of narrowband IoT support, we propose a cell search and initial synchronization algorithm that utilizes the new set of narrowband synchronization signals. The proposed algorithms have been simulated at very low signal-to-noise ratios and in different fading environments.
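Repetition-based frequency tracking of this kind typically builds on a Moose-style correlator: a carrier frequency offset appears as a fixed phase rotation between two received copies of the same symbol. The sketch below is a generic illustration under assumed parameters (1.92 Msps sample rate, 128-point symbols, 200 Hz offset), not the thesis's algorithm.

```python
import numpy as np

def cfo_from_repetition(r1, r2, delta_t):
    """Estimate carrier frequency offset (Hz) from two repetitions of the
    same symbol received delta_t seconds apart (Moose-style correlator).
    Unambiguous range is |f| < 1 / (2 * delta_t)."""
    c = np.vdot(r1, r2)                     # sum of conj(r1) * r2
    return np.angle(c) / (2 * np.pi * delta_t)

# Toy check: one OFDM-like symbol repeated twice with a 200 Hz offset.
rng = np.random.default_rng(4)
fs, Nfft = 1.92e6, 128                       # hypothetical sample rate / FFT size
sym = np.fft.ifft(rng.choice([1+1j, 1-1j, -1+1j, -1-1j], Nfft))
delta_t = Nfft / fs                          # spacing between the repetitions
tx = np.concatenate([sym, sym])
n = np.arange(2 * Nfft)
rx = tx * np.exp(2j * np.pi * 200 * n / fs)  # apply the CFO
rx += 0.01 * (rng.normal(size=2*Nfft) + 1j*rng.normal(size=2*Nfft))
f_hat = cfo_from_repetition(rx[:Nfft], rx[Nfft:], delta_t)
print(f"estimated CFO: {f_hat:.1f} Hz")      # close to 200 Hz
```

Averaging such correlations over many repeated broadcast symbols is what turns this one-shot estimator into a tracker that still works at the very low SNRs targeted by coverage-enhanced M2M devices.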