122 research outputs found

    A Distributed Approach to Interference Alignment in OFDM-based Two-tiered Networks

    Full text link
    In this contribution, we consider a two-tiered network and focus on the coexistence between the two tiers at the physical layer. We target our efforts on a Long Term Evolution-Advanced (LTE-A) orthogonal frequency division multiple access (OFDMA) macro-cell sharing the spectrum with a randomly deployed second tier of small-cells. In such networks, high levels of co-channel interference between the macro and small base stations (MBS/SBS) may largely limit the potential spectral efficiency gains provided by frequency reuse 1. To address this issue, we propose a novel cognitive interference-alignment-based scheme to protect the macro-cell from cross-tier interference, while mitigating the co-tier interference in the second tier. Remarkably, only local channel state information (CSI) and autonomous operations are required in the second tier, resulting in a completely self-organizing approach for the SBSs. The optimal precoder that maximizes the spectral efficiency of the link between each SBS and its served user equipment is found by means of a distributed one-shot strategy. Numerical findings reveal non-negligible spectral efficiency enhancements with respect to traditional time division multiple access approaches in any signal-to-noise ratio (SNR) regime. Additionally, the proposed technique exhibits significant robustness to channel estimation errors, achieving remarkable results for the imperfect CSI case and yielding consistent performance enhancements to the network.
    Comment: 15 pages, 10 figures, accepted and to appear in IEEE Transactions on Vehicular Technology Special Section: Self-Organizing Radio Networks, 2013. Authors' final version. Copyright transferred to IEE
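The cross-tier protection idea can be illustrated with a toy two-antenna zero-forcing precoder (an illustrative simplification, not the paper's spectral-efficiency-maximizing design): the SBS steers its signal into the null space of the channel toward the protected macro receiver.

```python
def null_precoder(h):
    """Toy zero-forcing precoder for a 2-antenna SBS: pick a unit
    vector w orthogonal to the cross-tier channel h = [h1, h2], so
    the small cell's transmission causes no interference at the
    macro receiver. Purely illustrative of the alignment idea."""
    h1, h2 = h
    w = [-h2, h1]  # h1*(-h2) + h2*h1 = 0, i.e. w lies in h's null space
    norm = (abs(w[0]) ** 2 + abs(w[1]) ** 2) ** 0.5
    return [c / norm for c in w]
```

The actual scheme additionally optimizes the precoder for the SBS's own link, which this sketch omits.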

    Novel feedback and signalling mechanisms for interference management and efficient modulation

    Get PDF
    In order to meet the ever-growing demand for mobile data, a number of different technologies have been adopted by the fourth generation standardization bodies. These include multiple access schemes such as spatial division multiple access (SDMA), and efficient modulation techniques such as orthogonal frequency division multiplexing (OFDM)-based modulation. The specific objectives of this thesis are to develop an effective feedback method for interference management in smart antenna SDMA systems and to design an efficient OFDM-based modulation technique, where an additional dimension is added to conventional two-dimensional modulation techniques such as quadrature amplitude modulation (QAM). In SDMA time division duplex (TDD) systems, where channel reciprocity is maintained, the uplink (UL) channel sounding method is considered one of the most promising feedback methods due to its bandwidth and delay efficiency. Conventional channel sounding (CCS) only conveys the channel state information (CSI) of each active user to the base station (BS). Because co-channel interference (CCI) from adjacent cells limits system performance in interference-limited scenarios, CSI alone is a suboptimal metric for multiuser spatial multiplexing optimization. The first major contribution of this thesis is a novel interference feedback method that provides the BS with implicit knowledge about the interference level received by each mobile station (MS). More specifically, it is proposed to weight the conventional channel sounding pilots by the level of the experienced interference at the user's side. Interference-weighted channel sounding (IWCS) acts as a spectrally efficient feedback technique that provides the BS with implicit knowledge about the CCI experienced by each MS, and significantly improves the downlink (DL) sum capacity for both greedy and fair scheduling policies.
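The pilot-weighting idea can be sketched as follows. The abstract does not give the exact weighting function, so the inverse-square-root scaling below is purely illustrative: it makes the magnitude of the pilot the BS receives shrink as the MS's measured interference grows, so the effective channel metric implicitly encodes CCI.

```python
def iwcs_pilot(base_pilot, interference_power):
    """Hypothetical interference-weighted sounding pilot: scale the
    conventional pilot by 1/sqrt(I), so the power of the effective
    channel seen at the BS behaves like |h|^2 / I, a SINR-like
    metric. The real IWCS weighting may differ."""
    return base_pilot / (interference_power ** 0.5)
```

With this choice, two users with equal channel gains but different interference levels present different effective gains to the scheduler, which is the behaviour the feedback method relies on.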
For the sake of completeness, a novel procedure is developed to make the IWCS pilots usable for UL optimization. It is proposed to divide the optimization metric obtained from the IWCS pilots by the interference experienced at the BS's antennas. The resultant new metric, the channel gain divided by the product of the DL and UL interference, provides link-protection awareness and is used to optimize both UL and DL. Using a maximum-capacity scheduling criterion, the link-protection-aware metric results in a gain in the median system sum capacity of 26.7% and 12.5% in DL and UL respectively, compared to the case when conventional channel sounding techniques are used. Moreover, a heuristic algorithm is proposed to facilitate practical optimization and reduce computational complexity. The second major contribution of this thesis is an innovative transmission approach, referred to as subcarrier-index modulation (SIM), which is proposed to be integrated with OFDM. The key idea of SIM is to employ the subcarrier index to convey information to the receiver. Furthermore, a closed-form analytical bit error ratio (BER) expression for SIM-OFDM in a Rayleigh channel is derived. Simulation results show a BER performance gain of 4 dB over 4-QAM OFDM for both coded and uncoded data without a power-saving policy. Alternatively, the power-saving policy maintains an average gain of 1 dB while using only half of the OFDM symbol transmit power.
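The key idea of SIM, carrying bits in the subcarrier index itself, can be shown with a toy mapper (a deliberate simplification of SIM-OFDM: one active subcarrier per group, 4-QAM on it; the actual scheme's framing differs).

```python
import math

def sim_map(bits, n_subcarriers=4):
    """Toy subcarrier-index modulation: the first log2(N) bits select
    which of the N subcarriers is active; the next two bits pick the
    4-QAM symbol carried on it. Returns the frequency-domain frame."""
    idx_bits = int(math.log2(n_subcarriers))
    index = int(bits[:idx_bits], 2)          # bits conveyed by position
    qam = {'00': 1 + 1j, '01': -1 + 1j, '11': -1 - 1j, '10': 1 - 1j}
    symbol = qam[bits[idx_bits:idx_bits + 2]]  # bits conveyed by symbol
    frame = [0j] * n_subcarriers
    frame[index] = symbol
    return frame
```

Note that half the subcarriers (here, all but one) stay silent, which is where the power-saving policy mentioned above comes from: the index bits cost no transmit power.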

    Channel Estimation in Uplink of Long Term Evolution

    Get PDF
    Long Term Evolution (LTE) is considered the fastest-spreading communication standard in the world. To live up to the ever-increasing demand for higher data rates and richer multimedia services, the existing UMTS system was upgraded to LTE. To meet these requirements, novel technologies are employed in the downlink as well as the uplink, namely Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA). For the receiver to perform properly, it should be able to recover the transmitted data accurately, and this is done through channel estimation. Channel estimation in LTE involves coherent detection, where prior knowledge of the channel, often known as Channel State Information (CSI), is required. This thesis aims at studying the channel estimation methods used in LTE and evaluating their performance in various multipath models specified by the ITU, such as Pedestrian and Vehicular. The most commonly used channel estimation algorithms are the Least Squares (LS) and Minimum Mean Square Error (MMSE) algorithms. The performance of these estimators is evaluated in both the uplink and the downlink in terms of the Bit Error Rate (BER). It was evaluated for OFDMA and then for SC-FDMA; further, the performance was assessed in SC-FDMA first without subcarrier mapping and then with subcarrier mapping schemes, namely Interleaved SC-FDMA (IFDMA) and Localized SC-FDMA (LFDMA). The results show that the MMSE estimator performs better than the LS estimator in both environments, and that IFDMA has a lower PAPR than LFDMA while LFDMA has better BER performance
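The two estimators compared in the thesis can be sketched per pilot subcarrier. LS simply inverts the known pilot symbol; MMSE additionally shrinks the estimate according to the SNR (a simplified scalar form; the full LTE MMSE estimator also exploits channel correlation across subcarriers, omitted here).

```python
def ls_estimate(y, x):
    """Least-squares channel estimate on one pilot subcarrier:
    y = h*x + n, so h_ls = y / x (amplifies noise at low SNR)."""
    return y / x

def mmse_estimate(y, x, snr):
    """Simplified scalar MMSE estimate: scale the LS estimate by
    snr/(snr+1), trading a small bias for reduced noise variance.
    Assumes unit channel power; a sketch, not the full LTE estimator."""
    return (y / x) * snr / (snr + 1.0)
```

At high SNR the MMSE estimate converges to the LS one; at low SNR the shrinkage is what gives MMSE its BER advantage reported above.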

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Get PDF
    Soaring capacity and coverage demands dictate that future cellular networks need to soon migrate towards ultra-dense networks. However, network densification comes with a host of challenges that include compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks, i.e., tight coupling between the control and data planes regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm that has the potential to address most of the aforementioned challenges. In this article, we review various proposals that have been presented in the literature so far to enable SARC. More specifically, we analyze how and to what degree various SARC proposals address the four main challenges in network densification, namely: energy efficiency, system-level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted in legacy networks at wide scale and thus remain a hallmark of 5G, i.e., coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can particularly act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves as both a tutorial as well as an up-to-date survey on SARC, CoMP and D2D. Most importantly, the article provides an extensive outlook of challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Efficient and Virtualized Scheduling for OFDM-Based High Mobility Wireless Communications Objects

    Get PDF
    Service providers (SPs) in Long Term Evolution (LTE) systems face many challenges in accommodating the rapid expansion of mobile data usage. Modern technologies present new challenges to SPs, for example, reducing capital and operating expenditures while supporting high data throughput per customer, extending the battery life per charge of cell phone devices, and supporting high-mobility communications with a fast and seamless handover (HO) networking architecture. In this thesis, a variety of optimized techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into three parts. The first part outlines the benefits and challenges of deploying the virtualized resource-sharing concept, wherein SPs, each applying a different scheduling policy, share an evolved Node B, allowing SPs to customize their service offerings and meet service requirements; this is a promising solution for reducing operational and capital expenditures, leading to potential energy savings, and supporting higher peak rates. The second part formulates the optimized power allocation problem in a virtualized scheme in LTE uplink systems, aiming to extend the mobile devices' battery utilization time per charge. The third part presents a proposed hybrid-HO (HY-HO) technique that can enhance the system performance in terms of latency and HO reliability at cell boundaries for high-mobility objects (up to 350 km/h, where HOs occur more frequently). The main contributions of this thesis are in designing optimal binary integer programming-based and suboptimal heuristic (with reduced complexity) scheduling algorithms subject to exclusive and contiguous allocation, maximum transmission power, and rate constraints.
Moreover, the HY-HO design, based on the combination of soft and hard HO, was able to enhance the system performance in terms of latency, interruption time and reliability during HO. The results prove that the proposed solutions contribute effectively to addressing the challenges caused by the demand for high data rates and transmission power in mobile networks, especially in virtualized resource-sharing scenarios that can support high data rates while improving quality of service (QoS).
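The exclusive and contiguous allocation constraint named above (an SC-FDMA uplink requirement: each user's resource blocks must form one unbroken run, and no block is shared) can be checked with a short sketch; the representation of an allocation as a list mapping RB index to user is an assumption for illustration.

```python
def is_contiguous_exclusive(alloc):
    """Check the LTE-uplink scheduling constraint: in alloc (a list
    mapping each resource block to the user holding it, e.g.
    [0, 0, 1, 1, 2]), every user's blocks must appear as exactly one
    contiguous run. Exclusivity holds by construction of the list."""
    finished = set()   # users whose run has already ended
    prev = None
    for user in alloc:
        if user != prev:
            if user in finished:
                return False       # user re-appears after a gap
            if prev is not None:
                finished.add(prev)
            prev = user
    return True
```

A feasibility check like this is the inner loop of both the optimal integer-programming formulation and the heuristic schedulers the thesis describes.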

    Dynamic resource allocation algorithms for long term evolution (LTE) wireless broadband networks

    Get PDF
    Following the successful standardization of High-Speed Packet Access (HSPA), the 3rd Generation Partnership Project (3GPP) recently specified Long Term Evolution (LTE) as a next-generation radio network technology to meet the increasing performance requirements of mobile broadband. The results include a flexible and spectrally efficient radio link protocol design with low overhead. The first release of LTE provides peak rates of 300 Mbps in the downlink and 75 Mbps in the uplink. This is a significant increase in spectrum efficiency compared to previous cellular systems. Single-Carrier Frequency Division Multiple Access (SC-FDMA) has been selected as the uplink access scheme in LTE. With SC-FDMA, the frequency spectrum resource is divided into time-frequency grids, referred to as resource blocks (RBs). Multiple access is achieved by distributing resource blocks to users. The function of resource block allocation algorithms is to distribute resource blocks among users in a fair and efficient manner. The modulation and coding scheme is determined adaptively according to the time-varying channel conditions. Sounding Reference Signals (SRS) are transmitted in the uplink direction to allow the base station to estimate the uplink channel quality at different frequencies. The LTE system supports wideband SRS and narrowband SRS. We have developed an in-house simulation program in C++ to evaluate and compare the performance of the proposed CASA and ICAS algorithms in terms of packet loss ratio, delay, and throughput. Simulation results show that the proposed algorithms are able to satisfy the QoS requirements. Both proposed algorithms support multiple CoSs simultaneously without impeding the first class (Expedited Forwarding) transmission. Also, both proposed algorithms achieve high throughput over a large cell range
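The general task of "distributing resource blocks among users in a fair and efficient manner" can be sketched with a generic proportional-fair style rule (this is not the thesis's CASA or ICAS algorithm, whose details the abstract does not give): each RB goes to the user with the best ratio of instantaneous achievable rate to average throughput.

```python
def greedy_rb_allocation(rates, avg_tput):
    """Generic proportional-fair RB allocation sketch. rates[u][rb]
    is user u's achievable rate on resource block rb; avg_tput[u] is
    u's running average throughput. Returns, per RB, the index of the
    winning user. High avg_tput lowers a user's priority => fairness."""
    n_users, n_rbs = len(rates), len(rates[0])
    return [max(range(n_users), key=lambda u: rates[u][rb] / avg_tput[u])
            for rb in range(n_rbs)]
```

Note this per-RB greedy rule ignores the SC-FDMA contiguity constraint on the uplink; real LTE uplink schedulers must allocate each user a contiguous block of RBs.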

    Millimetre wave frequency band as a candidate spectrum for 5G network architecture : a survey

    Get PDF
    In order to meet the huge growth in global mobile data traffic in 2020 and beyond, the development of the 5th Generation (5G) system is required, as the current 4G system is expected to fall short of the provision needed for such growth. 5G is anticipated to use a higher carrier frequency in the millimetre wave (mm-wave) band, within the 20 to 90 GHz range, due to the availability of a vast amount of unexploited bandwidth. Using these bands is a revolutionary step, given their different propagation characteristics, severe atmospheric attenuation, and hardware constraints. In this paper, we carry out a survey of 5G research contributions and proposed design architectures based on mm-wave communications. We present and discuss the use of mm-wave as indoor and outdoor mobile access, as a wireless backhaul solution, and as a key enabler for higher-order sectorisation. Wireless standards such as IEEE 802.11ad, which operate in the mm-wave band, are presented. These standards have been designed for short-range, ultra-high-throughput systems in the 60 GHz band. Furthermore, this survey provides new insights regarding relevant and open issues in adopting mm-wave for 5G networks. This includes the increased handoff rate and interference in Ultra-Dense Networks (UDNs), waveform considerations with higher spectral efficiency, and supporting spatial multiplexing in mm-wave line-of-sight conditions. This survey also introduces a distributed base station architecture in mm-wave as an approach to address the increased handoff rate in UDNs, and to provide an alternative way for network densification in a time- and cost-effective manner

    A Review of MAC Scheduling Algorithms in LTE System

    Get PDF
    Recent wireless communication networks rely on the new technology named Long Term Evolution (LTE) to offer high-data-rate real-time (RT) traffic with better Quality of Service (QoS) for the increasing demands of customers. LTE provides low latency for real-time services with high throughput, with the help of two-level packet retransmission. Hybrid Automatic Repeat Request (HARQ) retransmission at the Medium Access Control (MAC) layer of LTE networks achieves error-free data transmission. The performance of LTE networks mainly depends on how effectively this HARQ mechanism, inherited from the Universal Mobile Telecommunication System (UMTS), is adopted in the latest communication standard. The major challenge in LTE is to balance QoS and fairness among the users. Hence, it is essential to design a downlink scheduling scheme that delivers the expected service quality to customers and utilizes the system resources efficiently. This paper provides a comprehensive literature review of the LTE MAC layer and six types of QoS/channel-aware downlink scheduling algorithms designed for this purpose. The contributions of this paper are to identify the gap of knowledge in the downlink scheduling procedure and to point out future research directions. Based on the comparative study of the algorithms taken for the review, this paper concludes that the EXP Rule scheduler is the most suitable for LTE networks due to its low Packet Loss Ratio (PLR), low Packet Delay (PD), high throughput, fairness and spectral efficiency
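The EXP Rule metric favoured by the review can be sketched in one common form (variants exist in the literature; the delay weight `a` below is an illustrative parameter): a user's priority grows exponentially as its head-of-line delay exceeds the average delay, scaled by its achievable rate.

```python
import math

def exp_rule_metric(rate, hol_delay, all_delays, a=0.5):
    """One common form of the EXP Rule scheduling metric:
    rate * exp((a*W_i - avg) / (1 + sqrt(avg))), where W_i is the
    user's head-of-line delay and avg is the mean weighted delay.
    Users whose packets wait longer than average get boosted sharply,
    which is why the rule achieves low PLR and PD for RT traffic."""
    avg = a * sum(all_delays) / len(all_delays)
    return rate * math.exp((a * hol_delay - avg) / (1.0 + math.sqrt(avg)))
```

At each TTI the scheduler would compute this metric for every user on every RB and serve the maximum, so delay urgency and channel quality are traded off in a single number.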