
    Correlation-aware Resource Allocation in Multi-Cell Networks

    We propose a cross-layer strategy for resource allocation among spatially correlated sources in the uplink of multi-cell FDMA networks. Our objective is to find the optimal power and channel allocation for the sources so as to minimize the maximum distortion achieved by any source in the network. Since the network is multi-cell, inter-cell interference must also be taken into account. This resource allocation problem is NP-hard, and the optimal solution can only be found by exhaustive search over the entire solution space, which is computationally infeasible. We therefore propose a three-step method, performed separately by the scheduler in each cell, that finds a cross-layer resource allocation in simple steps. The three-step algorithm separates the problem into inter-cell resource management, grouping of sources for joint decoding, and intra-cell channel assignment. For each step we propose allocation methods that satisfy different design constraints. In the simulations we compare methods for each step of the algorithm, and we demonstrate the overall gain of correlation-aware resource allocation for a typical multi-cell network of Gaussian sources. We show that, while exploiting correlation in compression and joint decoding achieves a 25% reduction in distortion over independent decoding, this reduction grows to 37% when correlation is also exploited in the resource allocation method. This significant distortion reduction motivates further work on correlation-aware resource allocation. Overall, we find that our method achieves a 60% decrease in 5th-percentile distortion compared to independent methods.
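    To make the min-max objective concrete, the following is a minimal sketch of the exhaustive-search baseline the abstract says is infeasible at scale: for a tiny network it enumerates every one-to-one source-to-channel assignment and keeps the one whose worst per-source distortion is smallest. The distortion matrix here is invented for illustration; it is not from the paper, and the paper's actual three-step algorithm avoids this enumeration.

    ```python
    from itertools import permutations

    def minmax_assignment(distortion):
        """Exhaustively search one-to-one source->channel assignments,
        minimizing the maximum distortion of any source.
        distortion[s][c] = distortion of source s on channel c.
        Only feasible for tiny instances (n! assignments)."""
        n = len(distortion)
        best_assign, best_worst = None, float("inf")
        for perm in permutations(range(n)):
            worst = max(distortion[s][perm[s]] for s in range(n))
            if worst < best_worst:
                best_worst, best_assign = worst, perm
        return best_assign, best_worst

    # Hypothetical 3-source, 3-channel distortion values.
    D = [[0.9, 0.4, 0.7],
         [0.5, 0.8, 0.3],
         [0.6, 0.2, 0.9]]
    assignment, worst = minmax_assignment(D)
    ```

    Even at this size the search visits 3! = 6 assignments; the factorial growth is what motivates the paper's decomposition into three separate, per-cell steps.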

    Exploiting the ability of Self Organizing Networks for inter-cell interference coordination for emergency communications in cellular networks

    Title from PDF of title page, viewed on June 15, 2015. Thesis advisor: Cory Beard. Vita. Includes bibliographic references (pages 56-57). Thesis (M.S.)--School of Computing and Engineering, University of Missouri--Kansas City, 2014.
    Radio planning of wireless cellular networks and analysis of radio performance must be agile, because in the near future there are expected to be as many mobile devices as people in the world. Rapid advances in technology are therefore needed to manage resources and maximize throughput so that users are served effectively. LTE and LTE-Advanced are designed to meet high-bit-rate service requirements; however, the fundamental constraints of the wireless channel, such as limited spectrum, lead to frequency reuse and hence to unavoidable interference. This thesis gives a holistic overview of interference coordination in LTE cellular systems that exploits the capabilities of Self Organizing Networks (SON). LTE uses a universal frequency reuse concept, so the only interference observed in LTE is inter-cell interference. In a network where users are randomly distributed over three cells, resources are managed between the base stations by restricting some resource blocks to Cell Edge Users (CEUs) of the neighboring cell and other resource blocks to Cell Center Users (CCUs). This is done in a semi-static manner, taking into account the location of each user and varying channel conditions. Cell edge users and cell center users are distinguished by their SINR level. Resource management is regulated according to user requirements and coordinated among the neighboring cells. The results are simulated in two different environments, normal traffic and an emergency condition, to show performance in exigent circumstances, and the throughput of the CCUs and CEUs under normal traffic is compared.
    The approach and results are shown to be highly reliable.
    Introduction -- Background -- Our work -- MATLAB code implementation -- Results and analysis -- Conclusion and future scope
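    The CEU/CCU split described above can be sketched in a few lines: classify users by an SINR threshold, then reserve a fraction of the resource blocks for the edge users. The threshold (3 dB) and edge share (30%) below are illustrative assumptions, not values from the thesis, which determines them semi-statically from user location and channel conditions.

    ```python
    def partition_users(sinr_db, rb_total, edge_threshold_db=3.0, edge_share=0.3):
        """Split users into cell-edge (CEU) and cell-center (CCU) sets by an
        SINR threshold, then reserve a fraction of resource blocks for CEUs.
        Threshold and share are illustrative, not from the thesis."""
        ceu = [u for u, s in sinr_db.items() if s < edge_threshold_db]
        ccu = [u for u, s in sinr_db.items() if s >= edge_threshold_db]
        rb_edge = int(rb_total * edge_share)      # RBs protected for CEUs
        return ceu, ccu, rb_edge, rb_total - rb_edge

    # Hypothetical per-user SINR measurements in dB.
    sinr = {"u1": 1.2, "u2": 9.5, "u3": -0.4, "u4": 6.1}
    ceu, ccu, rb_edge, rb_center = partition_users(sinr, rb_total=50)
    ```

    In the coordinated scheme, neighboring cells would avoid scheduling their own CCUs on the RBs reserved for another cell's CEUs, which is what suppresses the inter-cell interference at the cell edge.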

    Radio Resource Virtualization in Cellular Networks

    Virtualization of wireless networks holds the promise of major gains in resource usage efficiency through spectrum/radio resource sharing between multiple service providers (SPs). Radio resources, however, are not a simple orthogonal resource such as time slots on a wire: the quantity shared is a function of geography and signal strength rather than of orthogonal slices. To better exploit radio resource usage, we propose a novel scheme, radio resource virtualization (RRV), that allows SPs to access overlapping spectrum slices in both time and space, taking into account the transmit power, the interference, and the usage scenario (capabilities and needs of devices). We first investigate the system capacity of a simple two-cell network and show that RRV often leads to better efficiency than the well-known separate spectrum virtualization (SSV) scheme. However, the use of RRV requires careful air-interface configuration owing to interference in the overlapping slices of spectrum. We therefore examine scenarios of a multi-cell network with fractional frequency reuse (FFR), implementing five radio resource configuration cases. The capacity data obtained from simulations show that a variety of tradeoffs exist between SPs when RRV is applied; in one example, the capacity of the SP operating smaller cells almost doubles, while the per-subscriber capacity of the SP deployed in larger cells may drop by 20%. Based on these tradeoffs, we suggest configuration maps in which a network resource manager can locate specific configurations according to the demand and capabilities of SPs and their subscribers. Finally, we consider a case study on top of LTE. A system-level simulator is developed following 3GPP standards and extensive simulations are conducted. We propose and test three schemes that integrate RRV into LTE radio resource management (RRM): unconditional RRV, time-domain muting (TDM) RRV, and major-interferer time-domain muting (MI-TDM) RRV.
    Along the same lines as the capacity analysis, we compare these schemes with traditional SSV and suggest configuration maps based on the resulting tradeoffs. Our investigation of RRV provides a framework for evaluating resource efficiency, and potentially the customization and isolation capabilities of spectrum sharing, in virtualized cellular networks.
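    The core SSV-versus-RRV tradeoff can be illustrated with a back-of-the-envelope Shannon-capacity comparison: under SSV each SP gets half the band interference-free, while under RRV each SP uses the full band but sees cross-slice interference. All numbers below (bandwidth, powers, interference level) are invented for illustration and are not the paper's simulation parameters.

    ```python
    import math

    def capacity(bw_hz, signal_w, interference_w, noise_w):
        """Shannon capacity in bit/s, treating interference as noise."""
        return bw_hz * math.log2(1 + signal_w / (interference_w + noise_w))

    # Toy link budget for one SP (illustrative values only).
    BW, S, N = 10e6, 1e-9, 1e-12       # 10 MHz band, received signal, noise

    # SSV: the SP gets half the band, with no cross-SP interference.
    ssv = capacity(BW / 2, S, 0.0, N)

    # RRV: the SP uses the full band but receives cross-slice interference I.
    I = 2e-11
    rrv = capacity(BW, S, I, N)
    ```

    With these numbers the doubled bandwidth outweighs the interference penalty, so RRV comes out ahead; raise `I` enough and SSV wins, which is exactly why the paper finds the answer configuration-dependent and proposes configuration maps.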

    A Study about Heterogeneous Network Issues Management based on Enhanced Inter-cell Interference Coordination and Machine Learning Algorithms

    Given fast-growing demand for mobile data, Heterogeneous Networks (HetNets) have been considered one of the key technologies for meeting the 1000x mobile data challenge of the coming decade. Although the unique multi-tier topology of HetNets achieves high spectrum efficiency and enhanced Quality of Service (QoS), it also brings a series of critical issues. In this thesis, we investigate the causes of HetNets challenges and survey state-of-the-art techniques for three major issues: interference, offloading, and handover. The first issue addressed in the thesis is cross-tier interference in HetNets. We introduce Almost Blank Subframes (ABS) to free small-cell UEs from cross-tier interference, the key technique of enhanced Inter-Cell Interference Coordination (eICIC). A Nash Bargaining Solution (NBS) is applied to optimize the ABS ratio and the UE partition. Furthermore, we propose a power-based multi-layer NBS algorithm to obtain the optimal parameters of Further-enhanced Inter-Cell Interference Coordination (FeICIC), which significantly improves macrocell efficiency compared to eICIC. This algorithm not only introduces a dynamic power ratio but also defines an opportunity cost for each layer instead of the conventional zero-cost partial fairness. Simulation results show that the proposed algorithm may achieve up to 31.4% user throughput gain over eICIC and fixed-power-ratio FeICIC. The second issue addressed is the offloading problem of HetNets, comprising (1) UE offloading from the macro cell and (2) small-cell backhaul offloading. For the first aspect, we discuss the ability of machine learning algorithms to tackle this challenge and propose the User-Based K-means Algorithm (UBKCA). The proposed algorithm establishes a closed-loop self-organization system on our HetNets scenario that maintains a desired offloading factor of 50%, with a cell-edge user factor of 17.5% and a CRE bias of 8 dB.
    For the second aspect, we further apply a machine-learning clustering method to establish a cache system, which achieves up to a 70.27% hit ratio and reduces request latency by 60.21% in a YouTube scenario. K-Nearest Neighbours (KNN) is then applied to predict new users' content preferences and confirm the cache system's suitability. We also propose a system that predicts users' content preferences even when the collected data are incomplete. The third part focuses on handover during UE offloading within HetNets. It discusses in detail the positive effect of CRE on mitigating ping-pong handover during offloading, and its negative effect of increasing cross-tier interference. A modified Markov Chain Process is then established to map the handover phases for a UE offloading from the macro cell to a small cell and vice versa. The transition probabilities of the MCP account for both effects of CRE so that the optimal CRE value for the HetNets can be obtained; for our scenario the result is 7 dB. The combination of CRE and Handover Margin is also discussed.
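    To illustrate the clustering step that underlies the offloading work above, here is a plain 2-D k-means on user positions. It is a generic stand-in for the thesis's User-Based K-means Algorithm (UBKCA), not its actual code: the initialisation, features, and data are all illustrative assumptions.

    ```python
    def kmeans(points, k, iters=10):
        """Plain 2-D k-means with deterministic initialisation (first k
        points as seeds). A generic sketch, not the thesis's UBKCA."""
        centres = list(points[:k])
        for _ in range(iters):
            # Assign each point to its nearest centre.
            clusters = [[] for _ in range(k)]
            for p in points:
                i = min(range(k),
                        key=lambda c: (p[0] - centres[c][0]) ** 2
                                      + (p[1] - centres[c][1]) ** 2)
                clusters[i].append(p)
            # Recompute centroids (keep old centre if a cluster empties).
            centres = [(sum(p[0] for p in cl) / len(cl),
                        sum(p[1] for p in cl) / len(cl)) if cl else centres[i]
                       for i, cl in enumerate(clusters)]
        return centres, clusters

    # Hypothetical user positions: one group near the macro cell,
    # one group near a small cell.
    users = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
             (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
    centres, clusters = kmeans(users, k=2)
    ```

    In an UBKCA-style closed loop, the resulting clusters would feed back into the offloading decision (e.g. which users a small cell with a given CRE bias should absorb), with the loop adjusting parameters until the target offloading factor is held.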