1,226 research outputs found

    A robust machine learning method for cell-load approximation in wireless networks

    We propose a learning algorithm for cell-load approximation in wireless networks. The proposed algorithm is robust in the sense that it is designed to cope with the uncertainty arising from a small number of training samples. This scenario is highly relevant in wireless networks, where training has to be performed on short time scales because of a fast time-varying communication environment. The first part of this work studies the set of feasible rates and shows that this set is compact. We then prove that the mapping relating a feasible rate vector to the unique fixed point of the non-linear cell-load mapping is monotone and uniformly continuous. Utilizing these properties, we apply an approximation framework that achieves the best worst-case performance. Furthermore, the approximation preserves the monotonicity and continuity properties. Simulations show that the proposed method exhibits better robustness and accuracy for small training sets in comparison with standard approximation techniques for multivariate data.
    Comment: Shorter version accepted at ICASSP 201
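The monotone cell-load fixed point described in this abstract can be illustrated with a small self-contained sketch. This is a toy model under assumed parameters: the gain matrix, noise power, bandwidth, and traffic demands below are illustrative and are not taken from the paper.

```python
import math

def cell_load_map(rho, demand, gain, noise=1e-9, bandwidth=1e7):
    """One application of a non-linear cell-load mapping: cell i's load is
    the fraction of resources needed to serve its traffic demand, where the
    interference from cell j is scaled by j's current load rho[j]."""
    n = len(rho)
    new_rho = []
    for i in range(n):
        interference = noise + sum(rho[j] * gain[i][j]
                                   for j in range(n) if j != i)
        sinr = gain[i][i] / interference
        rate = bandwidth * math.log2(1.0 + sinr)    # rate on the full band
        new_rho.append(min(demand[i] / rate, 1.0))  # load capped at 1
    return new_rho

def fixed_point(demand, gain, iters=200):
    """Iterate the mapping from the all-zero load vector."""
    rho = [0.0] * len(demand)
    for _ in range(iters):
        rho = cell_load_map(rho, demand, gain)
    return rho
```

Because the mapping is monotone, repeated application starting from the all-zero load vector converges upward to the unique fixed point whenever the demand is feasible.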

    Separation Framework: An Enabler for Cooperative and D2D Communication for Future 5G Networks

    Soaring capacity and coverage demands dictate that future cellular networks will need to migrate soon towards ultra-dense networks. However, network densification comes with a host of challenges, including compromised energy efficiency, complex interference management, cumbersome mobility management, burdensome signaling overheads and higher backhaul costs. Interestingly, most of the problems that beleaguer network densification stem from one common feature of legacy networks: the tight coupling between the control and data planes, regardless of their degree of heterogeneity and cell density. Consequently, in the wake of 5G, the control and data planes separation architecture (SARC) has recently been conceived as a promising paradigm with the potential to address most of the aforementioned challenges. In this article, we review the various proposals presented in the literature so far to enable SARC. More specifically, we analyze how, and to what degree, various SARC proposals address the four main challenges in network densification, namely: energy efficiency, system-level capacity maximization, interference management and mobility management. We then focus on two salient features of future cellular networks that have not yet been adopted at wide scale in legacy networks and thus remain a hallmark of 5G: coordinated multipoint (CoMP) and device-to-device (D2D) communications. After providing the necessary background on CoMP and D2D, we analyze how SARC can act as a major enabler for CoMP and D2D in the context of 5G. This article thus serves both as a tutorial and as an up-to-date survey of SARC, CoMP and D2D. Most importantly, it provides an extensive outlook on the challenges and opportunities that lie at the crossroads of these three mutually entangled emerging technologies.
    Comment: 28 pages, 11 figures, IEEE Communications Surveys & Tutorials 201

    Traffic-Driven Spectrum Allocation in Heterogeneous Networks

    Next generation cellular networks will be heterogeneous with dense deployment of small cells in order to deliver high data rate per unit area. Traffic variations are more pronounced in a small cell, which in turn lead to more dynamic interference to other cells. It is crucial to adapt radio resource management to traffic conditions in such a heterogeneous network (HetNet). This paper studies the optimization of spectrum allocation in HetNets on a relatively slow timescale based on average traffic and channel conditions (typically over seconds or minutes). Specifically, in a cluster with n base transceiver stations (BTSs), the optimal partition of the spectrum into 2^n segments is determined, corresponding to all possible spectrum reuse patterns in the downlink. Each BTS's traffic is modeled using a queue with Poisson arrivals, the service rate of which is a linear function of the combined bandwidth of all assigned spectrum segments. With the system average packet sojourn time as the objective, a convex optimization problem is first formulated, where it is shown that the optimal allocation divides the spectrum into at most n segments. A second, refined model is then proposed to address queue interactions due to interference, where the corresponding optimal allocation problem admits an efficient suboptimal solution. Both allocation schemes attain the entire throughput region of a given network. Simulation results show the two schemes perform similarly in the heavy-traffic regime, in which case they significantly outperform both the orthogonal allocation and the full-frequency-reuse allocation. The refined allocation shows the best performance under all traffic conditions.
    Comment: 13 pages, 11 figures, accepted for publication by JSAC-HC
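As a concrete illustration of the partitioning idea, the following sketch grid-searches the spectrum split for a two-BTS cluster, using M/M/1 mean sojourn times and a crude interference discount eta on the shared segment. All parameters and the discount model are illustrative assumptions, not the paper's formulation, which solves a convex program over all 2^n reuse patterns.

```python
def mean_sojourn(lam, mu):
    """M/M/1 mean sojourn time 1/(mu - lam); infinite if the queue is unstable."""
    return 1.0 / (mu - lam) if mu > lam else float("inf")

def best_partition(lam, cap, eta=0.5, step=0.01):
    """Grid search over spectrum fractions (w1, w2, ws) with w1 + w2 + ws = 1
    for two BTSs.  BTS i serves at rate cap[i] * (w_i + eta * ws): its
    exclusive segment plus the shared segment, which is discounted by the
    factor eta to model inter-cell interference on the reused band."""
    best_t, best_w = float("inf"), None
    n = int(round(1.0 / step))
    for a in range(n + 1):
        for b in range(n + 1 - a):
            w1, w2 = a * step, b * step
            ws = 1.0 - w1 - w2
            mu = [cap[0] * (w1 + eta * ws), cap[1] * (w2 + eta * ws)]
            # traffic-weighted average packet sojourn time
            t = (lam[0] * mean_sojourn(lam[0], mu[0]) +
                 lam[1] * mean_sojourn(lam[1], mu[1])) / (lam[0] + lam[1])
            if t < best_t:
                best_t, best_w = t, (w1, w2, ws)
    return best_t, best_w
```

With asymmetric traffic (e.g. lam = [1.0, 0.2]), the search typically beats both full reuse (ws = 1) and any purely orthogonal split, mirroring the abstract's comparison.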

    Traffic Driven Resource Allocation in Heterogenous Wireless Networks

    Most work on wireless network resource allocation uses physical-layer performance metrics, such as sum rate and outage probability, as the figure of merit. These metrics may not reflect the true user QoS in future heterogeneous networks (HetNets) with many small cells, due to large traffic variations in overlapping cells with complicated interference conditions. This paper studies the spectrum allocation problem in HetNets using the average packet sojourn time as the performance metric. To be specific, in a HetNet with K base transceiver stations (BTSs), we determine the optimal partition of the spectrum into the 2^K possible spectrum sharing combinations. We use an interactive queueing model to characterize the flow-level performance, where the service rates are determined by the spectrum partition. The spectrum allocation problem is formulated using a conservative approximation, which makes the optimization problem convex. We prove that in the optimal solution the spectrum is divided into at most K pieces. A numerical algorithm is provided to solve the spectrum allocation problem on a slow timescale using aggregate traffic and service information. Simulation results show that the proposed solution achieves significant gains compared to both orthogonal and full spectrum reuse allocations under moderate to heavy traffic.
    Comment: 6 pages, 5 figures, IEEE GLOBECOM 2014 (accepted for publication)
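The interactive queueing behaviour, and the conservatism of a worst-case service-rate model, can be seen in a toy discrete-time simulation of two coupled queues. The rates, time step, and horizon below are illustrative assumptions; the paper's model is more general.

```python
import random

def simulate_coupled_queues(lam, mu_alone, mu_shared, horizon=200000,
                            dt=0.01, seed=1):
    """Discrete-time toy of two interacting queues sharing spectrum: a BTS
    drains its queue at rate mu_alone[i] when the other queue is empty, and
    at the lower rate mu_shared[i] (due to interference) when both queues
    are busy.  Returns the time-average queue lengths."""
    random.seed(seed)
    q = [0, 0]
    area = [0.0, 0.0]
    for _ in range(horizon):
        for i in (0, 1):
            if random.random() < lam[i] * dt:       # Poisson arrival
                q[i] += 1
            if q[i] > 0:
                mu = mu_shared[i] if q[1 - i] > 0 else mu_alone[i]
                if random.random() < mu * dt:       # service completion
                    q[i] -= 1
            area[i] += q[i] * dt
    return [a / (horizon * dt) for a in area]
```

A conservative model that always assumes interference (service rate mu_shared) predicts a mean queue length of rho/(1 - rho) = 1.0 for lam = 0.3 and mu_shared = 0.6; the coupled system does at least as well, since each queue sometimes sees the faster rate. This is the sense in which a worst-case decoupling is conservative.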

    The role of asymptotic functions in network optimization and feasibility studies

    Solutions to network optimization problems have greatly benefited from developments in nonlinear analysis, and, in particular, from developments in convex optimization. A key concept that has made convex and nonconvex analysis an important tool in science and engineering is the notion of asymptotic function, which is often hidden in many influential studies on nonlinear analysis and related fields. Therefore, we can also expect that asymptotic functions are deeply connected to many results in the wireless domain, even though they are rarely mentioned in the wireless literature. In this study, we show connections of this type. By doing so, we explain many properties of centralized and distributed solutions to wireless resource allocation problems within a unified framework, and we also generalize and unify existing approaches to feasibility analysis of network designs. In particular, we show necessary and sufficient conditions for mappings widely used in wireless communication problems (more precisely, the class of standard interference mappings) to have a fixed point. Furthermore, we derive fundamental bounds on the utility and the energy efficiency that can be achieved by solving a large family of max-min utility optimization problems in wireless networks.
    Comment: GlobalSIP 2017 (to appear)
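For the special case of an affine standard interference mapping T(p) = Mp + u, the asymptotic function is the linear map p -> Mp, and a fixed point exists exactly when the spectral radius of M is below one. The sketch below (illustrative, with a hand-picked nonnegative matrix) demonstrates this with plain power iteration; it is a special case, not the paper's general result.

```python
def affine_interference_map(M, u):
    """Build an affine standard interference mapping T(p) = M p + u,
    with M entrywise nonnegative and u entrywise positive."""
    def T(p):
        return [sum(M[i][j] * p[j] for j in range(len(p))) + u[i]
                for i in range(len(M))]
    return T

def iterate_to_fixed_point(T, n, iters=500):
    """Fixed-point iteration from the origin; converges when one exists."""
    p = [0.0] * n
    for _ in range(iters):
        p = T(p)
    return p

def power_iteration_radius(M, iters=200):
    """Estimate the spectral radius of a nonnegative matrix by power
    iteration with max-norm rescaling."""
    n = len(M)
    v = [1.0] * n
    r = 0.0
    for _ in range(iters):
        w = [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        r = max(w)
        v = [x / r for x in w]
    return r
```

For M = [[0.2, 0.4], [0.3, 0.2]] the spectral radius is 0.2 + sqrt(0.12) < 1, so the iteration converges to the unique fixed point (I - M)^(-1) u; scaling M above the radius-one threshold would make the iterates diverge.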

    A survey of self organisation in future cellular networks

    This article surveys the literature of the last decade on the emerging field of self organisation as applied to wireless cellular communication networks. Self organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; however, in the context of wireless cellular networks, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self organisation in wireless cellular communication networks.