
    Leveraging intelligence from network CDR data for interference aware energy consumption minimization

    Cell densification is perceived as the panacea for the imminent capacity crunch. However, high aggregate energy consumption and increased inter-cell interference (ICI) caused by densification remain two long-standing problems. We propose a novel network orchestration solution for simultaneously minimizing energy consumption and ICI in ultra-dense 5G networks. The proposed solution builds on a big data analysis of over 10 million CDRs from a real network, which shows that real network traffic patterns exhibit strong spatio-temporal predictability. Leveraging this, we develop a novel scheme to proactively schedule radio resources and small cell sleep cycles, yielding substantial energy savings and reduced ICI without compromising users' QoS. This scheme is derived by formulating a joint energy consumption and ICI minimization problem and solving it through a combination of binary linear integer programming and a progressive-analysis-based heuristic algorithm. Evaluations using 1) a HetNet deployment designed for the city of Milan, where big data analytics are applied to real CDR data from the Telecom Italia network to model traffic patterns, and 2) NS-3 based Monte Carlo simulations with synthetic Poisson traffic show that, compared to the full frequency reuse and always-on approach, in the best case the proposed scheme can reduce energy consumption in HetNets to 1/8th while providing the same or better QoS.
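    As a rough illustration of the kind of optimization involved, the sketch below sets up a small binary integer program that keeps just enough small cells awake in each time slot to carry a traffic forecast while minimizing active power. The cell capacities, power figures, demand forecast and the use of scipy.optimize.milp are illustrative assumptions, not the paper's formulation, which additionally handles ICI.

```python
# Hedged sketch: binary integer program choosing which small cells stay active
# in each time slot so the predicted demand is still served at minimum power.
# All numbers below are invented for illustration.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

n_cells, n_slots = 4, 6
p_active = np.array([10.0, 12.0, 9.0, 11.0])        # W consumed per active small cell
capacity = np.array([80.0, 100.0, 70.0, 90.0])      # Mbps offered by an active cell
demand = np.array([120.0, 60.0, 40.0, 150.0, 90.0, 30.0])  # forecast Mbps per slot

# Binary decision x[c, t] = 1 if cell c stays awake in slot t (flattened cell-major).
cost = np.repeat(p_active, n_slots)                  # objective: total active power

# Coverage constraint per slot: sum_c capacity[c] * x[c, t] >= demand[t]
A = np.zeros((n_slots, n_cells * n_slots))
for t in range(n_slots):
    for c in range(n_cells):
        A[t, c * n_slots + t] = capacity[c]
coverage = LinearConstraint(A, lb=demand, ub=np.inf)

res = milp(c=cost, constraints=[coverage],
           integrality=np.ones(n_cells * n_slots),   # all variables integer (binary via bounds)
           bounds=Bounds(0, 1))
schedule = res.x.reshape(n_cells, n_slots).round().astype(int)
print("cell x slot active/sleep schedule:\n", schedule)
```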

    Energy-efficiency for MISO-OFDMA based user-relay assisted cellular networks

    Improving energy efficiency (EE) without sacrificing service quality has become an important concern. The combination of orthogonal frequency-division multiple-access (OFDMA), multi-antenna transmission technology and relaying is one of the key technologies for delivering reliable, high-data-rate coverage in the most cost-effective manner. In this paper, EE is studied for downlink multiple-input single-output (MISO)-OFDMA based user-relay assisted cellular networks. EE maximization is formulated for the decode-and-forward (DF) relaying scheme, taking into account both transmit and circuit power consumption as well as the data rate requirements of the mobile users. The quality-of-service (QoS)-constrained EE maximization problem, defined for multi-carrier, multi-user, multi-relay and multi-antenna networks, is non-convex and combinatorial, and therefore hard to tackle. To solve this difficult problem, a radio resource management (RRM) algorithm that handles subcarrier allocation, mode selection and power allocation separately is proposed. The efficiency of the proposed algorithm is demonstrated by numerical results for different system parameters.
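    EE maximization problems of this fractional rate-over-power form are commonly handled with a Dinkelbach-style iteration. The hedged sketch below shows that generic template for a single transmitter with per-subcarrier gains; it is not the paper's RRM algorithm (which also covers subcarrier allocation, mode selection and relaying), and the channel gains, circuit power and the helper name dinkelbach_ee are invented.

```python
# Generic Dinkelbach-style loop for maximising
# EE = sum_k log2(1 + g_k p_k) / (P_circuit + sum_k p_k).
# Illustrative template only; numbers are made up.
import numpy as np

def dinkelbach_ee(gains, p_circuit, max_iter=50, tol=1e-6):
    lam = 0.1                                   # initial EE guess
    p = np.zeros_like(gains)
    for _ in range(max_iter):
        # Inner problem max_p sum log2(1 + g p) - lam * (sum p + P_c)
        # has a water-filling style closed form for p_k >= 0.
        p = np.maximum(0.0, 1.0 / (lam * np.log(2)) - 1.0 / gains)
        rate = np.sum(np.log2(1.0 + gains * p))
        power = p_circuit + np.sum(p)
        if rate - lam * power < tol:            # Dinkelbach stopping criterion
            break
        lam = rate / power                      # update EE estimate
    return p, lam

gains = np.array([2.0, 0.8, 1.5, 0.3])          # effective channel gains per subcarrier
p_alloc, ee = dinkelbach_ee(gains, p_circuit=1.0)
print("power allocation:", p_alloc.round(3), "EE ~", round(ee, 3))
```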

    Energy-Efficient Heterogeneous Cellular Networks with Spectrum Underlay and Overlay Access

    In this paper, we provide joint subcarrier assignment and power allocation schemes for quality-of-service (QoS)-constrained energy-efficiency (EE) optimization in the downlink of an orthogonal frequency division multiple access (OFDMA)-based two-tier heterogeneous cellular network (HCN). Considering underlay transmission, where spectrum efficiency (SE) is fully exploited, the EE solution involves tackling a complex mixed-combinatorial and non-convex optimization problem. With an appropriate decomposition of the original problem and leveraging the quasi-concavity of the EE function, we propose a dual-layer resource allocation approach and provide a complete solution using difference-of-two-concave-functions approximation, successive convex approximation, and gradient-search methods. On the other hand, the inherent inter-tier interference from spectrum underlay access may degrade EE, particularly under dense small-cell deployment and large bandwidth utilization. We therefore develop a novel resource allocation approach based on the concepts of spectrum overlay access and resource efficiency (RE) (normalized EE-SE trade-off). Specifically, the optimization procedure is separated in this case such that the macro-cell's optimal RE and corresponding bandwidth are first determined, and then the EE of the small cells utilizing the remaining spectrum is maximized. Simulation results confirm the theoretical findings and demonstrate that the proposed resource allocation schemes can approach the optimal EE, with each strategy being superior under certain system settings.
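    One ingredient the abstract relies on is the quasi-concavity of the EE function, which in one dimension makes a simple unimodal search sufficient. The toy sketch below illustrates that idea for a single-link EE model using golden-section search; the bandwidth, channel gain, circuit power and helper names are assumptions for illustration, not the paper's multi-tier formulation.

```python
# Toy illustration: EE(P) = B*log2(1 + g*P/N0) / (P_circuit + P) is quasi-concave
# in the transmit power P, so a unimodal search finds its maximiser.
import math

def ee(p_tx, bandwidth=10e6, gain=1e-7, noise=1e-13, p_circuit=0.5):
    """Energy efficiency in bit/Joule for a single link (toy model)."""
    rate = bandwidth * math.log2(1.0 + gain * p_tx / noise)
    return rate / (p_circuit + p_tx)

def golden_section_max(f, lo, hi, tol=1e-6):
    """Maximise a unimodal (quasi-concave) function on [lo, hi]."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    while b - a > tol:
        x1 = b - inv_phi * (b - a)
        x2 = a + inv_phi * (b - a)
        if f(x1) < f(x2):
            a = x1
        else:
            b = x2
    return 0.5 * (a + b)

p_opt = golden_section_max(ee, 1e-3, 10.0)       # search 1 mW .. 10 W
print(f"EE-optimal transmit power ~ {p_opt:.3f} W, EE ~ {ee(p_opt):.3e} bit/J")
```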

    Interference-Aware Downlink Resource Management for OFDMA Femtocell Networks

    Femtocells are an economical solution for providing high-speed indoor communication as an alternative to conventional macro-cellular networks. In particular, OFDMA femtocells are considered in next-generation cellular networks such as 3GPP LTE and mobile WiMAX. Although femtocells have great advantages for accommodating indoor users, interference management is a critical issue in operating femtocell networks. Existing OFDMA resource management algorithms only consider optimizing system-centric metrics and cannot manage co-channel interference. Moreover, it is hard for a femtocell to cooperate with other femtocells to control interference, owing to its self-configurable characteristics. This paper proposes a novel interference-aware resource allocation algorithm for OFDMA femtocell networks. The proposed algorithm allocates resources according to a new objective function that reflects the effect of interference, and a heuristic algorithm is also introduced to reduce the complexity of the original problem. Monte Carlo simulations are performed to evaluate the performance of the proposed algorithm compared to existing solutions.
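    To make the idea of an interference-reflecting objective concrete, the hypothetical sketch below greedily assigns each subcarrier to the user with the best rate-minus-interference-penalty utility, and mutes a subcarrier when the penalty dominates. The gains, the leakage model and the weight alpha are invented; this is not the algorithm proposed in the paper.

```python
# Hypothetical greedy allocation with an interference-aware utility.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_subcarriers = 3, 8
own_gain = rng.uniform(0.5, 2.0, size=(n_users, n_subcarriers))  # desired-link gains
leak_gain = rng.uniform(0.0, 1.0, size=n_subcarriers)            # leakage towards neighbour cells
noise, p_sc, alpha = 0.1, 1.0, 3.0                                # alpha weights the interference penalty

assignment = {}
for sc in range(n_subcarriers):
    rate = np.log2(1.0 + own_gain[:, sc] * p_sc / noise)          # per-user rate on this subcarrier
    penalty = alpha * leak_gain[sc] * p_sc                        # interference caused to neighbours
    utility = rate - penalty                                      # interference-aware objective
    best = int(np.argmax(utility))
    # leave the subcarrier idle when the interference cost outweighs the benefit
    assignment[sc] = best if utility[best] > 0 else None

print("subcarrier -> scheduled user (None = muted):", assignment)
```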

    A Survey on Delay-Aware Resource Control for Wireless Systems --- Large Deviation Theory, Stochastic Lyapunov Drift and Distributed Stochastic Learning

    In this tutorial paper, a comprehensive survey is given of several major systematic approaches to delay-aware control problems, namely the equivalent rate constraint approach, the Lyapunov stability drift approach and the approximate Markov Decision Process (MDP) approach using stochastic learning. These approaches essentially embrace most of the existing literature on delay-aware resource control in wireless systems. They have their relative pros and cons in terms of performance, complexity and implementation issues. For each of the approaches, the problem setup, the general solution and the design methodology are discussed. Applications of these approaches to delay-aware resource allocation are illustrated with examples in single-hop wireless networks. Furthermore, recent results regarding delay-aware multi-hop routing designs in general multi-hop networks are elaborated. Finally, the delay performance of the various approaches is compared through simulations using an example of uplink OFDMA systems.
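    Of the approaches surveyed, the Lyapunov drift approach admits a particularly compact illustration: a drift-plus-penalty scheduler weighs queue backlogs against a power penalty each slot. The sketch below is a minimal toy version under made-up arrival rates, channel rates and trade-off parameter V, not a result taken from the survey.

```python
# Toy drift-plus-penalty scheduler: each slot, serve the user maximising
# Q_k * r_k - V * power, then update the queues.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_slots, V = 3, 1000, 5.0
arrival_rate = np.array([0.4, 0.3, 0.2])           # mean packet arrivals per slot
tx_power = 1.0                                      # power cost when a user transmits
Q = np.zeros(n_users)                               # queue backlogs

for t in range(n_slots):
    rates = rng.uniform(0.5, 2.0, size=n_users)     # per-slot achievable service rates
    weight = Q * rates - V * tx_power               # drift-plus-penalty weights
    k = int(np.argmax(weight))
    served = np.zeros(n_users)
    if weight[k] > 0:                               # otherwise stay idle and save power
        served[k] = rates[k]
    Q = np.maximum(Q - served, 0.0) + rng.poisson(arrival_rate)

print("final queue backlogs:", Q.round(2))
```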

    Scheduling for Multi-Camera Surveillance in LTE Networks

    Wireless surveillance in cellular networks has become increasingly important, and commercial LTE surveillance cameras are now available. Nevertheless, most scheduling algorithms in the literature are throughput-, fairness-, or profit-based approaches, which are not suitable for wireless surveillance. In this paper, therefore, we explore the resource allocation problem for a multi-camera surveillance system in 3GPP Long Term Evolution (LTE) uplink (UL) networks. We minimize the number of allocated resource blocks (RBs) while guaranteeing the coverage requirement for surveillance systems in LTE UL networks. Specifically, we formulate the Camera Set Resource Allocation Problem (CSRAP) and prove that the problem is NP-hard. We then propose an Integer Linear Programming formulation for general cases to find the optimal solution. Moreover, we present a baseline algorithm and devise an approximation algorithm to solve the problem. Simulation results based on a real surveillance map and synthetic datasets demonstrate that the number of allocated RBs can be effectively reduced compared to the existing approach for LTE networks.
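    For a flavour of such an ILP formulation, the toy model below (written with the PuLP modelling library, an assumption on tooling) selects a set of cameras with minimum total RB cost so that every surveillance point is covered by at least one selected camera. The coverage map and per-camera RB costs are invented; CSRAP as formulated in the paper includes further LTE UL constraints.

```python
# Toy set-cover style ILP: minimise total RBs while covering all surveillance points.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

covers = {                       # camera -> surveillance points it can observe
    "cam0": {"p0", "p1"},
    "cam1": {"p1", "p2", "p3"},
    "cam2": {"p0", "p3"},
    "cam3": {"p2"},
}
rb_cost = {"cam0": 4, "cam1": 6, "cam2": 3, "cam3": 2}    # RBs needed per camera
points = set().union(*covers.values())

prob = LpProblem("csrap_toy", LpMinimize)
select = {c: LpVariable(f"select_{c}", cat=LpBinary) for c in covers}
prob += lpSum(rb_cost[c] * select[c] for c in covers)      # minimise total allocated RBs
for p in points:                                           # every point covered at least once
    prob += lpSum(select[c] for c in covers if p in covers[c]) >= 1

prob.solve()
chosen = [c for c in covers if select[c].value() == 1]
print("selected cameras:", chosen)
```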