41 research outputs found

    Participatory sensing as an enabler for self-organisation in future cellular networks

    In this short review paper, we summarise the emerging challenges in the field of participatory sensing for the self-organisation of the next generation of wireless cellular networks. We identify the potential of participatory sensing in enabling the self-organisation, deployment optimisation and radio resource management of wireless cellular networks. We also highlight how this approach can meet the future goals of next-generation cellular systems in terms of infrastructure sharing, management of multiple radio access techniques, flexible usage of spectrum and efficient management of very small data cells.

    Joint Dynamic Radio Resource Allocation and Mobility Load Balancing in 3GPP LTE Multi-Cell Network

    Load imbalance and inefficient utilization of system resources are major factors responsible for poor overall performance in Long Term Evolution (LTE) networks. In this paper, a novel scheme for joint dynamic resource allocation and load balancing is proposed to achieve a balanced performance improvement in 3rd Generation Partnership Project (3GPP) LTE Self-Organizing Networks (SON). The new method, which aims at maximizing network resource efficiency subject to inter-cell interference and intra-cell resource constraints, is implemented in two steps. In the first step, an efficient resource allocation, including user scheduling and power assignment, is carried out in a distributed manner to serve as many users across the whole network as possible. In the second step, based on the resulting resource allocation, the optimization objective, namely network resource efficiency, is calculated and load balancing is performed by switching the user whose handover maximizes the objective function. The Lagrange multiplier method and a heuristic algorithm are used to solve the formulated optimization problem. Simulation results show that our algorithm achieves better performance in terms of user throughput, fairness, load balancing index and number of unsatisfied users compared with the traditional approach, which treats resource allocation and load balancing separately.
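    As an illustration of the two-step idea above, the following Python sketch pairs a made-up network-resource-efficiency metric with a greedy single-user switch; the cell model, the metric and the switching rule are assumptions for illustration, not the paper's Lagrangian formulation.

```python
# Hypothetical sketch of the load-balancing step described above: evaluate
# every single-user switch between cells and pick the one that most improves
# a toy "network resource efficiency" metric.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple


@dataclass
class Cell:
    capacity_prb: int                                       # PRBs available in the cell
    users: Dict[str, float] = field(default_factory=dict)   # user id -> demanded PRBs

    def load(self) -> float:
        return sum(self.users.values()) / self.capacity_prb


def resource_efficiency(cells: List[Cell]) -> float:
    # Toy proxy for network resource efficiency: served demand penalised by
    # the load imbalance across cells (not the paper's exact objective).
    loads = [c.load() for c in cells]
    served = sum(min(1.0, load) for load in loads)
    return served - (max(loads) - min(loads))


def best_single_switch(cells: List[Cell]) -> Tuple[Optional[Tuple[str, int, int]], float]:
    """Try handing each user over to every other cell and return the switch
    (user, source index, target index) that maximises the metric gain."""
    base = resource_efficiency(cells)
    best_gain, best_move = 0.0, None
    for i, src in enumerate(cells):
        for user, demand in list(src.users.items()):
            for j, dst in enumerate(cells):
                if i == j:
                    continue
                del src.users[user]          # tentatively move the user
                dst.users[user] = demand
                gain = resource_efficiency(cells) - base
                del dst.users[user]          # undo the tentative move
                src.users[user] = demand
                if gain > best_gain:
                    best_gain, best_move = gain, (user, i, j)
    return best_move, best_gain


if __name__ == "__main__":
    cells = [Cell(50, {"u1": 30, "u2": 25}), Cell(50, {"u3": 5})]
    print(best_single_switch(cells))   # suggests moving "u2" to the lightly loaded cell
```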

    Memory-full context-aware predictive mobility management in dual connectivity 5G networks

    Network densification with small cell deployment is being considered as one of the dominant themes in the fifth generation (5G) cellular system. Despite the capacity gains, such deployment scenarios raise several challenges from a mobility management perspective. The small cell size, which implies a short cell residence time, will increase the handover (HO) rate dramatically. Consequently, HO latency will become a critical consideration in the 5G era, calling for an intelligent, fast and lightweight HO procedure with minimal signalling overhead. In this direction, we propose a memory-full context-aware HO scheme with mobility prediction to achieve the aforementioned objectives. We consider a dual connectivity radio access network architecture with logical separation between control and data planes, because it offers relaxed constraints for implementing predictive approaches. The proposed scheme predicts future HO events along with the expected HO time by combining radio frequency performance with physical proximity and the user context in terms of speed, direction and HO history. To minimise the processing and storage requirements whilst improving the prediction performance, a user-specific prediction triggering threshold is proposed. The prediction outcome is used to perform advance HO signalling whilst suspending the periodic transmission of measurement reports. Analytical and simulation results show that the proposed scheme provides promising gains over the conventional approach.
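    The following sketch illustrates, with entirely hypothetical weights, thresholds and field names, how a context-aware predictor of the kind described above might fuse the serving-cell signal trend, user speed and heading, and per-user HO history, and evaluate the prediction only once a user-specific triggering threshold is exceeded.

```python
# Illustrative sketch (not the paper's algorithm) of context-aware handover
# prediction with a user-specific triggering threshold.
import math
from dataclasses import dataclass, field
from typing import List


@dataclass
class UserContext:
    speed_mps: float                 # current speed estimate
    heading_rad: float               # direction of travel
    rsrp_trend_db: float             # slope of serving-cell RSRP over time
    ho_history: List[str] = field(default_factory=list)  # past target cells

    def trigger_threshold(self) -> float:
        # User-specific threshold: faster users are evaluated sooner (assumed rule).
        return max(0.2, 1.0 - self.speed_mps / 30.0)


def handover_probability(ctx: UserContext, target_bearing_rad: float) -> float:
    """Combine radio trend, geometry and HO history into a score in [0, 1]."""
    radio = 1.0 / (1.0 + math.exp(ctx.rsrp_trend_db))        # falling RSRP -> high score
    geometry = 0.5 * (1.0 + math.cos(ctx.heading_rad - target_bearing_rad))
    history = min(1.0, len(ctx.ho_history) / 10.0)            # frequently handed-over users
    return 0.5 * radio + 0.3 * geometry + 0.2 * history       # made-up weights


def maybe_prepare_handover(ctx: UserContext, target_bearing_rad: float) -> str:
    score = handover_probability(ctx, target_bearing_rad)
    if score < ctx.trigger_threshold():
        return "keep periodic measurement reports"
    # Predicted HO: start advance signalling towards the target cell and
    # suspend periodic measurement reporting to cut overhead.
    return "advance HO signalling; measurement reports suspended"


if __name__ == "__main__":
    ctx = UserContext(speed_mps=25.0, heading_rad=0.1, rsrp_trend_db=-2.0,
                      ho_history=["cellA", "cellB", "cellC"])
    print(maybe_prepare_handover(ctx, target_bearing_rad=0.0))
```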

    An Energy-Aware Protocol for Self-Organizing Heterogeneous LTE Systems

    This paper studies the problem of self-organizing heterogeneous LTE systems. We propose a model that jointly considers several important characteristics of heterogeneous LTE systems, including the use of orthogonal frequency division multiple access (OFDMA), frequency-selective fading on each link, interference among different links, and the different transmission capabilities of different types of base stations. We also consider the cost of energy by taking into account both the transmission and operating power consumption of base stations and the price of energy. Based on this model, we propose a distributed protocol that improves the spectrum efficiency of the system, measured in terms of weighted proportional fairness among client throughputs, and reduces the energy cost. We identify several important components of this problem and propose distributed strategies for each of them. Each of the proposed strategies requires only small computational and communication overhead, and the interactions between components are taken into account, so that the strategies jointly consider all factors of heterogeneous LTE systems. Simulation results show that our proposed strategies achieve much better performance than existing ones.
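    A minimal sketch of the trade-off the abstract describes, with assumed function and parameter names: weighted proportional fairness (the sum of weighted log-throughputs) set against the monetary cost of transmission and operating power, and a base station kept active only when its net contribution is positive. The actual protocol is distributed and additionally handles OFDMA scheduling, frequency-selective fading and inter-link interference.

```python
# Toy energy-aware utility trade-off: fairness gain of a base station versus
# the cost of the energy it consumes (all names and units are assumptions).
import math
from typing import Dict


def fairness_utility(throughputs_bps: Dict[str, float],
                     weights: Dict[str, float]) -> float:
    # Weighted proportional fairness: sum_i w_i * log(R_i).
    return sum(weights[u] * math.log(r) for u, r in throughputs_bps.items())


def energy_cost(tx_power_w: float, op_power_w: float,
                price_per_joule: float, period_s: float) -> float:
    # Transmission plus operating power over the period, priced per joule.
    return (tx_power_w + op_power_w) * period_s * price_per_joule


def keep_base_station_on(utility_with_bs: float, utility_without_bs: float,
                         tx_power_w: float, op_power_w: float,
                         price_per_joule: float, period_s: float) -> bool:
    """Keep a (small) base station active only if its fairness contribution
    outweighs its energy cost; the weighting of the two terms is illustrative."""
    net = (utility_with_bs - utility_without_bs) - energy_cost(
        tx_power_w, op_power_w, price_per_joule, period_s)
    return net > 0.0
```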

    Big data assisted CRAN enabled 5G SON architecture

    The recent development of Big Data, the Internet of Things (IoT) and 5G network technology offers a plethora of opportunities to the IT industry and mobile network operators. 5G cellular technology promises to offer connectivity to massive numbers of IoT devices while meeting low-latency data transmission requirements. A deficiency of current 4G networks is that data from IoT devices and mobile nodes are merely passed on to the cloud, and the communication infrastructure plays no part in data analysis. Instead of only passing data on to the cloud, the system could also contribute to data analysis and decision-making. In this work, a Big Data driven self-optimized 5G network design is proposed, building on the emerging technologies CRAN, NFV and SDN. Some technical impediments to 5G network optimization are also discussed, and a case study is presented to demonstrate how Big Data can assist in solving the resource allocation problem.

    Self-organizing inter-cell interference coordination in 4G and beyond networks using genetic algorithms

    The design objective of 4G and beyond networks is not only to provide high-data-rate services but also to ensure a good subscriber experience in terms of quality of service. The main challenge to this objective, however, is the growing size and heterogeneity of these networks. This paper proposes a genetic-algorithm-based approach for the self-optimization of downlink inter-cell interference coordination parameters in Long Term Evolution (LTE) networks. The proposed algorithm is generic in nature and operates in an environment with variations in traffic, user positions and propagation conditions. A comprehensive analysis of the obtained simulation results is presented, showing that the proposed approach can significantly improve network coverage in terms of call accept rate as well as capacity in terms of throughput.
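    The toy genetic algorithm below, with an assumed cell count, candidate power back-off levels and a placeholder fitness function standing in for the network simulator, illustrates the style of parameter search the abstract describes for downlink inter-cell interference coordination.

```python
# Toy GA sketch: each chromosome holds one ICIC parameter per cell (here, a
# power back-off on cell-edge resources); the fitness function is a stand-in
# for the simulator that would report call accept rate and throughput.
import random
from typing import List

N_CELLS = 7
LEVELS = [0, 3, 6, 9]          # candidate power back-off values in dB (assumed)


def fitness(chromosome: List[int]) -> float:
    # Placeholder: reward diverse, moderate back-off choices (illustrative only).
    diversity = len(set(chromosome))
    return diversity - 0.1 * sum(chromosome) / len(chromosome)


def evolve(pop_size: int = 20, generations: int = 50) -> List[int]:
    population = [[random.choice(LEVELS) for _ in range(N_CELLS)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CELLS)          # single-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                    # mutation
                child[random.randrange(N_CELLS)] = random.choice(LEVELS)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)


if __name__ == "__main__":
    print("best ICIC back-off pattern:", evolve())
```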

    Joint Beamforming and Power Control in Coordinated Multicell: Max-Min Duality, Effective Network and Large System Transition

    This paper studies joint beamforming and power control in a coordinated multicell downlink system that serves multiple users per cell so as to maximize the minimum weighted signal-to-interference-plus-noise ratio. The optimal solution and a distributed algorithm with a geometrically fast convergence rate are derived by employing nonlinear Perron-Frobenius theory and multicell network duality. The iterative algorithm, though operating in a distributed manner, still requires instantaneous power updates within the coordinated cluster through the backhaul, and this information exchange and message passing may become prohibitive as the numbers of transmit antennas and users grow. In order to derive an asymptotically optimal solution, random matrix theory is leveraged to design a distributed algorithm that only requires statistical information, so that no instantaneous power updates need to pass through the backhaul. Moreover, by using nonlinear Perron-Frobenius theory and random matrix theory, an effective primal network and an effective dual network are proposed to characterize and interpret the asymptotic solution. Comment: some typos in the version published in the IEEE Transactions on Wireless Communications are corrected.
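    For a flavour of the underlying mathematics, the sketch below implements the classical normalised fixed-point power iteration associated with nonlinear Perron-Frobenius theory for max-min weighted SINR under a total power budget and fixed beamformers. It is a standard illustrative iteration, not the paper's joint beamforming algorithm, its multicell duality or its random-matrix-theory extension.

```python
# Normalised fixed-point iteration for max-min weighted SINR with a total
# power budget (fixed beamformers); converges geometrically to the allocation
# that equalises the weighted SINRs.
import numpy as np


def max_min_power_iteration(G: np.ndarray, weights: np.ndarray,
                            noise: np.ndarray, p_total: float,
                            iters: int = 100) -> np.ndarray:
    """G[i, j]: effective channel gain from transmitter j to receiver i.
    Returns powers p (summing to p_total) equalising SINR_i / weights_i."""
    n = G.shape[0]
    p = np.full(n, p_total / n)
    for _ in range(iters):
        interference = G @ p - np.diag(G) * p + noise        # per-user I + N
        f = weights * interference / np.diag(G)               # power needed per unit weighted SINR
        p = p_total * f / f.sum()                              # normalise to the power budget
    return p


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = rng.uniform(0.05, 0.2, size=(4, 4)) + np.diag(rng.uniform(1.0, 2.0, 4))
    noise = np.full(4, 0.1)
    p = max_min_power_iteration(G, weights=np.ones(4), noise=noise, p_total=1.0)
    sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)
    print("powers:", p)
    print("SINRs (approximately equal):", sinr)
```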