553 research outputs found

    Control-data separation architecture for cellular radio access networks: a survey and outlook

    Conventional cellular systems are designed to ensure ubiquitous coverage with an always-present wireless channel, irrespective of the spatial and temporal demand for service. This approach raises several problems due to the tight coupling between network and data access points, as well as the paradigm shift towards data-oriented services, heterogeneous deployments and network densification. A logical separation between control and data planes is seen as a promising solution that could overcome these issues, by providing data services under the umbrella of a coverage layer. This article presents a holistic survey of existing literature on the control-data separation architecture (CDSA) for cellular radio access networks. As a starting point, we discuss the fundamentals, concepts, and general structure of the CDSA. Then, we point out limitations of the conventional architecture in futuristic deployment scenarios. In addition, we present and critically discuss the work that has been done to investigate potential benefits of the CDSA, as well as its technical challenges and enabling technologies. Finally, an overview of standardisation proposals related to this research vision is provided.

    On the security of software-defined next-generation cellular networks

    In recent years, mobile cellular networks have been undergoing fundamental changes and many established concepts are being revisited. Future 5G network architectures will be designed to employ a wide range of new and emerging technologies such as Software Defined Networking (SDN) and Network Functions Virtualization (NFV). These create new virtual network elements, each affecting the logic of network management and operation, enabling the creation of new-generation services with substantially higher data rates and lower delays. However, new security challenges and threats are also introduced. Current Long-Term Evolution (LTE) networks are not able to accommodate these new trends in a secure and reliable way. At the same time, novel 5G systems offer invaluable opportunities for developing novel solutions for attack prevention, management, and recovery. In this paper, first we discuss the main security threats and possible attack vectors in cellular networks. Second, driven by the emerging next-generation cellular networks, we discuss the architectural and functional requirements to enable appropriate levels of security.

    Load balancing using cell range expansion in LTE advanced heterogeneous networks

    The use of heterogeneous networks is on the increase, fuelled by consumer demand for more data. The main objective of heterogeneous networks is to increase capacity. They offer solutions for efficient use of spectrum, load balancing and improvement of cell edge coverage, amongst others. However, these solutions have inherent challenges such as inter-cell interference and poor mobility management. In heterogeneous networks there is a transmit power disparity between the macro cell and pico cell tiers, which causes load imbalance between the tiers. Due to the conventional user-cell association strategy, whereby users associate to the base station with the strongest received signal strength, few users associate to small cells compared to macro cells. To counter the effects of transmit power disparity, cell range expansion is used instead of the conventional strategy. The focus of our work is on load balancing using cell range expansion (CRE) and network utility optimization techniques to ensure fair sharing of load in a macro and pico cell LTE Advanced heterogeneous network. The aim is to investigate how to use an adaptive cell range expansion bias to optimize pico cell coverage for load balancing. Reviewed literature points out several approaches to solve the load balancing problem in heterogeneous networks, which include cell range expansion and utility function optimization. We use cell range expansion and logarithmic utility functions to design a load balancing algorithm. In the algorithm, user and base station associations are optimized by adapting the CRE bias to pico base station load status. A price update mechanism based on a suboptimal solution of a network utility optimization problem is used to adapt the CRE bias. The price is derived from the load status of each pico base station. The performance of the algorithm was evaluated by means of an LTE MATLAB toolbox.
Simulations were conducted according to 3GPP and ITU guidelines for modelling heterogeneous networks and propagation environments, respectively. Compared to a static CRE configuration, the algorithm achieved more fairness in load distribution. Further, it achieved a better trade-off between cell edge and cell centre user throughputs.
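The biased association and price-style bias update described above can be sketched roughly as follows. This is an illustrative toy, not the thesis's algorithm: the load-target form of the update, the step size, and the bias cap are all assumptions.

```python
# Illustrative sketch of biased user-cell association with an adaptive
# cell range expansion (CRE) bias. The load-target price update, step
# size, and bias cap are assumptions made for illustration only.

def associate(users_rx_dbm, bias_db):
    """Each user attaches to the cell maximising received power + CRE bias."""
    return [max(rx, key=lambda c: rx[c] + bias_db.get(c, 0.0))
            for rx in users_rx_dbm]

def update_bias(bias_db, assoc, target_load, step_db=0.5, max_bias_db=9.0):
    """Raise a pico's bias while it is under-loaded, lower it when over-loaded."""
    new = dict(bias_db)
    for cell, target in target_load.items():
        load = assoc.count(cell)
        if load < target:
            new[cell] = min(max_bias_db, new[cell] + step_db)
        elif load > target:
            new[cell] = max(0.0, new[cell] - step_db)
    return new

# Example: the pico is 4 dB weaker than the macro for the nearest user;
# the bias grows until that user is offloaded to the pico.
users = [{"macro": -80.0, "pico": -84.0}, {"macro": -70.0, "pico": -95.0}]
bias = {"pico": 0.0}
for _ in range(12):
    attached = associate(users, bias)
    bias = update_bias(bias, attached, {"pico": 1})
```

Once the pico reaches its target load, the update leaves the bias unchanged, so the association stabilises instead of oscillating.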

    A survey of online data-driven proactive 5G network optimisation using machine learning

    In fifth-generation (5G) mobile networks, proactive network optimisation plays an important role in meeting exponential traffic growth and more stringent service requirements, and in reducing capital and operational expenditure. Proactive network optimisation is widely acknowledged as one of the most promising ways to transform the 5G network based on big data analysis and cloud-fog-edge computing, but there are many challenges. Proactive algorithms will require accurate forecasting of highly contextualised traffic demand and quantifying the uncertainty to drive decision making with performance guarantees. Context in Cyber-Physical-Social Systems (CPSS) is often challenging to uncover, unfolds over time, and is even more difficult to quantify and integrate into decision making. The first part of the review focuses on mining and inferring CPSS context from heterogeneous data sources, such as online user-generated content. It examines the state-of-the-art methods currently employed to infer location, social behaviour, and traffic demand through a cloud-edge computing framework, combining them to form the input to proactive algorithms. The second part of the review focuses on exploiting and integrating the demand knowledge for a range of proactive optimisation techniques, including the key aspects of load balancing, mobile edge caching, and interference management. In both parts, appropriate state-of-the-art machine learning techniques (including probabilistic uncertainty cascades in proactive optimisation), complexity-performance trade-offs, and demonstrative examples are presented to inspire readers. This survey couples the potential of online big data analytics, cloud-edge computing, statistical machine learning, and proactive network optimisation in a common cross-layer wireless framework. The wider impact of this survey includes better cross-fertilising the academic fields of data analytics, mobile edge computing, AI, CPSS, and wireless communications, as well as informing industry of the promising potential in this area.
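The forecasting-plus-uncertainty idea can be illustrated with a toy sketch. Exponential smoothing and a Gaussian safety margin here are simple stand-ins for the statistical machine learning methods the survey covers; the function name and parameters are assumptions.

```python
import math

# Toy sketch of proactive provisioning: forecast next-interval traffic
# demand with exponential smoothing, estimate forecast uncertainty from
# the one-step-ahead residuals, and provision capacity with a safety
# margin. The smoothing model and Gaussian margin are assumptions.

def provision(history, alpha=0.5, z=1.645):
    """Return (point forecast, capacity covering ~95% one-sided)."""
    level = history[0]
    sq_err, n = 0.0, 0
    for x in history[1:]:
        err = x - level          # one-step-ahead residual
        sq_err += err * err
        n += 1
        level += alpha * err     # exponential smoothing update
    std = math.sqrt(sq_err / n) if n else 0.0
    return level, level + z * std
```

Provisioning to the margin rather than the point forecast is what turns a forecast into a decision with a probabilistic performance guarantee.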

    zCap: a zero configuration adaptive paging and mobility management mechanism

    Today, cellular networks rely on fixed collections of cells (tracking areas) for user equipment localisation. Locating users within these areas involves broadcast search (paging), which consumes radio bandwidth but reduces the user equipment signalling required for mobility management. Tracking areas are today manually configured, hard to adapt to local mobility, and influence the load on several key resources in the network. We propose a decentralised and self-adaptive approach to mobility management based on a probabilistic model of local mobility. By estimating the parameters of this model from observations of user mobility collected online, we obtain a dynamic model from which we construct local neighbourhoods of cells where we are most likely to locate user equipment. We propose to replace the static tracking areas of current systems with neighbourhoods local to each cell. The model is also used to derive a multi-phase paging scheme, where the division of neighbourhood cells into consecutive phases balances response times and paging cost. The complete mechanism requires no manual tracking area configuration and performs localisation efficiently in terms of signalling and response times. Detailed simulations show that significant potential gains in localisation efficiency are possible while eliminating manual configuration of mobility management parameters. Variants of the proposal can be implemented within current (LTE) standards.
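The cost/response-time balance behind multi-phase paging can be made concrete with a generic model (this is not zCap's exact scheme; the sorted-probability assumption and statistics computed are illustrative):

```python
# Generic multi-phase paging model (not zCap's exact scheme): cells in
# the neighbourhood are ordered by estimated location probability and
# paged in consecutive phases. Expected cost counts cells paged; the
# expected phase index is a proxy for response time.

def paging_stats(probs, phase_sizes):
    """probs: per-cell location probabilities, sorted descending and
    summing to 1. phase_sizes: number of cells paged in each phase."""
    exp_cost = exp_phase = 0.0
    paged, p_not_found = 0, 1.0
    for k, size in enumerate(phase_sizes, start=1):
        hit = sum(probs[paged:paged + size])
        exp_cost += p_not_found * size   # phase k runs only if not found earlier
        exp_phase += k * hit
        p_not_found -= hit
        paged += size
    return exp_cost, exp_phase
```

For probabilities [0.5, 0.3, 0.2], paging all three cells at once costs 3 cells with 1 expected phase, while a [1, 2] split costs only 2 expected cells at the price of 1.5 expected phases, which is exactly the trade-off the phase division tunes.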

    Resource allocation in mobile edge cloud computing for data-intensive applications

    Rapid advancement in the mobile telecommunications industry has motivated the development of mobile applications in a wide range of social and scientific domains. However, mobile computing (MC) platforms still have several constraints, such as limited computation resources, short battery life and high sensitivity to network capabilities. In order to overcome the limitations of mobile computing and benefit from the huge advancement in mobile telecommunications and the rapid revolution of distributed resources, mobile-aware computing models, such as mobile cloud computing (MCC) and mobile edge computing (MEC), have been proposed. The main problem is to decide on an application execution plan while satisfying quality of service (QoS) requirements and the current status of system networking and device energy. However, the role of application data in offloading optimisation has not been studied thoroughly, particularly with respect to how data size and distribution impact application offloading. This problem can be referred to as data-intensive mobile application offloading optimisation. To address this problem, this thesis presents novel optimisation frameworks, techniques and algorithms for mobile application resource allocation in mobile-aware computing environments. These frameworks and techniques are proposed to provide optimised solutions to schedule data-intensive mobile applications. Experimental results show the ability of the proposed tools to optimise the scheduling and execution of data-intensive applications in various computing environments to meet application QoS requirements. Furthermore, the results clearly demonstrated the significant influence of the data size parameter on scheduling the execution of mobile applications. In addition, the thesis provides an analytical investigation of mobile-aware computing environments for a certain mobile application type.
The investigation provides performance analysis to help users decide on target computation resources based on application structure, input data, and mobile network status.
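A minimal sketch can show how data size enters such an offloading decision. The linear latency model (transfer time plus execution time), the parameter names, and the deadline rule are illustrative assumptions, not the thesis's framework, which also considers energy.

```python
# Minimal data-size-aware offloading decision under a QoS deadline.
# The linear latency model and all parameters are illustrative
# assumptions; real frameworks also model energy and queuing.

def offload_decision(m_instructions, data_mb, local_mips, remote_mips,
                     uplink_mbps, deadline_s):
    local_t = m_instructions / local_mips
    remote_t = data_mb * 8.0 / uplink_mbps + m_instructions / remote_mips
    # Prefer whichever option meets the deadline; break ties locally.
    if local_t <= deadline_s and local_t <= remote_t:
        return "local", local_t
    if remote_t <= deadline_s:
        return "offload", remote_t
    return ("local", local_t) if local_t <= remote_t else ("offload", remote_t)
```

With a fast uplink a compute-heavy task offloads, but scaling its input data up two orders of magnitude makes transfer time dominate and local execution wins, which is the data-size effect the thesis highlights.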

    Integration of a genetic optimisation algorithm in a simulation framework for optimising femtocell networks.

    The developments in mobile communication systems from 1G to 4G have increased demands on the network due to the increased number of devices and the increasing volume of data, and 5G is expected to increase demands significantly further. Therefore, networks need to be more efficient to deliver the expected increase in volume. An energy- and cost-efficient way to cope with such an anticipated increase in the demand for voice and data is the dense deployment of small cells, i.e. femtocells. Femtocells are identified as a crucial means of meeting the increased demands in heterogeneous networks, in which macrocells work in combination with femtocells to provide coverage to offices, homes and enterprises. A survey of the literature is conducted to examine the mechanisms and approaches different authors have used to optimise the network. A major activity in this project was the identification of parameters: the literature was analysed and key performance parameters were identified. Based on these key performance parameters, a simulation framework is used to perform the experiments and to analyse the performance of a two-tier LTE-A system having femtocell overlays. A comprehensive and easy-to-use graphical user interface has been set up with the desired two-tier network topologies. It estimates the throughput and path loss of all the femto and macro users for all the supported bandwidths of an LTE-A system using different modulation schemes. A series of tests are carried out using the described simulation framework for a range of scenarios. The modulation scheme that yields the highest throughput for a femtocell user is identified, and path loss is found to be independent of the modulation scheme but dependent on the user's distance from its base station.
In another series of experiments, the effects that walls inside buildings have on connectivity are examined, and the positioning of the femtocells is changed for each scenario inside buildings to analyse the performance. These results are used to find the optimised location of femtocells in different room layouts of the building. The simulation framework is further developed to be able to optimise the whole femtocell network by finding the optimised positioning of femtocells using a genetic optimisation algorithm. The end user can provide the inputs of the desired network topology to the simulation framework through a graphical user interface. The throughput and path loss of all the femto users are calculated before and after optimisation. The simulation results are generated in the form of tables before and after optimisation for comparison and analysis. The layouts depicting the indoor environment of the building before and after optimisation can be seen and analysed through the graphical user interface developed as part of this simulation framework. Two case studies are defined and described to test the capacity and capability of the developed simulation framework, and to show how it can be used to identify the optimum positions of the femtocells under different configurations of room designs and numbers of users that represent contrasting loads on the network. Any desired network topology can be created and analysed on the basis of throughput and path loss by using this simulation framework to optimise femtocell networks in an indoor environment. The results of the experiments are compared against the claims in other published research.
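A genetic optimisation step of this kind can be sketched generically. The distance-based path-loss fitness, the GA operators, and all parameters below are assumptions for illustration, not the framework's indoor propagation models, and only a single femtocell is placed.

```python
import math
import random

# Toy genetic algorithm for placing a single femtocell: minimise total
# log-distance path loss to a fixed set of indoor users. The fitness
# model and GA parameters are illustrative assumptions.

def fitness(pos, users):
    """Negative total path loss (dB); larger is better."""
    return -sum(37.0 + 20.0 * math.log10(max(1.0, math.hypot(pos[0] - ux,
                                                             pos[1] - uy)))
                for ux, uy in users)

def optimise_position(users, pop_size=20, generations=40, area=20.0, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, area), rng.uniform(0, area)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, users), reverse=True)
        parents = pop[:pop_size // 2]                # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            children.append(((a[0] + b[0]) / 2 + rng.gauss(0, 0.5),   # crossover
                             (a[1] + b[1]) / 2 + rng.gauss(0, 0.5)))  # + mutation
        pop = parents + children
    return max(pop, key=lambda p: fitness(p, users))
```

For a cluster of users the evolved position settles near the cluster, mirroring how the framework searches room layouts for femtocell positions that minimise path loss.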

    State-Of-The-Art and Prospects for Peer-To-Peer Transaction-Based Energy System

    Transaction-based energy (TE) management and control has become an increasingly relevant topic, attracting considerable attention from industry and the research community alike. As a result, new techniques are emerging for its development and actualization. This paper presents a comprehensive review of TE involving peer-to-peer (P2P) energy trading, covering the concept, enabling technologies, frameworks, active research efforts and the prospects of TE. The formulation of a common approach for TE management modelling is challenging given the diversity of circumstances of prosumers in terms of capacity, profiles and objectives. This has resulted in divergent opinions in the literature. The idea of this paper is therefore to explore these viewpoints and provide some perspectives on this burgeoning topic of P2P TE systems. This study identified that most of the techniques in the literature exclusively formulate energy trade problems as a game, an optimization problem or a variational inequality problem. It was also observed that none of the existing works has considered a unified messaging framework. This is a potential area for further investigation.
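One common market-style formulation of the P2P energy trade problem can be illustrated with a simple double-auction clearing sketch. The bid/offer format, midpoint pricing, and all quantities below are assumptions for illustration, not a scheme from the reviewed literature.

```python
# Simple double-auction clearing for P2P energy trading, illustrating
# one common market-style formulation of the trade problem. Midpoint
# pricing and the bid/offer format are illustrative assumptions.

def clear_market(bids, offers):
    """bids/offers: (price_per_kwh, energy_kwh) tuples.
    Returns a list of (clearing_price, quantity) trades."""
    bids = sorted(([p, q] for p, q in bids), key=lambda b: -b[0])
    offers = sorted(([p, q] for p, q in offers), key=lambda o: o[0])
    trades, bi, oi = [], 0, 0
    # Match highest bids with cheapest offers while the bid covers the offer.
    while bi < len(bids) and oi < len(offers) and bids[bi][0] >= offers[oi][0]:
        qty = min(bids[bi][1], offers[oi][1])
        trades.append(((bids[bi][0] + offers[oi][0]) / 2, qty))
        bids[bi][1] -= qty
        offers[oi][1] -= qty
        if bids[bi][1] == 0:
            bi += 1
        if offers[oi][1] == 0:
            oi += 1
    return trades
```

A unified messaging framework, identified above as an open gap, would standardise how such bids and offers are exchanged between prosumers in the first place.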