44 research outputs found

    Optimization models for resource management in two-tier cellular networks

    The macro-femtocell network is the most promising two-tier architecture for cellular network operators because it can improve current network capacity without additional costs. Nevertheless, the incorporation of femtocells into existing cellular networks needs to be finely tuned in order to enhance the usage of the limited wireless resources, because the femtocells operate in the same spectrum as the macrocell. In this thesis, we address the resource optimization problem for OFDMA two-tier networks in scenarios where femtocells are deployed using a hybrid access policy. The hybrid access policy is a technique that can provide different levels of service to authorized users and to visitors to the femtocell. This method reduces the interference received by femtocell subscribers by granting access to nearby public users. Such approaches must find a compromise between the level of access granted to public users and the impact on subscriber satisfaction, an impact that should be mitigated either in terms of performance or through economic compensation. In this work, two specific issues of an OFDMA two-tier cellular network are addressed. The first is the trade-off between macrocell resource usage efficiency and the fairness of the resource distribution among macro mobile users and femtocells. The second is the compromise between interference mitigation and granting access to public users without depriving the subscriber of downlink transmissions. We tackle these issues by developing several resource allocation models for non-dense and dense femtocell deployments using Linear Programming and an evolutionary optimization method. In addition, the proposed resource allocation models determine the most suitable serving base station together with the bandwidth and transmit power per user in order to enhance the overall network capacity. The first two parts of this work deal with resource optimization for non-dense deployment using orthogonal and co-channel allocation. Both parts aim at maximizing the sum of the weighted user data rates. In the first part, several sets of weights are introduced to prioritize the use of femtocells for subscribers and for public users close to femtocells. In addition, macrocell power control is incorporated to enhance the power distribution among the active downlink transmissions and to improve the tolerance to environmental noise. The second part enables spectral reuse; its power adaptation is a threefold solution that enhances the power distribution over the active downlink transmissions, improves the tolerance to environmental noise while respecting a given interference threshold, and achieves the target Quality of Service (QoS). To reduce the complexity of the resource optimization problem for dense deployment, the third part of this work divides the optimization problem into subproblems. The main idea is to divide the user and femtocell (FC) sets into disjoint sets according to their locations, so that the optimization problem can be solved independently in each OFDMA zone. This solution allows subcarrier reuse between inner macrocell zones and femtocells located in outer macrocell zones, and also between femtocells belonging to different clusters if they are located in the same zone. Macrocell power control is performed to avoid cross-tier interference in macrocell inner zones and inside femtocells located in outer zones. Another well-known method used to reduce the complexity of the resource optimization problem is femtocell clustering.
    However, finding the optimal cluster configuration together with the resource allocation is a complex optimization problem because of the number of variables related to the possible cluster configurations. Therefore, the fourth part of this work presents a heuristic cluster-based resource allocation model and an incentive scheme for femtocell clustering based on the allocation of extra resources for subscriber and “visitor user” transmissions. The cluster-based resource allocation model maximizes the network throughput while keeping the clusters balanced and minimizing the inter-cluster interference. Finally, the proposed solutions are evaluated through extensive numerical simulations, and the numerical results are presented to provide a comparison with related works found in the literature.
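
    As a concrete illustration of the Linear Programming formulations used throughout the thesis, the sketch below allocates bandwidth to maximize a weighted sum of user data rates under per-station bandwidth budgets and a per-user QoS floor. The station assignment, spectral efficiencies, weights, and budgets are invented for illustration; the actual models also optimize the serving base station and transmit power, which are omitted here.

```python
# Minimal sketch of weighted sum-rate bandwidth allocation as a Linear
# Program. All numbers (spectral efficiencies, weights, QoS targets,
# bandwidth budgets) are illustrative assumptions, not thesis values.
import numpy as np
from scipy.optimize import linprog

# Users 0-2 served by the macrocell, users 3-4 by a femtocell (fixed here;
# the thesis also optimizes the serving-station choice, which we skip).
eff = np.array([1.0, 1.5, 0.8, 3.0, 2.5])   # bit/s/Hz per user (assumed)
w   = np.array([1.0, 1.0, 1.0, 2.0, 1.5])   # weights favoring femtocell users
serving = [[0, 1, 2], [3, 4]]               # station -> its users
budget  = [10e6, 5e6]                       # Hz available per station
r_min   = 1e6                               # per-user QoS floor in bit/s

# Variables: bandwidth b_u per user. Rate is linear in b_u: r_u = eff_u * b_u.
c = -(w * eff)                               # linprog minimizes, so negate
A_ub = np.zeros((len(serving), len(eff)))
for s, users in enumerate(serving):
    A_ub[s, users] = 1.0                     # per-station bandwidth cap
bounds = [(r_min / e, None) for e in eff]    # b_u >= r_min/eff_u meets the QoS floor

res = linprog(c, A_ub=A_ub, b_ub=budget, bounds=bounds)
print("bandwidth per user [Hz]:", res.x)
print("weighted sum rate [bit/s]:", -res.fun)
```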

    A survey of machine learning techniques applied to self organizing cellular networks

    In this paper, a survey of the literature of the past fifteen years involving Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome current limitations and address the issues of today's cellular systems, it is clear that more intelligence needs to be deployed, so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self Organizing Networks (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks, but also a classification of each paper in terms of its learning solution, with examples. The authors also classify each paper in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work discusses future research directions and the new paradigms that more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.

    Review on Radio Resource Allocation Optimization in LTE/LTE-Advanced using Game Theory

    Recently, there has been a growing trend toward applying game theory (GT) to various engineering fields in order to solve optimization problems with different competing entities/contributors/players. Research in the fourth-generation (4G) wireless network field has also exploited this theory to overcome Long Term Evolution (LTE) challenges such as resource allocation, which is one of the most important research topics. In fact, an efficient design of resource allocation schemes is the key to higher performance. However, the standard does not specify the optimization approach for radio resource management, so it has been left open for study. This paper presents a survey of existing game-theory-based solutions for the 4G-LTE radio resource allocation problem and its optimization.
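
    One of the simplest game-theoretic formulations that recurs in this literature is non-cooperative power control, where each link best-responds by scaling its transmit power toward a target SINR, and the fixed point of the best responses is a Nash equilibrium. The sketch below illustrates the idea; the channel gains, noise level, and SINR targets are assumed values, not taken from any surveyed paper.

```python
# Minimal sketch of a non-cooperative power-control game: each link is a
# player whose best response scales its power to hit a target SINR.
# Gains, noise, and targets below are illustrative assumptions.
import numpy as np

G = np.array([[1.0, 0.1, 0.2],     # G[i, j]: gain from transmitter j
              [0.2, 1.0, 0.1],     # to receiver i (assumed values)
              [0.1, 0.3, 1.0]])
noise = 1e-3
target_sinr = np.array([2.0, 2.0, 2.0])     # per-link SINR targets
p = np.full(3, 0.1)                         # initial transmit powers

for _ in range(100):                        # best-response iteration
    interference = G @ p - np.diag(G) * p + noise
    sinr = np.diag(G) * p / interference
    p = np.clip(target_sinr / sinr * p, 0.0, 1.0)  # best response, power-capped

print("equilibrium powers:", p)
print("achieved SINRs:", np.diag(G) * p / (G @ p - np.diag(G) * p + noise))
```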

    Performance evaluation of the dynamic trajectory design for an unmanned aerial base station in a single frequency network

    Using an Unmanned Aerial Base Station (UABS), i.e., a base station carried by a UAV (Unmanned Aerial Vehicle) or drone, is a promising approach to offer coverage and capacity to users that are not served by the base stations of the terrestrial network. In this paper, we propose an approach to the design of the drone's trajectory that accounts for quickly varying user traffic and user distribution patterns. This approach is based on the identification of clusters of nearby users to be served. The decision on which cluster the UABS visits next depends on a cost function considering the distance to the next cluster, the user density and spread in the cluster, and the direction compared to the previously visited cluster. Furthermore, we propose a radio resource assignment algorithm to minimize the interference from the UABS to the terrestrial network when both operate in the same frequency band. The potential improvements in terms of network capacity (sum throughput) and user satisfaction are estimated in this study.
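
    A minimal sketch of the kind of cluster-selection cost function described above is given below. The weighting coefficients and the exact functional form are assumptions for illustration and may differ from the paper's actual design.

```python
# Sketch of a next-cluster cost function combining distance, user density,
# user spread, and turn angle relative to the previous travel direction.
# All weights and the functional form are illustrative assumptions.
import numpy as np

def cluster_cost(drone_pos, cluster, prev_heading,
                 w_dist=1.0, w_density=2.0, w_spread=0.5, w_turn=0.3):
    """Lower cost = more attractive next cluster for the UABS."""
    centroid = cluster.mean(axis=0)              # cluster: (n_users, 2) positions
    to_cluster = centroid - drone_pos
    dist = np.linalg.norm(to_cluster)
    density = len(cluster)                       # users waiting in the cluster
    spread = cluster.std(axis=0).mean()          # how dispersed the users are
    heading = to_cluster / (dist + 1e-9)
    turn = 1.0 - heading @ prev_heading          # 0 if same direction, 2 if reversed
    return w_dist * dist - w_density * density + w_spread * spread + w_turn * turn

# Pick the next cluster greedily among candidates.
rng = np.random.default_rng(0)
clusters = [rng.normal(loc=c, scale=s, size=(n, 2))
            for c, s, n in [((100, 0), 10, 8), ((50, 80), 25, 5), ((-60, 40), 5, 12)]]
drone, heading = np.array([0.0, 0.0]), np.array([1.0, 0.0])
best = min(range(len(clusters)),
           key=lambda k: cluster_cost(drone, clusters[k], heading))
print("next cluster to visit:", best)
```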

    Drone Base Station Trajectory Management for Optimal Scheduling in LTE-Based Sparse Delay-Sensitive M2M Networks

    Providing connectivity in areas out of reach of the cellular infrastructure is a very active area of research. This connectivity is particularly needed when machine-type communication devices (MTCDs) are deployed for critical purposes such as homeland security. In such applications, MTCDs are deployed in areas that are hard to reach using regular communications infrastructure, while the collected data is time-critical. Drone-supported communications constitute a new trend in complementing the reach of the terrestrial communication infrastructure. In this study, drones are used as base stations to provide real-time communication services that gather critical data from a group of MTCDs sparsely deployed in a marine environment. In the first phase of this research, different communication technologies, such as LTE, WiFi, LPWAN, and Free-Space Optical Communication (FSOC), were studied in combination with drone communications to identify the best candidate for addressing this need. We determined cellular technology, particularly LTE, to be the most suitable candidate to support such applications. In this case, an LTE base station is mounted on the drone and communicates with the different MTCDs to transmit their data to the network backhaul. We then formulate the problem model mathematically and devise a trajectory planning and scheduling algorithm that decides the drone path and the resulting schedule. Based on this formulation, we compare an Ant Colony Optimization (ACO) based technique that optimizes the drone movement among the sparsely deployed MTCDs with a Genetic Algorithm (GA) based solution that achieves the same purpose. The optimization minimizes the energy cost of the drone movement while minimizing missed data transmission deadlines. We present the results of several simulation experiments that validate the different performance aspects of the technique.
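
    To illustrate the GA side of the comparison, the sketch below evolves the order in which the drone visits the MTCDs so that travel distance, a proxy for movement energy, is minimized. The deadline-miss term, the ACO counterpart, and all parameter values are simplified assumptions.

```python
# GA sketch: evolve a drone visiting order over sparsely deployed MTCDs to
# minimize path length (proxy for movement energy). Positions, population
# size, and rates are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
mtcds = rng.uniform(0, 1000, size=(10, 2))        # MTCD positions in metres (assumed)

def tour_length(order):
    pts = mtcds[order]
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

def crossover(a, b):
    """Order crossover (OX): keep a slice of parent a, fill the rest from b."""
    i, j = sorted(rng.choice(len(a), size=2, replace=False))
    child = [-1] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] == -1:
            child[k] = fill.pop(0)
    return np.array(child)

pop = [rng.permutation(len(mtcds)) for _ in range(50)]
for _ in range(200):                               # generations
    pop.sort(key=tour_length)
    elite = pop[:10]                               # keep the best tours
    children = []
    while len(children) < 40:
        a, b = rng.choice(10, size=2, replace=False)
        child = crossover(elite[a], elite[b])
        if rng.random() < 0.2:                     # swap mutation
            i, j = rng.choice(len(child), size=2, replace=False)
            child[i], child[j] = child[j], child[i]
        children.append(child)
    pop = elite + children

pop.sort(key=tour_length)
print("best tour:", pop[0], "length [m]:", round(tour_length(pop[0]), 1))
```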

    A Survey and Future Directions on Clustering: From WSNs to IoT and Modern Networking Paradigms

    Many Internet of Things (IoT) networks are created as an overlay over traditional ad-hoc networks such as Zigbee. Moreover, IoT networks can resemble ad-hoc networks over networks that support device-to-device (D2D) communication, e.g., D2D-enabled cellular networks and WiFi-Direct. In these ad-hoc types of IoT networks, efficient topology management is a crucial requirement, particularly in massive-scale deployments. Traditionally, clustering has been recognized as a common approach for topology management in ad-hoc networks, e.g., in Wireless Sensor Networks (WSNs). Topology management in WSNs and ad-hoc IoT networks has many design commonalities, as both need to transfer data to the destination hop by hop. Thus, WSN clustering techniques can presumably be applied to topology management in ad-hoc IoT networks. This requires a comprehensive study of WSN clustering techniques and an investigation of their applicability to ad-hoc IoT networks. In this article, we conduct a survey of this field based on the objectives of clustering, such as reducing energy consumption and load balancing, as well as the network properties relevant for efficient clustering in IoT, such as network heterogeneity and mobility. Beyond that, we investigate the advantages and challenges of clustering when IoT is integrated with modern computing and communication technologies such as Blockchain, Fog/Edge computing, and 5G. This survey provides useful insights into research on IoT clustering, allows a broader understanding of its design challenges for IoT networks, and sheds light on its future applications in modern technologies integrated with IoT.
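
    As a concrete example of the classic WSN clustering that much of the surveyed work builds on, the sketch below implements LEACH-style probabilistic cluster-head election with nearest-head membership. The parameter values are illustrative assumptions, and this is not a specific algorithm from the survey.

```python
# LEACH-style cluster-head election sketch: each node becomes a head with
# the rotating threshold probability, then every node joins its nearest
# head. Node count, head fraction, and field size are assumed.
import numpy as np

rng = np.random.default_rng(2)
n_nodes, p = 100, 0.05            # p: desired fraction of cluster heads per round
positions = rng.uniform(0, 100, size=(n_nodes, 2))
last_head_round = np.full(n_nodes, -10**9)   # round when node last served as head

def elect_heads(rnd):
    """LEACH threshold election: nodes that served as head within the last
    1/p rounds are ineligible; the rest volunteer with probability T(rnd)."""
    eligible = (rnd - last_head_round) >= int(1 / p)
    threshold = p / (1 - p * (rnd % int(1 / p)))
    heads = eligible & (rng.random(n_nodes) < threshold)
    last_head_round[heads] = rnd
    return np.flatnonzero(heads)

for rnd in range(3):
    heads = elect_heads(rnd)
    if len(heads):
        # Non-head nodes join the nearest cluster head (single-hop clustering).
        d = np.linalg.norm(positions[:, None, :] - positions[heads][None, :, :], axis=2)
        membership = heads[d.argmin(axis=1)]
        sizes = [int((membership == h).sum()) for h in heads]
        print(f"round {rnd}: heads={heads.tolist()}, cluster sizes={sizes}")
    else:
        print(f"round {rnd}: no cluster heads elected this round")
```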

    PB-NTP-04
