
    Recent advances in radio resource management for heterogeneous LTE/LTE-A networks

    As heterogeneous networks (HetNets) have emerged as one of the most promising developments toward realizing the target specifications of Long Term Evolution (LTE) and LTE-Advanced (LTE-A) networks, radio resource management (RRM) research for such networks has recently been intensively pursued. Recent research concentrates mainly on interference mitigation; other RRM aspects, such as radio resource utilization, fairness, complexity, and QoS, have received much less attention. In this paper, we provide an overview of the key challenges arising in HetNets and highlight their importance. We then present a comprehensive survey of the RRM schemes studied in recent years for LTE/LTE-A HetNets, with a particular focus on schemes for femtocells and relay nodes. We classify these RRM schemes according to their underlying approaches, analyze them qualitatively, and compare them to each other. We also identify a number of potential directions for future RRM development. Finally, we discuss gaps in current RRM research and the importance of multi-objective RRM studies.

    QoS-aware Adaptive Resource Management in OFDMA Networks

    One important feature of future communication networks is that users are required to experience a guaranteed high quality of service (QoS), owing to the popularity of multimedia applications. This thesis studies QoS-aware radio resource management schemes in different OFDMA network scenarios. Motivated by the fact that QoS provisioning in current 4G networks is severely constrained by the availability of radio resources, especially the scarce spectrum and the unbalanced traffic distribution from cell to cell, a joint antenna and subcarrier management scheme is proposed to maximise user satisfaction with load balancing. An antenna pattern update mechanism is further investigated for moving users. Combining network densification with cloud computing technologies, the cloud radio access network (C-RAN) has been proposed as an emerging 5G network architecture consisting of a baseband unit (BBU) pool, remote radio heads (RRHs), and fronthaul links. With cloud-based information sharing through the BBU pool, a joint resource block and power allocation scheme is proposed to maximise the number of satisfied users whose required QoS is achieved. In this scenario, users are served by high-power nodes only. With spatial reuse of the system bandwidth through network densification, users' QoS provisioning can be ensured, but this introduces energy and operating efficiency issues. Therefore, two network energy optimisation schemes with QoS guarantees are further studied for C-RANs: an energy-effective network deployment scheme is designed for C-RAN based small cells, and a joint RRH selection and user association scheme is investigated in heterogeneous C-RANs. Thorough theoretical analysis is conducted in the development of all proposed algorithms, and their effectiveness is validated via comprehensive simulations. (China Scholarship Council)

    Efficient and Virtualized Scheduling for OFDM-Based High Mobility Wireless Communications Objects

    Service providers (SPs) using the long term evolution (LTE) radio platform are facing many challenges in accommodating the rapid expansion of mobile data usage. Modern technologies present new challenges to SPs, for example, reducing capital and operating expenditures while supporting high data throughput per customer, extending the battery life-per-charge of mobile devices, and supporting high-mobility communications with fast and seamless handover (HO) networking architectures. In this thesis, a variety of optimized techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into three parts. The first part outlines the benefits and challenges of deploying a virtualized resource sharing concept, in which SPs applying different scheduling policies share an evolved Node B, allowing them to customize their services and meet service requirements; this is a promising solution for reducing operational and capital expenditures, leading to potential energy savings and supporting higher peak rates. The second part formulates the optimized power allocation problem in a virtualized scheme for LTE uplink systems, aiming to extend the mobile devices' battery utilization time per charge. The third part develops a proposed hybrid-HO (HY-HO) technique that can enhance system performance in terms of latency and HO reliability at the cell boundary for high-mobility objects (up to 350 km/h, where HO occurs more frequently). The main contributions of this thesis are in designing optimal binary integer programming-based and suboptimal heuristic (complexity-reducing) scheduling algorithms subject to exclusive and contiguous allocation, maximum transmission power, and rate constraints.
Moreover, the designed HY-HO, based on the combination of soft and hard HO, was able to enhance system performance in terms of latency, interruption time, and reliability during HO. The results prove that the proposed solutions effectively contribute to addressing the challenges caused by the demand for high data rates and power transmission in mobile networks, especially in virtualized resource sharing scenarios that can support high data rates while improving quality of service (QoS).
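The exclusive-and-contiguous allocation constraint described above can be illustrated with a toy exhaustive search, a stand-in for the binary integer program (the user parameters, rates, and per-RB powers below are invented for illustration; the thesis's actual formulation differs):

```python
from itertools import product

N_RB = 6  # small resource-block grid so exhaustive search stays tractable
users = {"u1": {"rate_req": 2.0, "eff": 1.0, "p_per_rb": 0.5},
         "u2": {"rate_req": 1.5, "eff": 0.8, "p_per_rb": 0.4}}

def contiguous_chunks(n):
    # all (start, length) contiguous RB blocks, plus the empty allocation
    yield (0, 0)
    for s in range(n):
        for l in range(1, n - s + 1):
            yield (s, l)

def feasible(assign):
    used = set()
    for (s, l) in assign.values():
        rbs = set(range(s, s + l))
        if used & rbs:          # exclusive allocation: chunks must not overlap
            return False
        used |= rbs
    # per-user rate constraints
    return all(l * users[u]["eff"] >= users[u]["rate_req"]
               for u, (s, l) in assign.items())

best, best_power = None, float("inf")
for chunks in product(contiguous_chunks(N_RB), repeat=len(users)):
    assign = dict(zip(users, chunks))
    if feasible(assign):
        power = sum(l * users[u]["p_per_rb"] for u, (s, l) in assign.items())
        if power < best_power:
            best, best_power = assign, power

print(best, best_power)
```

A heuristic would replace the exhaustive loop with a greedy or relaxation-based search, trading optimality for the complexity reduction the thesis targets.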

    A new genetic algorithm based scheduling algorithm for the LTE Uplink

    Long Term Evolution (LTE) has become the de facto technology for 4G networks. It aims to deliver unprecedented data transmission rates and low latency for several types of applications and services. In this context, this doctoral thesis investigates resource allocation in the LTE uplink. Starting from the premise that resource allocation in the uplink is a complex optimization problem, the main contribution of this thesis is a novel scheduling algorithm based on Genetic Algorithms (GA). This algorithm introduces new initialization, crossover, and mutation operations and a QoS-aware fitness function. The algorithm is evaluated in a mixed-traffic environment and its performance is compared with relevant algorithms from the literature. Simulations were carried out in ns-3, and the results show that the proposed algorithm is able to meet the Quality of Service (QoS) requirements of the applications while presenting a satisfactory execution time.
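A minimal sketch of a GA scheduler of this kind, assuming invented per-user demands and spectral efficiencies (the operators and QoS-aware fitness in the thesis itself differ): the chromosome is a set of cut points that splits the resource blocks into contiguous per-user chunks, which satisfies the LTE uplink contiguity constraint by construction.

```python
import random

N_RB, USERS = 12, 4
demand = [3.0, 2.0, 4.0, 1.5]        # per-user rate requirements (illustrative)
rate_per_rb = [1.0, 0.8, 1.2, 0.5]   # per-user spectral efficiency (illustrative)

def random_chromosome():
    # USERS-1 distinct cut points -> USERS contiguous, non-empty RB chunks
    return sorted(random.sample(range(1, N_RB), USERS - 1))

def fitness(cuts):
    bounds = [0] + cuts + [N_RB]
    score = 0.0
    for u in range(USERS):
        rate = (bounds[u + 1] - bounds[u]) * rate_per_rb[u]
        score += min(rate, demand[u]) / demand[u]   # QoS satisfaction in [0, 1]
    return score

def crossover(a, b):
    child = sorted(random.choice(pair) for pair in zip(a, b))
    if len(set(child)) < len(child):    # repair duplicate cut points
        child = random_chromosome()
    return child

def mutate(cuts, p=0.3):
    return random_chromosome() if random.random() < p else cuts

def evolve(generations=100, pop_size=20):
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]            # keep the fittest half
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

A fitness of `USERS` would mean every user's QoS requirement is met; the GA searches for the cut placement that gets closest under the limited RB budget.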

    Performances of LTE networks

    Driven by the growing demand for high-speed broadband wireless services, Long Term Evolution (LTE) has emerged as a promising solution for mobile communications, and its deployment is under way in several countries around the world. LTE offers an all-IP architecture that provides high data rates and efficient support for multimedia applications. Specified by the 3GPP, the technology provides built-in mechanisms to handle heterogeneous traffic classes such as voice, video, file transfers, and e-mail. Supporting heterogeneous service classes means the traffic is highly diverse and has distinct QoS parameters, while channel and environmental conditions may vary dramatically on a short time scale. The 3GPP standards leave the resource allocation algorithms of the access network unspecified, although they are crucial to guaranteeing performance and quality of service (QoS). In this thesis, we focus on QoS on the LTE downlink, addressing resource management and scheduling on the radio interface of the access network. After surveying, classifying, and comparing existing scheduling mechanisms, we propose three QoS mechanisms for resource allocation in macrocell scenarios, focused on real-time services, and two mechanisms for interference mitigation in femtocell scenarios. Our first macrocell mechanism combines a virtual token (token bucket) method with opportunistic schedulers; its performance is very good, but it does not ensure a high level of fairness. Our second scheme relies on game theory, specifically the Shapley value, to achieve a higher fairness level among classes of services, at the expense of some QoS. This led us, in a third mechanism, to combine the two schemes.
The second part of the thesis is devoted to femtocells, which provide valuable coverage extensions; the difficulty lies in studying and minimizing interference. Our first interference mitigation mechanism is based on transmit power control and uses non-cooperative game theory, performing a constant bargain between throughput and interference to find the optimal transmit power level. The second mechanism is centralized and uses a bandwidth-division approach that forces femtocells not to use the same subbands, thereby avoiding interference. The bandwidth division and assignment are performed using game theory (the Shapley value) and take the application type into account; this scheme reduces interference considerably and improves on other bandwidth-division schemes. All proposed mechanisms were tested and evaluated in a simulation environment using the LTE-Sim tool, to whose development we contributed. Metrics such as throughput, packet loss ratio, delay, fairness index, and SINR are used to evaluate the efficiency of our schemes.
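The Shapley value underlying the fairness scheme described above can be computed exactly for a small number of service classes by averaging each class's marginal contribution over all orderings. A minimal sketch, with an invented characteristic function (the thesis's actual coalition utility differs):

```python
from itertools import permutations

def shapley(players, v):
    """Exact Shapley value: average each player's marginal contribution
    v(S + p) - v(S) over all join orders (fine for a handful of classes)."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition = coalition | {p}
    return {p: phi[p] / len(perms) for p in phi}

# illustrative characteristic function: each service class brings a
# standalone bandwidth demand, and coalitions earn a small multiplexing bonus
demand = {"voice": 2.0, "video": 5.0, "data": 3.0}

def v(S):
    base = sum(demand[p] for p in S)
    return base * (1 + 0.1 * max(len(S) - 1, 0))

shares = shapley(list(demand), v)
print(shares)
```

By the efficiency property, the shares sum exactly to the grand coalition's value, which is what makes the Shapley value attractive for dividing a fixed bandwidth budget among service classes.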

    Resource allocation technique for powerline network using a modified shuffled frog-leaping algorithm

    Resource allocation (RA) techniques should be efficient and optimized in order to enhance the QoS (power and bit loading, capacity, scalability) of high-speed networked data applications, and this research attempts to push RA efficiency toward near-optimal performance. The RA problem involves efficiently assigning subcarriers, power, and bit amounts to each user. Studies conducted by the Federal Communications Commission have shown that conventional RA approaches are becoming insufficient for the rapid growth in networking demand, resulting in spectrum underutilization, low capacity, and slow convergence; poor bit-error-rate performance, delayed channel feedback, weak scalability, and computational complexity also make real-time solutions intractable. This is mainly due to sophisticated, restrictive constraints, multiple objectives, unfairness, channel noise, and the unrealistic assumption that perfect channel state information is available. The main goal of this work is to develop a conceptual framework and mathematical model for resource allocation using the Shuffled Frog-Leaping Algorithm (SFLA). A modified SFLA is introduced and integrated into an Orthogonal Frequency Division Multiplexing (OFDM) system: SFLA generates a random population of (power, bit) solutions, and the fitness of each solution is calculated and improved for each subcarrier and user. The solution is numerically validated and verified by simulation over a powerline channel. System performance was compared with similar work in terms of capacity, scalability, allocated rate/power, and convergence. The allocated resources are consistently optimized, and the capacity obtained is consistently higher than that of root-finding, linear, and hybrid evolutionary algorithms. The proposed algorithm also offers the fastest convergence, requiring 75 iterations to reach within 0.001% of the global optimum, compared with 92 for conventional techniques.
Finally, joint allocation models for selecting optimal resource values are introduced: adaptive power and bit allocators for an OFDM-based powerline system using modified SFLA-based TLBO and PSO are proposed.
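The shuffled frog-leaping idea can be sketched for a toy power allocation problem: frogs (candidate power vectors) are dealt into memeplexes, the worst frog in each memeplex leaps toward the local best, and the population is re-shuffled each round. Channel gains and problem sizes below are invented; the modified SFLA in the work above adds bit allocation and other operators.

```python
import math
import random

N_SUB, P_TOTAL = 8, 8.0
gain = [1.2, 0.4, 2.0, 0.9, 1.5, 0.3, 1.1, 0.7]   # illustrative channel gains

def rand_frog():
    w = [random.random() for _ in range(N_SUB)]
    s = sum(w)
    return [P_TOTAL * x / s for x in w]            # respects the power budget

def fitness(p):
    # Shannon-style sum capacity, noise power normalized to 1
    return sum(math.log2(1 + g * pw) for g, pw in zip(gain, p))

def leap(worst, best):
    # move the worst frog a random fraction toward the memeplex best
    new = [w + random.random() * (b - w) for b, w in zip(best, worst)]
    new = [max(x, 0.0) for x in new]
    tot = sum(new) or 1.0
    return [P_TOTAL * x / tot for x in new]        # re-normalize to budget

def sfla(frogs=20, memeplexes=4, iters=50):
    pop = [rand_frog() for _ in range(frogs)]
    for _ in range(iters):
        pop.sort(key=fitness, reverse=True)        # shuffle step: re-rank all frogs
        for m in range(memeplexes):
            plex = pop[m::memeplexes]              # deal frogs into memeplexes
            best, worst = plex[0], plex[-1]
            cand = leap(worst, best)
            if fitness(cand) > fitness(worst):     # accept only improving leaps
                pop[pop.index(worst)] = cand
    return max(pop, key=fitness)

p = sfla()
print(p, fitness(p))
```

The re-normalization after each leap keeps every candidate on the total-power constraint surface, which is the simplest way to keep the search feasible without a penalty term.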

    Fair resource allocation with interference mitigation and resource reuse for LTE/LTE-A femtocell networks

    Joint consideration of interference, resource utilization, fairness, and complexity issues is generally lacking in existing resource allocation schemes for Long-Term Evolution (LTE)/LTE-Advanced femtocell networks. To tackle this, we employ a hybrid spectrum allocation approach whereby the spectrum is split between the macrocell and its nearby interfering femtocells based on their resource demands, whereas the distant femtocells share the entire spectrum. A multiobjective problem is formulated for resource allocation between femtocells and is decomposed using a lexicographic optimization approach into two subproblems. A greedy algorithm of reasonably low complexity is proposed to solve these subproblems sequentially. Simulation results show that the proposed scheme achieves substantial throughput and packet loss improvements in low-density femtocell deployment scenarios while performing satisfactorily in high-density femtocell deployment scenarios with substantial complexity and overhead reduction. The proposed scheme also performs nearly as well as the optimal solution obtained by exhaustive search.
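The demand-based split and the sequential greedy idea can be sketched as follows (a simplified stand-in: the actual scheme's lexicographic subproblems and demand model are more elaborate, and all numbers here are invented):

```python
def split_spectrum(total_rbs, macro_demand, near_femto_demand):
    """Demand-proportional split between the macrocell and its nearby
    interfering femtocells; distant femtocells would reuse all RBs."""
    total = macro_demand + near_femto_demand
    macro_share = round(total_rbs * macro_demand / total)
    return macro_share, total_rbs - macro_share

def greedy_allocate(rbs, demands):
    """Stage 1: greedily serve the largest unmet demand (throughput);
    stage 2: hand leftover RBs to the least-served user (fairness)."""
    alloc = {u: 0 for u in demands}
    left = dict(demands)
    for _ in range(rbs):
        u = max(left, key=left.get)          # largest remaining demand first
        if left[u] > 0:
            alloc[u] += 1
            left[u] -= 1
        else:
            # all demands met: spread surplus RBs to the least-served user
            u = min(alloc, key=alloc.get)
            alloc[u] += 1
    return alloc

macro_rbs, femto_rbs = split_spectrum(50, 30.0, 20.0)
print(macro_rbs, femto_rbs)
print(greedy_allocate(10, {"u1": 4, "u2": 3, "u3": 1}))
```

Solving the throughput stage first and the fairness stage only on the surplus mirrors the lexicographic decomposition: the secondary objective never degrades the primary one.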

    Efficient Scheduling Algorithms for Wireless Resource Allocation and Virtualization in Wireless Networks

    The continuing growth in demand for better mobile broadband experiences has motivated rapid development of radio access technologies that support high data rates and improve quality of service (QoS) and quality of experience (QoE) for mobile users. However, modern radio access technologies pose new challenges to mobile network operators (MNOs) and wireless device designers, such as reducing the total cost of ownership while supporting high data throughput per user, and extending the battery life-per-charge of mobile devices. In this thesis, a variety of optimization techniques aimed at providing innovative solutions for such challenges are explored. The thesis is divided into two parts. In the first part, the challenge of extending battery life-per-charge is addressed: optimal and suboptimal power-efficient schedulers that minimize the total transmit power while meeting the QoS requirements of the users are presented. The second part outlines the benefits and challenges of deploying the wireless resource virtualization (WRV) concept as a promising solution for satisfying the growing demand for mobile data and reducing capital and operational costs. First, a WRV framework is proposed for a single cell zone that centralizes and shares the spectrum resources among multiple MNOs. Subsequently, several WRV frameworks are proposed that virtualize the spectrum resources of the entire network for the cloud radio access network (C-RAN), one of the front runners for the next-generation network architecture. The main contributions of this thesis are in designing optimal and suboptimal solutions for the aforementioned challenges. In most cases, the optimal solutions suffer from high complexity, so low-complexity suboptimal solutions are provided for practical systems, with the optimal solutions used as benchmarks for evaluating them.
The results prove that the proposed solutions effectively contribute to addressing the challenges caused by the demand for high data rates and power transmission in mobile networks.