Particle Swarm Optimization for Mobility Load Balancing SON in LTE Networks
This paper presents a self-optimizing solution for Mobility Load Balancing (MLB). The MLB SON is performed in two phases. In the first, an MLB controller is designed using Multi-Objective Particle Swarm Optimization (MO-PSO), which incorporates a priori expert knowledge to considerably reduce the search space and the optimization time. The dynamic behaviour of the optimization phase is also addressed. In the second phase, the controller is pushed into the base stations to implement the MLB SON. The method is applied to dynamically adapt the Handover Margin parameters of a large-scale LTE network in order to balance traffic across the network eNodeBs. Numerical results illustrate the benefits of the proposed solution.
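The margin-tuning idea in this abstract can be sketched with a basic single-objective PSO (a simplification of the paper's MO-PSO): a hypothetical load model maps per-cell handover margins to traffic shed towards a neighbour cell, and the swarm minimises the resulting load variance. The load model, the 5%-per-dB shedding rate, and all parameter values are illustrative assumptions, not the paper's formulation.

```python
import random

def pso_minimize(cost, dim, bounds, n_particles=20, iters=60,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `cost` over a box-constrained search space with basic PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best position
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

def load_imbalance(margins, offered=(0.9, 0.3, 0.6)):
    """Hypothetical model: each dB of handover margin sheds 5% of a cell's
    offered load to its neighbour; the cost is the variance of cell loads."""
    loads = list(offered)
    n = len(loads)
    for i in range(n):
        moved = min(loads[i], 0.05 * margins[i])
        loads[i] -= moved
        loads[(i + 1) % n] += moved
    mean = sum(loads) / n
    return sum((l - mean) ** 2 for l in loads) / n

best_margins, best_cost = pso_minimize(load_imbalance, dim=3, bounds=(0.0, 10.0))
```

With the fixed seed, the swarm drives the toy load variance below the untuned baseline, illustrating how margin tuning re-balances traffic among neighbouring cells.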
Optimization of Mobility Parameters using Fuzzy Logic and Reinforcement Learning in Self-Organizing Networks
In this thesis, several optimization techniques for next-generation wireless networks are proposed to solve different problems in the field of Self-Organizing Networks and heterogeneous networks. The common basis of these problems is that network parameters are automatically tuned to deal with the specific problem. As the set of network parameters is extremely large, this work focuses mainly on the parameters involved in mobility management. The proposed self-tuning schemes are based on Fuzzy Logic Controllers (FLCs), whose potential lies in their capability to express knowledge in a way similar to human perception and reasoning. In those cases in which a mathematical approach has been required to optimize the behavior of the FLC, the selected solution has been Reinforcement Learning, since this methodology is especially appropriate for learning from interaction, which is essential in complex systems such as wireless networks.
Taking this into account, firstly, a new Mobility Load Balancing (MLB) scheme is proposed to solve persistent congestion problems in next-generation wireless networks, in particular those due to an uneven spatial traffic distribution, which typically leads to an inefficient use of resources. A key feature of the proposed algorithm is that not only the parameters but also the parameter-tuning strategy is optimized. Secondly, a novel MLB algorithm for enterprise femtocell scenarios is proposed. Such scenarios are characterized by the lack of thorough planning in the deployment of these low-cost nodes, meaning that a more efficient use of radio resources can be achieved by applying effective MLB schemes. As in the previous problem, the optimization of the self-tuning process is also studied in this case. Thirdly, a new self-tuning algorithm for Mobility Robustness Optimization (MRO) is proposed. This study includes the impact of context factors such as system load and user speed, as well as a proposal for coordination between the designed MLB and MRO functions. Fourthly, a novel self-tuning algorithm for Traffic Steering (TS) in heterogeneous networks is proposed. The main features of the proposed algorithm are its flexibility to support different operator policies and its capability to adapt to network variations. Finally, with the aim of validating the proposed techniques, a dynamic system-level simulator for Long-Term Evolution (LTE) networks has been designed.
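The FLC-based self-tuning described above can be illustrated with a minimal zero-order Sugeno controller that maps a normalised load imbalance to a handover-margin step. The membership breakpoints and rule consequents below are invented for illustration; in a scheme like the thesis's they would be refined, e.g. by Reinforcement Learning.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flc_margin_step(load_diff):
    """Zero-order Sugeno FLC: normalised load imbalance in [-1, 1] mapped to a
    handover-margin step in dB. Rules and consequents are illustrative only."""
    rules = [
        (tri(load_diff, -1.5, -1.0, 0.0), -2.0),  # underloaded -> lower margin
        (tri(load_diff, -1.0,  0.0, 1.0),  0.0),  # balanced    -> no change
        (tri(load_diff,  0.0,  1.0, 1.5),  2.0),  # overloaded  -> raise margin
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

For example, a perfectly balanced cell (`load_diff = 0`) yields no change, while a half-overloaded cell (`load_diff = 0.5`) blends the "balanced" and "overloaded" rules into a +1 dB step, which is the smooth, human-readable behaviour that motivates fuzzy control here.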
A survey of machine learning techniques applied to self organizing cellular networks
In this paper, a survey of the literature of the past fifteen years on Machine Learning (ML) algorithms applied to self-organizing cellular networks is performed. For future networks to overcome the current limitations and address the issues of current cellular systems, it is clear that more intelligence needs to be deployed so that a fully autonomous and flexible network can be enabled. This paper focuses on the learning perspective of Self-Organizing Network (SON) solutions and provides not only an overview of the most common ML techniques encountered in cellular networks but also a classification of each surveyed work in terms of its learning solution, together with examples. The authors also classify each work in terms of its self-organizing use case and discuss how each proposed solution performed. In addition, the most commonly found ML algorithms are compared in terms of certain SON metrics, and general guidelines on when to choose each ML algorithm for each SON function are proposed. Lastly, this work provides future research directions and new paradigms that the use of more robust and intelligent algorithms, together with data gathered by operators, can bring to the cellular networks domain, fully enabling the concept of SON in the near future.
QoS-Aware dynamic RRH allocation in a Self-Optimised cloud radio access network with RRH proximity constraint
An inefficient utilisation of network resources in a time-varying traffic environment often leads to load imbalances, high call-blocking events and degraded Quality of Service (QoS). This paper optimises the QoS of a Cloud Radio Access Network (C-RAN) by investigating load-balancing solutions. The dynamic re-mapping ability of C-RAN is exploited to configure the Remote Radio Heads (RRHs) to proper Base Band Unit (BBU) sectors in a time-varying traffic environment. The RRH-sector configuration redistributes the network capacity over a given geographical area. A Self-Optimised Cloud Radio Access Network (SOCRAN) is considered to enhance the network QoS by balancing the traffic load with the minimum possible number of handovers in the network. QoS is formulated as an optimisation problem, defined as a weighted combination of new key performance indicators (KPIs) for the number of blocked users and handovers in the network, subject to an RRH sectorisation constraint. A Genetic Algorithm (GA) and Discrete Particle Swarm Optimisation (DPSO) are proposed as evolutionary algorithms to solve the optimisation problem. Computational results based on three benchmark problems demonstrate that GA and DPSO deliver optimum performance for small networks and near-optimum performance for large networks. The results of both GA and DPSO are compared to Exhaustive Search (ES) and K-means clustering algorithms. The percentage of blocked users in a medium-sized network scenario is reduced from 10.523% to 0.421% and 0.409% by GA and DPSO, respectively; in a large network scenario, the blocked users are reduced from 5.394% to 0.611% and 0.56%. DPSO outperforms GA in execution time, convergence and complexity, achieving higher levels of QoS with fewer iterations to minimise both handovers and blocked users. Furthermore, a trade-off between two critical parameters of the SOCRAN algorithm is presented to achieve performance benefits based on the type of hardware utilised for the C-RAN.
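The weighted-KPI objective described in this abstract can be sketched as a simple fitness function over an RRH-to-sector assignment: blocked users are modelled as load exceeding sector capacity, and handovers as RRHs re-mapped relative to the previous assignment. The weights, capacities and normalisation are illustrative assumptions, not SOCRAN's exact formulation; a GA or DPSO would minimise this value over candidate assignments.

```python
def fitness(assignment, prev_assignment, demand, capacity, alpha=0.7, beta=0.3):
    """Weighted KPI cost for an RRH-to-BBU-sector mapping (illustrative weights).
    assignment[i] = sector serving RRH i; demand[i] = users under RRH i;
    capacity[s] = users that sector s can carry. Lower is better."""
    load = {}
    for rrh, sector in enumerate(assignment):
        load[sector] = load.get(sector, 0) + demand[rrh]
    # Users beyond a sector's capacity are counted as blocked.
    blocked = sum(max(0, l - capacity[s]) for s, l in load.items())
    # Every RRH moved to a different sector triggers handovers.
    handovers = sum(a != b for a, b in zip(assignment, prev_assignment))
    return alpha * blocked / sum(demand) + beta * handovers / len(assignment)
```

A balanced mapping that stays within capacity and moves no RRHs scores 0; overloading a sector or re-mapping RRHs raises the cost in proportion to the two weights, which is the blocked-users/handovers trade-off the abstract describes.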
Intelligent Advancements in Location Management and C-RAN Power-Aware Resource Allocation
The evolution of cellular networks within the last decade has continued to focus on delivering a robust and reliable means to cope with the increasing number of users and the demanded capacity. Recent advancements of cellular networks such as Long-Term Evolution (LTE) and LTE-Advanced offer remarkably high-bandwidth connectivity to users. Signalling overhead is one of the vital issues that impact cellular network behaviour, causing a significant load in the core network and hence affecting the network's reliability. Moreover, the signalling overhead decreases the Quality of Experience (QoE) of users. The first topic of the thesis attempts to reduce the signalling overhead by developing intelligent location management techniques that minimize paging and Tracking Area Update (TAU) signals. Consequently, the corresponding optimization problems are formulated, and several techniques and heuristic algorithms are implemented to solve them. Additionally, network scalability has become a challenging aspect that is hindered by the current network architecture. As a result, Cloud Radio Access Networks (C-RANs) have been introduced as a new trend in wireless technologies to address this challenge. The C-RAN architecture consists of Remote Radio Heads (RRHs), Baseband Units (BBUs), and the optical network connecting them. However, RRH-to-BBU resource allocation can cause a significant loss of efficiency, particularly when allocating the computational resources in the BBU pool to densely deployed small cells, leading to a vast increase in power consumption and wasted resources. Therefore, the second topic of the thesis discusses the C-RAN infrastructure, particularly where a pool of BBUs is gathered to process the computational load. We argue that there is a need to optimize the processing capacity in order to minimize the power consumption and increase the overall system efficiency.
Consequently, the optimal allocation of computational resources between the RRHs and BBUs is modeled. Furthermore, in order to obtain an optimal RRH-to-BBU allocation, it is essential to have an optimal physical resource allocation for users, which determines the required computational resources. For this purpose, an optimization problem that models the assignment of resources at these two levels (from physical resources to users and from RRHs to BBUs) is formulated.
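A greedy stand-in for the RRH-to-BBU computational-resource allocation discussed above is first-fit-decreasing bin packing: concentrating RRH compute demands onto as few BBUs as possible lets the remaining BBUs be powered down. This is only a heuristic sketch of the power-saving intent, not the thesis's optimization model.

```python
def pack_rrhs(demands, bbu_capacity):
    """First-fit-decreasing packing of RRH compute demands into BBUs, so that
    unused BBUs can be powered down. Returns (rrh -> bbu index, active BBUs)."""
    bbus = []        # remaining capacity of each active BBU
    placement = {}
    for rrh in sorted(range(len(demands)), key=lambda i: -demands[i]):
        for b, free in enumerate(bbus):
            if demands[rrh] <= free:     # fits in an already-active BBU
                bbus[b] -= demands[rrh]
                placement[rrh] = b
                break
        else:                            # no fit: power on a new BBU
            bbus.append(bbu_capacity - demands[rrh])
            placement[rrh] = len(bbus) - 1
    return placement, len(bbus)
```

For instance, seven RRHs with demands summing to 20 units pack into two BBUs of capacity 10, instead of the seven that a one-BBU-per-RRH deployment would keep powered.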
Theory of Algorithm Suitability on Managing Radio Resources in Next Generation Mobile Networks
Beyond 2020, the wireless networking model will change radically and become business-driven, as foreseen by the Next Generation Mobile Network (NGMN) alliance. As the spectrum granted to a given operator is physically limited, new radio resource management techniques are required to ensure massive connectivity for wireless devices. Given this situation, we investigate in this paper how key network functionalities such as the self-optimizing network (SON) must be designed to meet NGMN requirements. We therefore propose Algorithm Suitability Theory (AST), combined with the notion of network operator infrastructure convergence. The approach is based on the software-defined networking (SDN) principle, which allows the load-balancing algorithm to adapt to the dynamic network status. Besides, we use the concept of network function virtualization (NFV), which alleviates the constraint of confining wireless devices to their home network operator only. Relying on these two technologies, we build the AST through a lexicographic optimality criterion based on the SPC (Status, Performance, and Complexity) order. Numerical results demonstrate better network coverage, verified by the improvement of metrics such as call blocking rate, spectrum efficiency, energy efficiency and load balance index.
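The lexicographic SPC criterion in this abstract can be sketched as a straightforward tuple ordering over candidate load-balancing algorithms: status fit first, then performance, then (lower) complexity. The candidate names and scores below are hypothetical, not from the paper.

```python
# Each candidate is scored on the SPC triple: status match with the current
# network state (higher is better), expected performance (higher is better),
# and complexity (lower is better). All names and values are illustrative.
candidates = {
    "round_robin":       {"status": 1, "performance": 0.60, "complexity": 1},
    "least_loaded":      {"status": 1, "performance": 0.75, "complexity": 2},
    "pso_load_balancer": {"status": 0, "performance": 0.90, "complexity": 5},
}

def spc_select(cands):
    """Lexicographic order: status fit, then performance, then low complexity."""
    return max(cands, key=lambda n: (cands[n]["status"],
                                     cands[n]["performance"],
                                     -cands[n]["complexity"]))
```

Here `spc_select(candidates)` picks `least_loaded`: `pso_load_balancer` has the best raw performance, but it does not match the current network status, and the lexicographic ordering makes status strictly dominant over the later criteria.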
Energy Efficiency and Coverage Trade-Off in 5G for Eco-Friendly and Sustainable Cellular Networks
Recently, the energy efficiency of cellular networks has garnered research interest from academia and industry because of its considerable economic and ecological effects in the near future. This study proposes an approach to cooperation between Long-Term Evolution (LTE) and next-generation wireless networks. The fifth-generation (5G) wireless network aims to negotiate a trade-off between wireless network performance (sustaining the demand for high-speed packet rates during busy traffic periods) and energy efficiency (EE) by switching 5G base stations (BSs) off and on based on the instantaneous traffic load while, at the same time, guaranteeing network coverage for mobile subscribers through the remaining active LTE BSs. The particle swarm optimization (PSO) algorithm was used to determine the optimum parameters of the active LTE BSs (transmission power, total antenna gain, spectrum/channel bandwidth, and signal-to-interference-plus-noise ratio) that achieve maximum coverage of the entire area during the switch-off session of the 5G BSs. Simulation results indicate that the energy savings can reach 3.52 kW per day, with a maximum data rate of up to 22.4 Gbps at peak traffic hours and 80.64 Mbps during a 5G BS switched-off session, along with guaranteed full coverage of the entire region by the remaining active LTE BSs.
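The coverage guarantee behind the switch-off decision can be sketched with a simple link budget: a point remains covered if at least one active LTE BS delivers received power above the receiver sensitivity under free-space path loss. The BS powers, gains, frequency and sensitivity in the usage are illustrative figures, not the study's parameters.

```python
import math

def fspl_db(d_m, f_hz):
    """Free-space path loss in dB for distance in metres, frequency in Hz."""
    return 20 * math.log10(d_m) + 20 * math.log10(f_hz) - 147.55

def covered(bs_positions, tx_power_dbm, gain_db, f_hz, sens_dbm, point):
    """True if any active BS reaches `point` above the receiver sensitivity."""
    for x, y in bs_positions:
        d = max(1.0, math.hypot(point[0] - x, point[1] - y))
        if tx_power_dbm + gain_db - fspl_db(d, f_hz) >= sens_dbm:
            return True
    return False

def daily_saving_kwh(off_bs_power_w, off_hours):
    """Energy saved per day by switching off BSs drawing the given powers."""
    return sum(off_bs_power_w) * off_hours / 1000.0

# Illustrative: three 800 W 5G BSs switched off for 6 h saves 14.4 kWh/day.
saving = daily_saving_kwh([800, 800, 800], 6)
```

In an EE/coverage trade-off of this kind, a PSO would search over the active-BS parameters, rejecting any candidate for which some test point in the region fails the `covered` check.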