6 research outputs found

    Hard handover for load balancing in long term evolution network

    This thesis presents a hard handover scheme for load balancing in the Long Term Evolution (LTE) network. LTE is a cellular self-organizing network (SON) standardized by the Third Generation Partnership Project (3GPP) to provide high data rates and high quality of service (QoS) to end users. However, the huge volume of data required by the diverse multimedia services of LTE subscribers is rapidly degrading the network's QoS. At the same time, the need for an optimized energy consumption algorithm to reduce network access cost and extend the battery life of the user equipment (UE) is also increasing. Therefore, the main aim of this thesis is to provide a new solution for load control as well as an energy-efficient solution for both the network and the mobile devices. In the first contribution, a new network-energy-efficient handover decision algorithm for load balancing is developed. The algorithm uses load information and reference signal received power (RSRP) as decision parameters for the handover decision scheme. The second contribution focuses on the development of an optimized handover decision algorithm for load balancing and ping-pong control. The algorithm uses the cell load information, the received signal strength (RSS) and an adaptive timer as inputs to the handover decision procedure. The third contribution is the development of a handover decision algorithm that optimizes the UEs' energy consumption as well as load balancing. Overall, key performance indicators such as the load distribution index (LDI), number of unsatisfied users (NUU), cumulative number of ping-pong handover requests (CNPHR), cumulative number of non-ping-pong handover requests (CNNPHR), average throughput of the cell (ATC), handover blocking rate (HBR), new call blocking rate (NCBR) and number of handover calls (NHC) were evaluated through simulations. The results were compared with other works in the literature.
In particular, the proposed algorithm achieved over 10% higher LDI, 50% lower NUU, 30% higher CNPHR and 5% lower CNNPHR when compared with works in the literature. Other results are 10% higher ATC, 75% lower HBR and 40% lower NCBR. In general, the proposed handover decision algorithm for energy-efficient load balancing management in LTE has proven its ability to optimize energy consumption, manage load balancing and control ping-pong handovers.
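The load- and RSRP-based decision rule of the first contribution can be sketched roughly as follows. This is a minimal illustration, not the thesis's algorithm: the hysteresis margin, load threshold, and candidate-selection rule are all assumptions.

```python
# Hypothetical sketch of a load- and RSRP-based hard handover decision.
# The thesis uses load information and RSRP as decision parameters; the
# thresholds and tie-breaking rule below are illustrative assumptions.

HYSTERESIS_DB = 3.0      # assumed handover margin (dB)
LOAD_THRESHOLD = 0.8     # assumed load fraction above which a cell is overloaded

def select_target_cell(serving, neighbours):
    """Return the neighbour cell to hand over to, or None to stay.

    serving / neighbours are dicts with 'id', 'rsrp' (dBm) and 'load' (0..1).
    A handover is triggered when the serving cell is overloaded or a
    neighbour's RSRP exceeds the serving RSRP by the hysteresis margin;
    among eligible candidates, the least-loaded cell is preferred so that
    handovers also balance load.
    """
    candidates = [
        n for n in neighbours
        if n["load"] < LOAD_THRESHOLD
        and (serving["load"] >= LOAD_THRESHOLD
             or n["rsrp"] >= serving["rsrp"] + HYSTERESIS_DB)
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda n: n["load"])
```

An overloaded serving cell hands the UE to the least-loaded eligible neighbour even when that neighbour's RSRP is weaker, whereas a lightly loaded serving cell only releases the UE to a neighbour that is stronger by the hysteresis margin.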

    A neural network based approach for call admission control in heterogeneous networks

    Next generation wireless networks will be based on infrastructure supporting heterogeneous networks. In such a scenario, users will be mobile between different networks; therefore, the number of handovers a user has to make will grow. Thus, at a given instant, there is a high chance that a certain cell does not have the capacity to sustain the needs of its users. This may result in a large loss of calls and lead to poor quality of service. Moreover, in future generations of wireless networks, end users will be able to connect to any suitable network among an available set of heterogeneous networks. This ability of end users to connect to the network of their choice may also affect the network load of the various base stations. This necessitates a suitable call admission control scheme for the implementation of heterogeneous networks in the future. Since the behavior of users arriving at any cell in a heterogeneous network is unpredictable, we utilize a neural network to model the heterogeneous network and its load, so that the trained neural network can estimate whether a call should be admitted in a new situation. The results obtained indicate that the neural network approach solves the call admission control problem in unforeseen real-time scenarios. The neural network shows reduced error for increased values of the learning rate and momentum constant.
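The learning-rate and momentum terms mentioned above can be illustrated with a minimal sketch. The abstract does not specify the network architecture, features, or training data, so a single logistic neuron with two assumed input features (cell load and call arrival rate) and a synthetic training set stand in here.

```python
import math

def _sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid float overflow
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical stand-in for the paper's neural-network admission model:
# one logistic neuron trained by stochastic gradient descent with momentum.
def train_admission_model(samples, lr=0.2, momentum=0.8, epochs=500):
    """samples: list of ((cell_load, call_rate), admit) with admit in {0, 1}."""
    w, b = [0.0, 0.0], 0.0
    vw, vb = [0.0, 0.0], 0.0      # momentum (velocity) terms
    for _ in range(epochs):
        for x, y in samples:
            p = _sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y           # gradient of the log loss w.r.t. pre-activation
            for i in range(2):
                vw[i] = momentum * vw[i] - lr * err * x[i]
                w[i] += vw[i]
            vb = momentum * vb - lr * err
            b += vb
    return w, b

def admit_call(model, cell_load, call_rate):
    """Admit the call when the learned admission probability is at least 0.5."""
    w, b = model
    return _sigmoid(w[0] * cell_load + w[1] * call_rate + b) >= 0.5
```

Raising the learning rate or momentum speeds up convergence on this toy data, mirroring the reduced-error trend the abstract reports, though in general both must stay small enough for stable training.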

    Handover management strategies in LTE-advanced heterogeneous networks.

    Doctoral Degree. University of KwaZulu-Natal, Durban. Meeting the increasing demand for data due to the proliferation of high-specification mobile devices in cellular systems has led to the improvement of the Long Term Evolution (LTE) framework into the LTE-Advanced system. Different aspects such as Massive Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency Division Multiple Access (OFDMA), heterogeneous networks and Carrier Aggregation have been considered in LTE-Advanced to improve the performance of the system. Small cells such as femtocells and relays play a significant role in increasing the coverage and capacity of mobile cellular networks in the LTE-Advanced (LTE-A) heterogeneous network. However, user equipment (UE) faces more frequent handover problems in heterogeneous systems than in homogeneous systems due to user mobility and densely populated cells. The objective of this research work is to analyse handover performance in the current LTE/LTE-A network and to propose handover management strategies that handle the frequent handover problems in LTE-Advanced heterogeneous networks. To achieve this, an event-driven simulator was developed in C# based on the 3GPP LTE/LTE-A standard to evaluate the proposed strategies. To start with, admission control, a major requirement during the handover initiation stage, is discussed, and this research work proposes a channel borrowing admission control scheme for LTE-A networks. With this scheme in place, resources are better utilized and more calls are accepted than in conventional schemes where channel borrowing is not applied. Also proposed is an enhanced strategy for handover management in two-tier femtocell-macrocell networks.
The proposed strategy takes into consideration the speed of the user and other parameters in order to effectively reduce frequent and unnecessary handovers, as well as the ratio of target femtocells in the system. Scenarios expected to dominate future networks, where femtocells are densely deployed to handle very heavy traffic, are also considered. To this end, a Call Admission Control (CAC)-based handover management strategy is proposed to manage handover in dense femtocell-macrocell integration in the LTE-A network. The handover probability, the handover call dropping probability and the call blocking probability are reduced considerably with the proposed strategy. Finally, handover management for mobile relays in a moving vehicle is considered, using a train as a case study. We propose a group handover strategy in which the Mobile Relay Node (MRN) is integrated with a special mobile device called "mdev" to prepare the group information prior to the handover time. This is done to prepare the UEs' group information and services for timely handover given the speed of the train. This strategy reduces the number of handovers and the call dropping probability in the moving vehicle. Publications and conferences are listed on pages iv-v.
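The channel-borrowing idea behind the proposed admission control scheme can be sketched minimally: an overloaded cell admits a call by borrowing a free channel from a neighbour instead of blocking it. The channel counts and the lender-selection rule below are illustrative assumptions, not the thesis's parameters.

```python
# Hypothetical sketch of channel-borrowing admission control: a call that
# would be blocked locally is admitted on a channel borrowed from the
# neighbour with the most spare capacity.

class Cell:
    def __init__(self, name, channels):
        self.name = name
        self.free = channels  # free channels currently available in this cell

def admit_with_borrowing(cell, neighbours):
    """Admit a call locally if possible, otherwise borrow one channel from
    the neighbour with the most free channels. Returns the name of the cell
    whose channel was used, or None if the call is blocked."""
    if cell.free > 0:
        cell.free -= 1
        return cell.name
    lender = max(neighbours, key=lambda n: n.free, default=None)
    if lender is not None and lender.free > 0:
        lender.free -= 1
        return lender.name
    return None  # blocked: no local or borrowable channel
```

Compared with a conventional scheme, which would return None as soon as the local cell is exhausted, borrowing keeps admitting calls while any neighbour has spare channels, which is why more calls are accepted overall.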

    Near-Real Time, Semi-Automated Threat Assessment of Information Environments

    Threat assessment is a crucial process for monitoring and defending against potential threats in an organization's information environment and business operations. Ensuring the security of information infrastructure requires effective information security practices. However, existing models and methodologies often fall short of addressing the dynamic and evolving nature of cyberattacks. Moreover, critical threat intelligence extracted from threat agents often fails to capture essential attributes such as motivation, opportunity, and capability (M, O, C). This contribution introduces a semi-automatic threat assessment model that can handle situational awareness data or live-acquired data streams from networks, incorporating information security techniques, protocols, and real-time monitoring of specific network types. Additionally, it focuses on analysing and implementing network traffic within a specific real-time information environment. To develop the semi-automatic threat assessment model, the study identifies unique attributes of threat agents by analysing Packet Capture (PCAP) files and data streams collected between 2012 and 2019. The study utilizes both hypothetical and real-world examples of threat agents to evaluate the three key factors: motivation, opportunity, and capability. This evaluation serves as a basis for designing threat profiles and critical threat intelligence, and for assessing the complexity of the process. These aspects are currently overlooked in existing threat agent taxonomies, models, and methodologies. By addressing the limitations of traditional threat assessment approaches, this research contributes to advancing the field of cybersecurity. The proposed semi-automatic threat assessment model offers improved awareness and timely detection of threats, providing organizations with a more robust defence against evolving cyberattacks.
This research enhances the understanding of threat agents' attributes and assists in developing proactive strategies to mitigate the risks associated with cybersecurity in the modern information environment.
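Combining the three threat-agent factors the abstract names (motivation, opportunity, capability) into a single assessment can be sketched as a weighted score. The 1-5 rating scale, the weights, and the threshold below are illustrative assumptions; the thesis's actual profiling model is not specified in the abstract.

```python
# Hypothetical M/O/C scoring sketch: each factor is rated 1 (low) to 5 (high)
# and combined into a weighted threat score. Weights and threshold are
# assumptions for illustration only.

def threat_score(motivation, opportunity, capability,
                 weights=(0.4, 0.3, 0.3)):
    """Return a weighted combination of the three threat-agent factors."""
    for v in (motivation, opportunity, capability):
        if not 1 <= v <= 5:
            raise ValueError("factor ratings must be in 1..5")
    wm, wo, wc = weights
    return wm * motivation + wo * opportunity + wc * capability

def classify(score, threshold=3.5):
    """Flag threat agents whose combined M/O/C score crosses the threshold."""
    return "high" if score >= threshold else "low"
```

In a semi-automated pipeline, the factor ratings would be derived from attributes extracted from PCAP analysis (e.g. persistence or tooling sophistication mapped to capability), with an analyst reviewing the flagged profiles.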

    Adaptive vehicular networking with Deep Learning

    Vehicular networks have been identified as a key enabler for future smart traffic applications aiming to improve on-road safety, increase road traffic efficiency, or provide advanced infotainment services to improve on-board comfort. However, the requirements of smart traffic applications also place demands on vehicular network quality in terms of high data rates, low latency, and reliability, while simultaneously meeting the challenges of sustainability, green network development goals and energy efficiency. The advances in vehicular communication technologies, combined with the peculiar characteristics of vehicular networks, have brought challenges to traditional networking solutions designed around fixed parameters using complex mathematical optimisation. These challenges necessitate greater intelligence to be embedded in vehicular networks to realise adaptive network optimisation. One promising solution is the use of Machine Learning (ML) algorithms to extract hidden patterns from collected data, thus formulating adaptive network optimisation solutions with strong generalisation capabilities. In this thesis, an overview of the underlying technologies, applications, and characteristics of vehicular networks is presented, followed by the motivation for using ML and a general introduction to ML. Additionally, a literature review of ML applications in vehicular networks is presented, drawing on the state of the art of ML technology adoption. Three key challenging research topics have been identified, centred around network optimisation and ML deployment aspects. The first research question and contribution focus on mobile Handover (HO) optimisation as vehicles pass between base stations; a Deep Reinforcement Learning (DRL) handover algorithm is proposed and evaluated against the currently deployed method. Simulation results suggest that the proposed algorithm can guarantee optimal HO decisions in a realistic simulation setup.
The second contribution explores distributed radio resource management optimisation. Two versions of a Federated Learning (FL) enhanced DRL algorithm are proposed and evaluated against other state-of-the-art ML solutions. Simulation results suggest that the proposed solution outperforms the other benchmarks in overall resource utilisation efficiency, especially in generalisation scenarios. The third contribution looks at energy efficiency optimisation on the network side against a backdrop of sustainability and green networking. A cell switching algorithm was developed based on a Graph Neural Network (GNN) model; the proposed scheme achieves almost 95% normalised energy efficiency relative to the "ideal" optimal benchmark and can be applied to many more general network configurations than the state-of-the-art ML benchmark.
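The cell-switching idea behind the third contribution, powering down lightly loaded cells whose traffic the remaining cells can absorb, can be sketched greedily. The thesis uses a GNN-based model; the simple load-threshold rule, unit capacity, and single-target offloading below are stand-in assumptions for illustration only.

```python
# Hypothetical greedy cell-switching sketch: switch off a lightly loaded cell
# when one surviving cell has enough headroom to absorb its traffic, trading
# capacity slack for energy savings. Threshold and capacity are assumptions.

def plan_switch_off(cells, capacity=1.0, off_threshold=0.2):
    """cells: dict mapping cell name to normalised load (0..capacity).
    Returns the set of cell names to power down."""
    active = dict(cells)
    off = set()
    for name in sorted(cells, key=cells.get):  # try the lightest cells first
        load = active.get(name)
        if load is None or load >= off_threshold:
            continue
        others = {k: v for k, v in active.items() if k != name}
        if not others:
            continue  # never switch off the last remaining cell
        target = min(others, key=others.get)   # least-loaded survivor
        if capacity - active[target] >= load:
            active[target] += load             # offload traffic, then power down
            del active[name]
            off.add(name)
    return off
```

A learned policy such as the thesis's GNN model would replace this fixed rule with a decision conditioned on the whole network graph, which is what allows it to generalise across network configurations.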