4 research outputs found

    Introducing a Novel Minimum Accuracy Concept for Predictive Mobility Management Schemes

    In this paper, an analytical model for the minimum required accuracy of predictive methods is derived in terms of both handover (HO) delay and HO signaling cost. The total HO delay and signaling costs are then derived for the worst-case scenario, in which the predictive process performs no better than the conventional one, and simulations are conducted in a cellular environment to demonstrate the importance of the proposed minimum accuracy framework. In addition, three different predictors are implemented and compared: Markov chains, an Artificial Neural Network (ANN), and an Improved ANN (IANN). The results indicate that under certain circumstances the predictors can fall below the acceptable accuracy level; the proposed concept of minimum accuracy therefore plays a vital role in determining this threshold.
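    The break-even idea behind a minimum accuracy can be sketched with a simple expected-cost model. This is an illustrative simplification, not the paper's actual derivation: the cost values and the linear cost model are assumptions made for the example.

    ```python
    def minimum_accuracy(c_conv, c_hit, c_miss):
        """Smallest prediction accuracy p for which the expected predictive
        HO cost, p*c_hit + (1-p)*c_miss, does not exceed the conventional
        cost c_conv. Derived by solving the equality for p."""
        if c_miss <= c_conv:
            return 0.0  # misprediction is no worse than conventional HO
        return (c_miss - c_conv) / (c_miss - c_hit)

    # Hypothetical numbers: conventional HO signaling costs 10 units, a
    # correct prediction costs 4, and a miss (predict, then fall back) 14.
    p_min = minimum_accuracy(10.0, 4.0, 14.0)  # 0.4
    ```

    A predictor whose accuracy falls below `p_min` makes the predictive scheme more expensive than conventional HO, which is the threshold role the abstract describes.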

    Handover Management in Dense Networks with Coverage Prediction from Sparse Networks

    Millimeter wave (mm-Wave) provides high bandwidth and is expected to increase network capacity a thousand-fold in future generations of mobile communications. However, because mm-Wave is sensitive to blockage and incurs high penetration loss, realizing this substantial gain adds complexity and creates bottlenecks. Network densification, a common remedy for blockage sensitivity, increases the handover (HO) rate and causes unnecessary and ping-pong HOs, which in turn reduce network throughput. To mitigate the effect of the increased HO rate, Time to Trigger (TTT) and the hysteresis factor (H) have been used in Long Term Evolution (LTE). In this paper, we consider two networks distinguished by Evolved NodeB (eNB) density: a sparse network and a dense one. We propose an optimal eNB selection mechanism for 5G intra-mobility HO based on spatial information from the sparse eNB network. In this approach, a User Equipment (UE) in the dense network initially connects only to a few selected eNBs delivered from the sparse network, and an HO event occurs only when the serving eNB can no longer satisfy the minimum Signal-to-Noise Ratio (SNR) threshold; the remaining eNBs deployed in the dense network follow the conventional HO procedure. Results reveal that the proposed approach decreases the HO rate significantly for TTT values between 0 ms and 256 ms while keeping the radio link failure (RLF) rate at an acceptable level, below 2% for TTT values between 0 ms and 160 ms. This study paves the way for HO management in future 5G networks.
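    The interaction between the SNR threshold and TTT can be sketched as a simple trigger loop: an HO fires only when the serving eNB's SNR stays below the threshold for a full TTT window. This is a minimal sketch of that generic trigger logic, assuming one SNR sample per time step; it is not the paper's full selection mechanism.

    ```python
    def handover_decision(serving_snr_db, snr_threshold_db, ttt_samples):
        """Return the sample index at which HO triggers, or None.
        HO triggers only after the serving SNR has been below the
        threshold for ttt_samples consecutive samples (the TTT window)."""
        below = 0
        for t, snr in enumerate(serving_snr_db):
            below = below + 1 if snr < snr_threshold_db else 0
            if below >= ttt_samples:
                return t  # TTT expired while still below threshold
        return None  # condition never held long enough: no HO

    # A brief dip that recovers before TTT expires triggers no HO,
    # which is how TTT suppresses ping-pong HOs.
    handover_decision([10, 2, 10, 2], snr_threshold_db=5, ttt_samples=3)
    ```

    Larger `ttt_samples` values suppress more spurious HOs at the cost of reacting later to genuine coverage loss, which is the HO-rate versus RLF trade-off the results describe.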

    Novel QoS-aware proactive spectrum access techniques for cognitive radio using machine learning

    Traditional cognitive radio (CR) spectrum access techniques have been primitive and inefficient because they are blind to the occupancy conditions of the spectrum bands to be sensed. Current spectrum access techniques are also unable to detect network changes or to consider the requirements of unlicensed users, leading to poorer quality of service (QoS) and excessive latency. As user-specific approaches will play a key role in future wireless communication networks, conventional CR spectrum access should be updated to become more effective and agile. In this paper, a comprehensive and novel solution is proposed to decrease sensing latency and to make CR networks (CRNs) aware of unlicensed-user requirements. To this end, a proactive process with a novel QoS-based optimization phase is proposed, consisting of two decision strategies. First, future traffic loads of the different radio access technologies (RATs) occupying different bands of the spectrum are predicted using artificial neural networks (ANNs). Based on these predictions, two strategies are developed. The first, which focuses solely on latency, is a virtual wideband (WB) sensing approach in which predicted relative traffic loads in the WB are exploited to enable narrowband (NB) sensing. The second, based on Q-learning, focuses not only on minimizing sensing latency but also on satisfying other user requirements. The results reveal that the first strategy reduces the sensing latency of the random selection process by 59.6%, while the Q-learning assisted second strategy enhances full satisfaction by up to 95.7%.
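    The two strategies can be caricatured in a few lines: the first picks the narrowband with the lowest predicted traffic load, while the second learns band values from observed rewards. Both functions are illustrative sketches under simplifying assumptions (a stateless Q-learning update, predicted loads as plain numbers), not the paper's algorithms.

    ```python
    def select_narrowband(predicted_loads):
        """Strategy 1 sketch: sense only the sub-band whose ANN-predicted
        relative traffic load is lowest, instead of sensing the whole WB."""
        return min(range(len(predicted_loads)), key=lambda i: predicted_loads[i])

    def q_update(q, band, reward, alpha=0.1, gamma=0.9):
        """Strategy 2 sketch: one stateless Q-learning update after sensing
        `band`, where `reward` can encode latency and other QoS terms."""
        q[band] += alpha * (reward + gamma * max(q) - q[band])
        return q
    ```

    In the second strategy the reward signal is where QoS requirements beyond latency would enter, steering the learned band preferences toward user satisfaction rather than sensing speed alone.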

    The role of artificial intelligence driven 5G networks in COVID-19 outbreak: opportunities, challenges, and future outlook

    There is no doubt that the world is currently experiencing a global pandemic that is reshaping our daily lives as well as the way business is conducted. With social distancing emphasized as an effective means of curbing the rapid spread of infection, many individuals, institutions, and industries have had to rely on telecommunications to ensure service continuity and prevent a complete shutdown of their operations. This has put enormous pressure on both fixed and mobile networks. Although fifth generation mobile networks (5G) are in their infancy in terms of deployment, they offer a broad category of services, including enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC), that can help in tackling pandemic-related challenges. In this paper, we therefore identify the challenges facing existing networks due to the surge in traffic demand caused by the COVID-19 pandemic and emphasize the role of 5G, empowered by artificial intelligence, in tackling these problems. We also provide a brief insight into the use of artificial intelligence driven 5G networks in predicting future pandemic outbreaks and in developing a pandemic-resilient society.