
    Using Machine Learning for Handover Optimization in Vehicular Fog Computing

    Smart mobility management is an important prerequisite for future fog computing systems. In this research, we propose a learning-based handover optimization for the Internet of Vehicles that assists the smooth transition of device connections and offloaded tasks between fog nodes. To accomplish this, we use machine learning algorithms to learn from vehicle interactions with fog nodes. Our approach uses a three-layer feed-forward neural network to predict the correct fog node at a given location and time with 99.2% accuracy on a test set. We also implement a dual stacked recurrent neural network (RNN) with long short-term memory (LSTM) cells capable of learning the latency, or cost, associated with these service requests. We build a simulation in JAMScript driven by real-world vehicle movement traces to generate the dataset used to train these networks. Finally, through a series of experiments, we show how this predictive system can support a smarter request routing mechanism that minimizes service interruption during handovers between fog nodes and anticipates areas of low coverage.
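
    As an illustration of the classifier described above, the following is a minimal PyTorch sketch of a three-layer feed-forward network mapping a vehicle's location and time to a fog node ID. The feature set (x, y, time-of-day), layer widths, and node count are assumptions for illustration; the abstract does not specify them.

```python
# Sketch of a three-layer feed-forward classifier for fog node prediction.
# Inputs are assumed to be (x, y, time-of-day) features; outputs are fog node
# IDs. Layer widths and feature choices are illustrative, not from the paper.
import torch
import torch.nn as nn

class FogNodePredictor(nn.Module):
    def __init__(self, num_features: int = 3, num_fog_nodes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 64),   # hidden layer 1
            nn.ReLU(),
            nn.Linear(64, 64),             # hidden layer 2
            nn.ReLU(),
            nn.Linear(64, num_fog_nodes),  # output: one logit per fog node
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = FogNodePredictor()
sample = torch.tensor([[12.5, 48.2, 0.75]])   # hypothetical (x, y, t) sample
predicted_node = model(sample).argmax(dim=1)  # index of the predicted fog node
```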

    Virtualized Baseband Units Consolidation in Advanced LTE Networks Using Mobility- and Power-Aware Algorithms

    Virtualization of baseband units in Advanced Long-Term Evolution networks, together with the rapid performance growth of general-purpose processors, naturally raises interest in resource multiplexing. The concept of resource sharing and management between virtualized instances is not new and is used extensively in data centers. We adopt some of these resource management techniques to organize virtualized baseband units on a pool of hosts and investigate the behavior of the system in order to identify features that are particularly relevant to the mobile environment. Subsequently, we introduce our own resource management algorithm specifically targeted at the peculiarities identified by our experimental results.
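
    The abstract does not detail its algorithm, but the data-center consolidation techniques it builds on are commonly bin-packing heuristics. Below is a minimal sketch assuming a first-fit-decreasing packing of virtualized baseband unit (vBBU) instances onto a homogeneous host pool; the load model, capacity figure, and cell names are hypothetical.

```python
# Minimal sketch of power-aware consolidation of vBBU instances onto a host
# pool, in the spirit of first-fit-decreasing heuristics used in data centers.
# This is not the paper's algorithm; loads and capacities are assumptions.
from typing import Dict, List

def consolidate(vbbu_loads: Dict[str, float], host_capacity: float) -> List[List[str]]:
    """Pack vBBU instances onto as few hosts as possible (first-fit decreasing)."""
    hosts: List[List[str]] = []   # each entry is the list of vBBUs on one host
    residual: List[float] = []    # remaining capacity per host
    # Place heavily loaded cells first so fragmentation stays low.
    for vbbu, load in sorted(vbbu_loads.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(residual):
            if load <= free:
                hosts[i].append(vbbu)
                residual[i] -= load
                break
        else:
            hosts.append([vbbu])                 # power on a new host
            residual.append(host_capacity - load)
    return hosts

# Example: four cells with normalized processing loads, hosts of capacity 1.0.
print(consolidate({"cell-A": 0.6, "cell-B": 0.5, "cell-C": 0.3, "cell-D": 0.4}, 1.0))
```

    Fewer powered-on hosts is the power-aware objective here; a mobility-aware variant would additionally re-pack as per-cell load shifts with user movement.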

    Improved Handover Through Dual Connectivity in 5G mmWave Mobile Networks

    The millimeter wave (mmWave) bands offer the possibility of orders of magnitude greater throughput for fifth generation (5G) cellular systems. However, since mmWave signals are highly susceptible to blockage, channel quality on any one mmWave link can be extremely intermittent. This paper implements a novel dual connectivity protocol that enables mobile user equipment (UE) devices to maintain physical layer connections to 4G and 5G cells simultaneously. A novel uplink control signaling system combined with a local coordinator enables rapid path switching in the event of failures on any one link. This paper provides the first comprehensive end-to-end evaluation of handover mechanisms in mmWave cellular systems. The simulation framework includes detailed measurement-based channel models to realistically capture spatial dynamics of blocking events, as well as the full details of MAC, RLC, and transport protocols. Compared to conventional handover mechanisms, the study reveals significant benefits of the proposed method under several metrics.
    Comment: 16 pages, 13 figures; to appear in the 2017 IEEE JSAC Special Issue on Millimeter Wave Communications for Future Mobile Networks
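
    As a hedged sketch of the path-switching idea (not the paper's protocol), the snippet below models a coordinator choosing the serving link from the latest channel reports and falling back to LTE when the mmWave link is blocked. The SINR threshold and report format are assumptions.

```python
# Minimal sketch of dual-connectivity path switching: the UE keeps both a 4G
# and a 5G mmWave link alive, and a coordinator routes traffic over whichever
# is usable. Thresholds and report format are illustrative assumptions.
BLOCKAGE_SINR_DB = -5.0  # hypothetical threshold below which a link is unusable

def select_path(mmwave_sinr_db: float, lte_sinr_db: float) -> str:
    """Return the serving link given the latest uplink channel reports."""
    if mmwave_sinr_db > BLOCKAGE_SINR_DB:
        return "5G-mmWave"   # prefer the high-throughput link while it holds
    if lte_sinr_db > BLOCKAGE_SINR_DB:
        return "4G-LTE"      # rapid fallback on mmWave blockage
    return "outage"

print(select_path(mmwave_sinr_db=-12.0, lte_sinr_db=8.0))  # -> 4G-LTE
```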

    On the Minimization of Handover Decision Instability in Wireless Local Area Networks

    This paper addresses handover decision instability, which impacts negatively on both user perception and network performance. To this aim, a new technique called the HandOver Decision STAbility Technique (HODSTAT) is proposed for horizontal handover in Wireless Local Area Networks (WLANs) based on the IEEE 802.11 standard. HODSTAT is based on a hysteresis margin analysis that, combined with a utility-based function, evaluates the need for the handover and determines whether the handover should be performed or avoided. Indeed, if a Mobile Terminal (MT) only transiently hands over to a better network, the gain from using this new network may be diminished by the handover overhead and short usage duration. The approach that we adopt throughout this article aims at minimizing handover occurrences that interrupt network connectivity (handover in WLANs is break-before-make, which causes additional delay and packet loss). To this end, the MT performs a handover only if the connectivity of the current network is threatened or if the performance of a neighboring network exceeds that of the current one by a hysteresis margin. This hysteresis should strike a tradeoff between handover occurrence and the necessity to change the current network of attachment. Our extensive simulation results show that our proposed algorithm outperforms other decision stability approaches for handover decision algorithms.
    Comment: 13 pages, IJWM
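
    The hysteresis-plus-utility decision rule lends itself to a short sketch. The following is a minimal illustration under assumed utility values and thresholds; HODSTAT's actual utility function and margins are not given in the abstract.

```python
# Sketch of a hysteresis-based handover decision in the spirit of HODSTAT:
# hand over only when the candidate AP's utility exceeds the current AP's by a
# margin, or when the current link is about to break. The utility scale and
# numeric thresholds are illustrative assumptions, not the paper's values.
HYSTERESIS_MARGIN = 5.0   # hypothetical utility margin
MIN_USABLE_RSSI = -80.0   # hypothetical dBm floor for a viable link

def should_handover(current_utility: float, candidate_utility: float,
                    current_rssi_dbm: float) -> bool:
    if current_rssi_dbm < MIN_USABLE_RSSI:
        return True   # connectivity threatened: hand over regardless of margin
    # Otherwise avoid transient switches: require a clear utility gain.
    return candidate_utility > current_utility + HYSTERESIS_MARGIN

print(should_handover(current_utility=40.0, candidate_utility=43.0,
                      current_rssi_dbm=-65.0))  # False: gain below the margin
```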

    A Machine Learning based Framework for KPI Maximization in Emerging Networks using Mobility Parameters

    Current LTE networks are faced with a plethora of Configuration and Optimization Parameters (COPs), both hard and soft, that are adjusted manually to manage the network and provide better Quality of Experience (QoE). With 5G in view, the number of these COPs is expected to reach 2000 per site, making manual tuning to find their optimal combination an impossible feat. Alongside these thousands of COPs, the anticipated network densification in emerging networks exacerbates the burden on network operators in managing and optimizing the network. Hence, we propose a machine learning-based framework combined with a heuristic technique to discover the optimal combination of two pertinent mobility COPs, Cell Individual Offset (CIO) and Handover Margin (HOM), that maximizes a specific Key Performance Indicator (KPI) such as the mean Signal to Interference and Noise Ratio (SINR) of all connected users. The first part of the framework leverages machine learning to predict the KPI of interest given several different combinations of CIO and HOM. The resulting predictions are then fed into a Genetic Algorithm (GA), which searches for the combination of the two parameters that yields the maximum mean SINR for all users. The performance of the framework is evaluated using several machine learning techniques, with the CatBoost algorithm yielding the best prediction performance. Meanwhile, the GA reveals the optimal parameter setting more efficiently, converging three orders of magnitude faster than a brute-force search.
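
    The predict-then-search loop can be sketched as follows, with a toy surrogate standing in for the trained KPI predictor (the paper uses CatBoost) and a simplified, mutation-only GA; the parameter ranges, population size, and mutation scheme are illustrative assumptions.

```python
# Minimal sketch of the predict-then-search loop: a trained regressor stands in
# for the live network, and a genetic algorithm searches (CIO, HOM) pairs for
# the setting with the highest predicted mean SINR. This GA variant uses
# selection and mutation only (no crossover) for brevity.
import random

def predicted_mean_sinr(cio_db: float, hom_db: float) -> float:
    """Placeholder for the trained KPI predictor (e.g., a CatBoost model)."""
    return -(cio_db - 2.0) ** 2 - (hom_db - 4.0) ** 2  # toy surrogate surface

def genetic_search(generations: int = 50, pop_size: int = 20):
    # Initial population: random (CIO, HOM) pairs in assumed dB ranges.
    pop = [(random.uniform(-6, 6), random.uniform(0, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: predicted_mean_sinr(*p), reverse=True)
        parents = pop[: pop_size // 2]                    # keep the fittest half
        children = [(cio + random.gauss(0, 0.5),          # Gaussian mutation
                     hom + random.gauss(0, 0.5))
                    for cio, hom in parents]
        pop = parents + children
    return max(pop, key=lambda p: predicted_mean_sinr(*p))

print(genetic_search())  # best (CIO, HOM) found under the surrogate predictor
```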