213 research outputs found

    Probabilistic approaches to the design of wireless ad hoc and sensor networks

    Emerging wireless technologies have made ubiquitous wireless access a reality and enabled wireless systems to support a large variety of applications. Since wireless self-configuring networks do not require infrastructure and promise greater flexibility and better coverage, wireless ad hoc and sensor networks have been under intensive research. It is believed that wireless ad hoc and sensor networks can become as important as the Internet: just as the Internet allows access to digital information anywhere, ad hoc and sensor networks will provide remote interaction with the physical world. The dynamics of the object distribution is one of the most important features of wireless ad hoc and sensor networks. This dissertation deals with several estimation and optimization problems concerning these dynamical features. Many application demands of wireless ad hoc and sensor networks, such as reliability, power efficiency and sensor deployment, can be better met through mobility estimation and/or prediction. In this dissertation, we study several random mobility models and present a mobility prediction methodology that relies on analyzing the moving patterns of mobile objects. By estimating the future movement of objects and analyzing the trade-off between estimation cost and reliability, an optimization of the tracking interval for sensor networks is presented. Based on observations of the location and movement of objects, an optimal sensor placement algorithm is proposed that adaptively learns the dynamical object distribution. Moreover, the dynamical boundary of mass objects monitored by a sensor network can be estimated through unsupervised learning of the objects' distribution density. To provide accurate estimation of mobile objects, we first study several popular mobility models and, based on these models, present mobility prediction algorithms capable of predicting the future trajectories of objects. In wireless self-configuring networks, an accurate estimation algorithm allows link reliability and power efficiency to be improved, traffic delay to be reduced and sensor deployment to be optimized. The effects of estimation accuracy on reliability and power consumption are studied and analyzed, and a new methodology is proposed to optimize reliability and power efficiency by balancing the trade-off between the quality of performance and the estimation cost. By estimating and predicting the location and movement of mass objects, the proposed sensor placement algorithm demonstrates a significant improvement in the detection of mass objects, with near-maximal detection accuracy. Quantitative analysis of the effects of mobility estimation and prediction on the detection accuracy of sensor networks can be conducted with recursive EM algorithms. Future work includes deploying the proposed concepts and algorithms in real-world ad hoc and sensor networks.
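
To make the trajectory-estimation idea concrete, the following minimal Python sketch fits a constant-velocity model to recent (t, x, y) observations and extrapolates an object's future position. The function, data layout and noise level are illustrative assumptions for this listing, not the dissertation's actual mobility models or prediction algorithms.

```python
import numpy as np

def predict_position(track, dt_future):
    """Constant-velocity extrapolation of a mobile object's future position.

    track: array of shape (k, 3) with rows (t, x, y) of recent observations.
    dt_future: how far ahead (in the same time units) to predict.
    """
    t, xy = track[:, 0], track[:, 1:]
    # Least-squares fit of x(t) and y(t) as straight lines (velocity estimate).
    A = np.vstack([t, np.ones_like(t)]).T
    (vx, x0), _, _, _ = np.linalg.lstsq(A, xy[:, 0], rcond=None)
    (vy, y0), _, _, _ = np.linalg.lstsq(A, xy[:, 1], rcond=None)
    t_next = t[-1] + dt_future
    return np.array([vx * t_next + x0, vy * t_next + y0])

# Example: an object moving roughly along a straight line with noisy observations.
rng = np.random.default_rng(0)
times = np.arange(0, 10.0, 1.0)
truth = np.c_[times, 2.0 * times, 1.0 * times]                       # (t, x, y)
observed = truth + np.c_[np.zeros_like(times), rng.normal(0, 0.1, (10, 2))]
print(predict_position(observed, dt_future=2.0))                     # roughly (22, 11)
```

A constant-velocity fit is only the simplest baseline; the random mobility models discussed above would replace it with model-specific predictors.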

    User-oriented mobility management in cellular wireless networks

    2020 Spring. Includes bibliographical references. Mobility Management (MM) in wireless mobile networks is a vital process that keeps an individual User Equipment (UE) connected while moving within the network coverage area; it is required to keep the network informed about the UE's mobility (i.e., location changes). The network must identify the exact serving cell of a specific UE for the purpose of data-packet delivery. The two MM procedures used to localize a specific UE and deliver data packets to it are known as Tracking Area Update (TAU) and Paging, which are burdensome not only to the network resources but also to the UE's battery; the UE always initiates the TAU and the network always initiates the Paging. These two procedures are used in current Long Term Evolution (LTE) and next-generation (5G) networks despite the drawback that they consume bandwidth and energy. Because of potentially very high-volume traffic and the increasing density of high-mobility UEs, the TAU/Paging procedures incur significant costs in terms of signaling overhead and power consumption in the battery-limited UE. This problem will become even worse in 5G, which is expected to accommodate exceptional services, such as supporting mission-critical systems (close-to-zero latency) and extending battery lifetime (10 times longer). This dissertation examines and discusses a variety of solution schemes for both TAU and Paging, emphasizing a new key design to accommodate 5G use cases; ongoing efforts are still developing new schemes to provide seamless connections to the ever-increasing density of high-mobility UEs. In this context, and toward achieving 5G use cases, we propose a novel solution to the MM issues, named gNB-based UE Mobility Tracking (gNB-based UeMT). This solution has four features aligned with 5G goals. First, mobile UEs no longer trigger the TAU to report their location changes, giving much greater power savings with no signaling overhead. Instead, second, the network elements, the gNBs, take over the responsibility of tracking and locating these UEs, so that UE locations are always known. Third, our Paging procedure is markedly improved over the conventional one, providing very fast UE reachability without Paging messages being sent simultaneously. Fourth, our solution guarantees lightweight signaling overhead with very low Paging delay; our simulation studies show that it achieves about a 92% reduction in the corresponding signaling overhead. To realize these four features, this solution adds no implementation complexity; instead, it exploits the already existing LTE/5G communication protocols, functions, and measurement reports. Our gNB-based UeMT solution by design has the potential to deal with mission-critical applications. In this context, we introduce a new approach for mission-critical and public-safety communications. Our approach targets emergency situations (e.g., natural disasters) in which the mobile wireless network becomes partially or completely dysfunctional. Specifically, it is intended to provide swift network recovery for Search-and-Rescue Operations (SAROs) that search for survivors after large-scale disasters, which we call UE-based SAROs. These SAROs are based on the fact that increasingly almost everyone carries wireless mobile devices (UEs), which serve as human-based wireless sensors on the ground.
Our UE-based SAROs are aimed at accounting for limited UE battery power while providing critical information to first responders, as follows: 1) generate immediate crisis maps for the disaster-impacted areas, 2) provide vital information about where the majority of survivors are clustered/crowded, and 3) prioritize the impacted areas to identify regions that urgently need communication coverage. UE-based SAROs offer first responders a vital tool to prioritize and manage SAROs efficiently, effectively and in a timely manner.
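
As a rough illustration of why network-side tracking reduces signaling, the toy calculation below counts mobility-management messages per hour under a conventional tracking-area scheme (UE-triggered TAUs plus paging of every cell in the tracking area) and under per-cell tracking (a page reaches a single, known cell). The cost model and every parameter value are arbitrary assumptions made for this sketch; they are not the dissertation's simulation model, and the 92% figure quoted above comes from that simulation, not from this code.

```python
# Toy signalling-cost comparison (illustrative assumptions only).
def conventional_cost(taus_per_hour, pages_per_hour, cells_per_tracking_area):
    # Conventional LTE-style MM: the UE sends TAU updates, and each incoming
    # page is broadcast to every cell in the tracking area.
    return taus_per_hour + pages_per_hour * cells_per_tracking_area

def gnb_tracked_cost(handover_reports_per_hour, pages_per_hour):
    # Network-side tracking: no UE-triggered TAUs; the serving cell is always
    # known, so a page targets a single cell.  The per-handover report is an
    # assumed stand-in for the gNB bookkeeping traffic.
    return handover_reports_per_hour + pages_per_hour * 1

conv = conventional_cost(taus_per_hour=4, pages_per_hour=6, cells_per_tracking_area=30)
tracked = gnb_tracked_cost(handover_reports_per_hour=4, pages_per_hour=6)
print(conv, tracked, f"reduction ~ {1 - tracked / conv:.0%}")   # 184 10 reduction ~ 95%
```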

    Investigation of an intelligent personalised service recommendation system in an IMS based cellular mobile network

    The success or failure of future information and communication services in general, and mobile communications in particular, depends greatly on the level of personalisation they can offer. While the provision of anytime, anywhere, anyhow services has been the focus of wireless telecommunications in recent years, personalisation has gained more and more attention as the unique selling point of mobile devices. Smart phones should be intelligent enough to match a user's unique needs and preferences and provide a truly personalised service tailored to the individual user. In the first part of this thesis, the importance and role of personalisation in future mobile networks is studied. This is followed by an agent-based futuristic user scenario that addresses the provision of rich data services independent of location. Scenario analysis identifies the requirements and challenges to be solved for the realisation of a personalised service. An architecture based on the IP Multimedia Subsystem is proposed to support mobility and provide service continuity whilst roaming between two different access standards. Another aspect of personalisation, user preference modelling, is investigated in the context of service selection in an environment with multiple third-party service providers. A model is proposed for the automatic acquisition of user preferences to assist in service selection decision-making. User preferences are modelled with a two-level Bayesian Metanetwork. Personal agents incorporating the proposed model provide answers to preference-related queries, such as cost, QoS and service provider reputation, allowing users to have their preferences considered automatically.
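
For a flavour of how preference evidence can rank candidate providers, the sketch below computes a naive Bayes posterior over two hypothetical providers from discrete cost, QoS and reputation observations. It is a drastically simplified stand-in: the thesis uses a two-level Bayesian Metanetwork, which additionally conditions these dependencies on context, and all probabilities and provider names here are invented for illustration.

```python
# Simplified stand-in for preference-based service selection: a naive Bayes
# posterior over candidate providers given discrete evidence (all values assumed).
priors = {"providerA": 0.5, "providerB": 0.5}
likelihood = {                                    # P(observation | provider), assumed
    "providerA": {"cost": {"low": 0.7, "high": 0.3},
                  "qos": {"good": 0.4, "poor": 0.6},
                  "reputation": {"good": 0.5, "poor": 0.5}},
    "providerB": {"cost": {"low": 0.2, "high": 0.8},
                  "qos": {"good": 0.8, "poor": 0.2},
                  "reputation": {"good": 0.7, "poor": 0.3}},
}

def posterior(evidence):
    scores = {p: priors[p] for p in priors}
    for p in scores:
        for attribute, value in evidence.items():
            scores[p] *= likelihood[p][attribute][value]
    total = sum(scores.values())
    return {p: s / total for p, s in scores.items()}

# A user observing good QoS and reputation but high cost:
print(posterior({"qos": "good", "reputation": "good", "cost": "high"}))
```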

    A cell outage management framework for dense heterogeneous networks

    In this paper, we present a novel cell outage management (COM) framework for heterogeneous networks with split control and data planes, a candidate architecture for meeting future capacity, quality-of-service, and energy efficiency demands. In such an architecture, the control and data functionalities are not necessarily handled by the same node. The control base stations (BSs) manage the transmission of control information and user equipment (UE) mobility, whereas the data BSs handle UE data. An implication of this split architecture is that an outage of a BS in one plane has to be compensated by other BSs in the same plane. Our COM framework addresses this challenge by incorporating two distinct cell outage detection (COD) algorithms to cope with the idiosyncrasies of the data and control planes. The COD algorithm for control cells leverages the relatively larger number of UEs in the control cell to gather large-scale minimization-of-drive-test report data and detects an outage by applying machine learning and anomaly detection techniques. To improve outage detection accuracy, we also investigate and compare the performance of two anomaly-detection algorithms, i.e., k-nearest-neighbor- and local-outlier-factor-based anomaly detectors, within the control COD. For data cell COD, on the other hand, we propose a heuristic Grey-prediction-based approach, which can work with the small number of UEs in the data cell, by exploiting the fact that the control BS manages UE-data BS connectivity and receives periodic updates of the reference signal received power statistics between the UEs and the data BSs in its coverage. The detection accuracy of the heuristic data COD algorithm is further improved by exploiting a Fourier series of the residual error that is inherent to a Grey prediction model. Our COM framework integrates these two COD algorithms with a cell outage compensation (COC) algorithm that can be applied to both planes. Our COC solution utilizes an actor-critic-based reinforcement learning algorithm, which optimizes the capacity and coverage of the identified outage zone in a plane by adjusting the antenna gain and transmission power of the surrounding BSs in that plane. The simulation results show that the proposed framework can detect both data and control cell outages and compensate for the detected outages in a reliable manner.
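
As a concrete, illustrative-only example of the anomaly-detection step in the control-plane COD, the snippet below applies scikit-learn's local-outlier-factor detector to synthetic two-feature measurement reports (serving-cell and neighbour RSRP in dBm) and flags density outliers. The feature choice, synthetic data and threshold are assumptions made for this sketch; they do not reproduce the paper's MDT processing chain or evaluation.

```python
# Illustrative LOF-based outage-detection sketch on synthetic measurement reports.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(1)
normal = rng.normal(loc=[-85.0, -95.0], scale=3.0, size=(200, 2))   # healthy serving cell
outage = rng.normal(loc=[-118.0, -90.0], scale=3.0, size=(10, 2))   # serving signal collapses
reports = np.vstack([normal, outage])

lof = LocalOutlierFactor(n_neighbors=20, contamination=0.05)
labels = lof.fit_predict(reports)            # -1 marks anomalous reports
print("anomalous reports:", int((labels == -1).sum()), "of", len(reports))
```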

    Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions

    The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine-Type Communications (mMTC) and Ultra-Reliable Low-Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices, which is the main focus of this paper. The related challenges include QoS provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with highlighting the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms of the emerging cellular IoT standards, namely LTE-M and NB-IoT. Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support, along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions for addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the application of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of a low-complexity Q-learning approach in mMTC scenarios. Finally, we discuss some open research challenges and promising future research directions. Comment: 37 pages, 8 figures, 7 tables; submitted for possible future publication in IEEE Communications Surveys and Tutorials
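
To illustrate the kind of low-complexity Q-learning the survey highlights for random access, the sketch below lets a small population of devices learn, via stateless tabular Q-learning, which RA slot to transmit in so that collisions are avoided. The reward shaping, parameter values and single-state formulation are assumptions made for this example rather than any particular scheme from the paper.

```python
# Minimal stateless Q-learning sketch for collision-aware RA slot selection.
import numpy as np

n_devices, n_slots, episodes = 20, 20, 3000
alpha, epsilon = 0.1, 0.1
rng = np.random.default_rng(42)
Q = np.zeros((n_devices, n_slots))            # one Q-row per device, single state

for _ in range(episodes):
    # Epsilon-greedy slot choice for every device.
    greedy = Q.argmax(axis=1)
    explore = rng.random(n_devices) < epsilon
    choices = np.where(explore, rng.integers(0, n_slots, n_devices), greedy)
    # Reward +1 if the device transmitted alone in its slot, -1 on collision.
    counts = np.bincount(choices, minlength=n_slots)
    rewards = np.where(counts[choices] == 1, 1.0, -1.0)
    Q[np.arange(n_devices), choices] += alpha * (rewards - Q[np.arange(n_devices), choices])

final = Q.argmax(axis=1)
collision_free = np.bincount(final, minlength=n_slots)[final] == 1
print("devices with a collision-free slot:", int(collision_free.sum()), "of", n_devices)
```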

    Combinatorics-based energy conservation methods in wireless sensor networks

    Ph.D. (Doctor of Philosophy)

    An Intelligent Mobility Prediction Scheme for Location-Based Service over Cellular Communications Network

    One of the trickiest challenges introduced by cellular communications networks is mobility prediction for Location-Based Services (LBSs). Hence, an accurate and efficient mobility prediction technique is particularly needed for these networks. Mobility prediction techniques incur overheads on the transmission process, and these overheads affect properties of the cellular communications network such as delay, denial of service, manual filtering and bandwidth. The main goal of this research is to enhance mobility prediction in cellular communications networks through three phases. Firstly, current mobility prediction techniques will be investigated. Secondly, new mobility prediction techniques will be devised and examined, based on three hypotheses, so that they suit cellular communications network and mobile user (MU) resources, offering low computation cost and a high prediction success rate without using MU resources in the prediction process. Thirdly, a new mobility prediction scheme will be produced that operates at different levels of mobility prediction. In this thesis, a new mobility prediction scheme for LBSs is proposed; it can be considered a combination of the cell-level and routing area (RA)-level prediction. For cell-level prediction, most current location prediction research focuses on generalized location models, where the geographic extent is divided into regular-shaped cells. These models are not suitable for certain LBSs whose objectives are to compute and present on-road services. Such techniques include the New Markov-Based Mobility Prediction (NMMP) and the Prediction Location Model (PLM), which deal with inner cell structure and different levels of prediction, respectively. The NMMP and PLM techniques suffer from complex computation, accuracy rate regression and insufficient accuracy. In this thesis, Location Prediction based on a Sector Snapshot (LPSS) is introduced, which is based on a Novel Cell Splitting Algorithm (NCPA). This algorithm is implemented in a micro cell in parallel with the new prediction technique. A comparison of LPSS with two classic prediction techniques and the experimental results show the effectiveness and robustness of the new splitting algorithm and prediction technique. On the cell side, the proposed approach reduces the complexity cost and prevents the cell-level prediction technique from being run in time slots that are too close together; for these reasons, the RA level avoids cell-side problems. This research also discusses a New Routing Area Displacement Prediction for Location-Based Services (NRADP), which builds on Ant Colony Optimization (ACO). A comparison of NRADP with Mobility Prediction based on an Ant System (MPAS) and the experimental results show the effectiveness, higher prediction rate, reduced search stagnation ratio, and reduced computation cost of the new prediction technique.
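
For readers unfamiliar with Markov-based cell prediction, the following toy sketch builds first-order transition counts from a cell-visit history and predicts the most likely next cell. It only illustrates the general idea; it is not the NMMP, PLM or LPSS technique evaluated in the thesis, and the cell IDs are invented.

```python
# Toy first-order Markov predictor over cell (or sector) IDs.
from collections import defaultdict

def build_transition_counts(history):
    """Count observed transitions between consecutive cells in a visit history."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(history, history[1:]):
        counts[current][nxt] += 1
    return counts

def predict_next(counts, current_cell):
    """Return the most frequently observed successor of current_cell, if any."""
    candidates = counts.get(current_cell)
    if not candidates:
        return None                          # unseen cell: no prediction possible
    return max(candidates, key=candidates.get)

history = ["c1", "c2", "c3", "c1", "c2", "c4", "c1", "c2", "c3"]
counts = build_transition_counts(history)
print(predict_next(counts, "c2"))            # "c3" (seen twice, versus "c4" once)
```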

    Seamless coverage for the next generation wireless communication networks

    Data demand has increased exponentially in recent years due to the rapid growth of traffic from wireless and mobile devices. With the advent of fifth generation (5G) and beyond networks, users will be able to take advantage of services beyond the capability of current wireless networks while maintaining a high-quality experience. The exploitation of millimeter-wave (mm-wave) frequencies in 5G promises to meet the demands of future networks by providing high-data-rate coverage with low latency to users, allowing future networks to function more efficiently. However, when planning a network using mm-wave frequencies, it is important to consider their small coverage footprints and poor penetration through obstacles. Heterogeneous network planning with dense deployment of small cells is one way of overcoming these issues; yet, without proper planning, integrating the network within the same or different frequencies could lead to other problems, such as coverage gaps and frequent handovers, due to the natural physics of mm-wave frequencies. Therefore, this thesis focuses on bringing ultra-reliable low-latency communication to mm-wave indoor users by increasing indoor coverage and reducing the frequency of handovers. Towards this aim, a detailed literature review of mm-wave coverage is provided in Chapter 2. Moreover, a table that summarizes the penetration loss of materials at various frequencies is provided as a result of thorough research in this field, which will be helpful to researchers investigating this subject; to our knowledge, this is the first table to collect most of the studies conducted in this field. Chapter 3 examines the interference effect of an outdoor base station (BS) inside a building in the context of a heterogeneous network environment. A single-building model scenario is created, and an interference analysis is performed to observe the effects of different building materials used as walls. The results reveal the importance of the choice of material when the outdoor BS is close to the building; moreover, the interference effect of the outdoor BS should be minimized when the frequency re-use technique is deployed over very short distances. Chapter 4 presents a two-fold contribution, in addition to providing a comprehensive handover study of mm-wave technology. The first study addresses the problem of modelling users' movement in the indoor environment: a user-based indoor mobility prediction scheme is proposed, using a Markov chain whose initial transition matrix is acquired from Q-learning algorithms. Based on the acquired knowledge of the user's mobility in the indoor environment, the second contribution of this chapter is a pre-emptive handover algorithm that provides a seamless connection while the user moves within the heterogeneous network. The implementation and evaluation of the proposed algorithm show a reduction in handover signalling costs of more than 50%, outperforming conventional handover algorithms. Lastly, Chapter 5 contributes to providing robust signal coverage for coverage blind spots by implementing and evaluating the proposed handover algorithm with an intelligent reflective surface. The results show a reduction in handover signalling costs of more than 33%, outperforming conventional handover algorithms with the pre-emptive handover initiation.
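
To give a feel for why wall material matters so much for indoor mm-wave coverage, the back-of-the-envelope sketch below combines free-space path loss with a single wall-penetration loss term to estimate indoor received power from an outdoor BS at 28 GHz. The antenna gains, transmit power and per-material loss values are rough illustrative assumptions, not the measured values tabulated in the thesis.

```python
# Back-of-the-envelope indoor link budget: free-space path loss plus one wall.
import math

def fspl_db(distance_m, freq_hz):
    # Free-space path loss: 20 * log10(4 * pi * d * f / c)
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3.0e8)

wall_loss_db = {"glass": 4.0, "wood": 7.0, "brick": 30.0, "concrete": 60.0}  # assumed values

def indoor_rx_power_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                        distance_m, freq_hz, wall_material):
    return (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
            - fspl_db(distance_m, freq_hz) - wall_loss_db[wall_material])

for material in wall_loss_db:
    p = indoor_rx_power_dbm(30, 25, 10, distance_m=50, freq_hz=28e9, wall_material=material)
    print(f"{material:8s}: {p:6.1f} dBm")
```

Even with generous antenna gains, the same outdoor transmitter that comfortably covers a glass-walled room can leave a concrete-walled one far below a usable signal level, which is the motivation for the material study and the indoor coverage solutions in the thesis.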

    Traffic pattern prediction in cellular networks.

    PhD. Increasing numbers of users, together with greater use of high bit-rate services, complicate radio resource management in 3G systems. In order to improve system capacity and guarantee QoS, a large amount of research has been carried out on radio resource management. One viable approach reported is to use semi-smart antennas to dynamically change the radiation pattern of target cells to reduce congestion. A key factor in semi-smart antenna techniques is the algorithm that adjusts the beam pattern to cooperatively control the size and shape of each radio cell. Methods described in the literature determine the optimum radiation patterns according to the currently observed congestion. By using machine learning methods, it is possible to detect an upcoming change in the traffic patterns at an early stage and then carry out beamforming optimization to alleviate the reduction in network performance. Inspired by research carried out in the vehicle mobility prediction field, this work learns the movement patterns of mobile users with three different learning models by analysing the movement patterns captured locally. Three different mobility models are introduced to mimic the real-life movement of mobile users and provide analysable data for learning. The simulation results show that the error rates of predictions of the geographic distribution of mobile users are low and that it is feasible to use the proposed learning models to predict future traffic patterns. Being able to predict these patterns means that optimized beam patterns can be calculated according to the predicted traffic and loaded into the relevant base stations in advance.
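
As a minimal illustration of predicting the geographic distribution of users one interval ahead, the sketch below multiplies the current per-cell user counts by an estimated cell-to-cell transition matrix. The matrix and counts are invented for illustration; the thesis evaluates three different learning models on captured movement data rather than this single fixed-matrix example.

```python
# Toy one-step prediction of per-cell user counts from a transition matrix.
import numpy as np

# Rows: current cell, columns: next cell; each row sums to 1 (assumed estimates).
P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.3, 0.5]])
current_counts = np.array([120.0, 60.0, 20.0])   # users per cell now

predicted_counts = current_counts @ P            # expected users per cell next interval
print(predicted_counts)                          # [94. 78. 28.]
```

A predicted shift of load toward a neighbouring cell is the kind of early signal that would let the semi-smart antenna beam patterns be optimized and loaded in advance.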