
    An Intelligent Mobility Prediction Scheme for Location-Based Service over Cellular Communications Network

    One of the trickiest challenges introduced by cellular communications networks is mobility prediction for Location-Based Services (LBSs). Hence, an accurate and efficient mobility prediction technique is particularly needed for these networks. A mobility prediction technique incurs overheads on the transmission process, and these overheads affect properties of the cellular communications network such as delay, denial of service, manual filtering and bandwidth. The main goal of this research is to enhance a mobility prediction scheme in cellular communications networks through three phases. Firstly, current mobility prediction techniques are investigated. Secondly, new mobility prediction techniques are devised and examined, based on three hypotheses, that are suitable for cellular communications networks and mobile user (MU) resources, offering low computation cost and a high prediction success rate without using MU resources in the prediction process. Thirdly, a new mobility prediction scheme is generated that is based on different levels of mobility prediction. In this thesis, a new mobility prediction scheme for LBSs is proposed. It can be considered a combination of the cell and routing area (RA) prediction levels. For cell-level prediction, most current location prediction research focuses on generalized location models, where the geographic extent is divided into regular-shaped cells. These models are not suitable for certain LBSs whose objective is to compute and present on-road services. Such techniques are the New Markov-Based Mobility Prediction (NMMP) and the Prediction Location Model (PLM), which deal with inner cell structure and different levels of prediction, respectively. The NMMP and PLM techniques suffer from complex computation, accuracy-rate regression and insufficient accuracy.
In this thesis, Location Prediction based on a Sector Snapshot (LPSS) is introduced, which is based on a Novel Cell Splitting Algorithm (NCPA). This algorithm is implemented in a micro cell in parallel with the new prediction technique. The LPSS technique was compared with two classic prediction techniques, and the experimental results show the effectiveness and robustness of the new splitting algorithm and prediction technique. On the cell side, the proposed approach reduces the complexity cost and prevents the cell-level prediction technique from running in time slots that are too close together. For these reasons, the RA level avoids cell-side problems. This research also discusses a New Routing Area Displacement Prediction for Location-Based Services (NRADP), which is based on a developed Ant Colony Optimization (ACO). The NRADP was compared with Mobility Prediction based on an Ant System (MPAS), and the experimental results show the effectiveness, higher prediction rate, reduced search-stagnation ratio, and reduced computation cost of the new prediction technique.
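The abstract does not give the internals of LPSS or NMMP. As a rough illustration of the first-order Markov idea that such cell-level predictors build on, the following sketch learns cell-to-cell transition counts from a movement trace and predicts the most likely next cell; the function names and the trace are illustrative, not taken from the thesis:

```python
from collections import Counter, defaultdict

def build_transition_model(cell_history):
    """Count observed cell-to-cell transitions in a movement trace."""
    transitions = defaultdict(Counter)
    for current, nxt in zip(cell_history, cell_history[1:]):
        transitions[current][nxt] += 1
    return transitions

def predict_next_cell(transitions, current_cell):
    """Return the most frequently observed successor of current_cell,
    or None if the cell has never been visited."""
    if current_cell not in transitions:
        return None
    return transitions[current_cell].most_common(1)[0][0]

# Toy trace: the MU repeatedly moves B -> C, so C is predicted after B.
trace = ["A", "B", "C", "B", "C", "D", "B", "C"]
model = build_transition_model(trace)
print(predict_next_cell(model, "B"))  # "C"
```

A real scheme would condition on finer state (sector snapshots, time of day) and age out stale counts, but the prediction step reduces to the same argmax over learned transitions.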

    Regressive Prediction Approach to Vertical Handover in Fourth Generation Wireless Networks

    The ever-increasing demand for deployment of wireless access networks has made wireless mobile devices face many challenges in choosing the best suitable network from a set of available access networks. Some of the weighty issues in 4G wireless networks are the speed and seamlessness of the handover process. This paper therefore proposes a handover technique based on movement prediction in a wireless mobile (WiMAX and LTE-A) environment. The technique enables the system to predict signal quality between the UE and Radio Base Stations (RBSs)/Access Points (APs) in two different networks. Prediction is achieved by employing the Markov Decision Process Model (MDPM), where the movement of the UE is dynamically estimated and averaged to keep track of the signal strength of mobile users. With the help of the prediction, layer-3 handover activities can occur prior to layer-2 handover, and therefore total handover latency can be reduced. The performance of various handover approaches influenced by different metrics (mobility velocities) was evaluated. The results presented demonstrate the good accuracy the proposed method achieves in predicting the next signal level, thereby reducing the total handover latency.
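The abstract does not specify how the averaged signal estimate feeds the early layer-3 trigger. A minimal sketch of that idea, assuming an exponentially weighted average of recent RSSI samples and a hysteresis margin (the smoothing factor and margin below are placeholder assumptions, not values from the paper), might look like:

```python
def ewma_rssi(samples, alpha=0.5):
    """Exponentially weighted moving average of RSSI samples (dBm),
    weighting recent measurements more heavily."""
    avg = samples[0]
    for s in samples[1:]:
        avg = alpha * s + (1 - alpha) * avg
    return avg

def should_prepare_handover(serving_rssi, target_rssi, hysteresis_db=3.0):
    """Start layer-3 handover preparation early when the smoothed target
    signal exceeds the smoothed serving signal by a hysteresis margin."""
    return ewma_rssi(target_rssi) > ewma_rssi(serving_rssi) + hysteresis_db
```

Preparing the layer-3 context as soon as this predicate fires, rather than after the layer-2 handover completes, is what lets the two stages overlap and shrinks total latency.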

    Hybrid Satellite-Terrestrial Communication Networks for the Maritime Internet of Things: Key Technologies, Opportunities, and Challenges

    With the rapid development of marine activities, there has been an increasing number of maritime mobile terminals, as well as a growing demand for high-speed and ultra-reliable maritime communications to keep them connected. Traditionally, the maritime Internet of Things (IoT) is enabled by maritime satellites. However, satellites are seriously restricted by their high latency and relatively low data rate. As an alternative, shore- and island-based base stations (BSs) can be built to extend the coverage of terrestrial networks using fourth-generation (4G), fifth-generation (5G), and beyond-5G services. Unmanned aerial vehicles can also be exploited to serve as aerial maritime BSs. Despite all these approaches, there are still open issues for an efficient maritime communication network (MCN). For example, due to the complicated electromagnetic propagation environment, the limited geometrically available BS sites, and rigorous service demands from mission-critical applications, conventional communication and networking theories and methods should be tailored for maritime scenarios. Towards this end, we provide a survey on the demand for maritime communications, the state-of-the-art MCNs, and key technologies for enhancing transmission efficiency, extending network coverage, and provisioning maritime-specific services. Future challenges in developing an environment-aware, service-driven, and integrated satellite-air-ground MCN smart enough to utilize external auxiliary information, e.g., sea state and atmospheric conditions, are also discussed.

    Radio resource scheduling in homogeneous coordinated multi-point joint transmission of future mobile networks

    A thesis submitted to the University of Bedfordshire, in partial fulfilment of the requirements for the degree of Doctor of Philosophy (PhD).
    The demand from mobile users for high data-rate services continues to increase. To satisfy the needs of such users, operators must continue to enhance their existing networks. The radio interface is a well-known bottleneck because the radio spectrum is limited and therefore expensive, so efficient use of the radio spectrum is very important. To utilise the spectrum efficiently, any channel can be used simultaneously in any cell as long as the interference generated by the base stations using the same channels is below an acceptable level. In cellular networks based on Orthogonal Frequency Division Multiple Access (OFDMA), inter-cell interference reduces the link throughput of users close to the cell edge. To improve the performance of cell-edge users, a technique called Coordinated Multi-Point (CoMP) transmission is being researched for use in the next generation of cellular networks. For a network to benefit from CoMP, its resources should be scheduled efficiently. This thesis focuses on the development of a resource scheduling algorithm for the CoMP joint transmission scheme in OFDMA-based cellular networks. In addition to the algorithm, the thesis provides an analytical framework for the performance evaluation of the CoMP technique. System-level simulation results show that the proposed resource scheduling based on joint maximum throughput provides higher spectral efficiency than a joint proportional fairness scheduling algorithm under different traffic loads in the network and under different criteria for making the cell-edge decision. A hybrid model combining the analytical and simulation approaches has been developed to evaluate the average system throughput.
It has been found that the results of the hybrid model are in line with the simulation-based results. The benefit of the model is that the throughput of any possible call state in the system can be evaluated. Two empirical path loss models for an indoor-to-outdoor environment in a residential area have been developed, based on measurement data at carrier frequencies of 900 MHz and 2 GHz. The models can be used as analytical expressions to estimate the level of interference caused by a femtocell to a macrocell user in link-level simulations.
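The fitted models themselves are not reproduced in the abstract. As a hedged sketch of the generic log-distance form such indoor-to-outdoor models typically take (free-space loss at a 1 m reference, a distance exponent, and a wall-penetration term; the exponent and wall loss below are placeholder values, not the measured fits), one could write:

```python
import math

def path_loss_db(d_m, f_hz, n=3.5, wall_loss_db=10.0):
    """Log-distance path loss in dB for an indoor-to-outdoor link:
    free-space loss at a 1 m reference distance, plus 10*n*log10(d)
    distance dependence, plus a fixed wall-penetration loss.
    n and wall_loss_db are illustrative, not fitted values."""
    fspl_1m = 20 * math.log10(4 * math.pi * f_hz / 3e8)  # FSPL at d = 1 m
    return fspl_1m + 10 * n * math.log10(d_m) + wall_loss_db

# Loss grows with both carrier frequency and distance, so the 2 GHz
# femtocell signal leaking outdoors is attenuated more than at 900 MHz.
print(round(path_loss_db(100.0, 900e6), 1))
print(round(path_loss_db(100.0, 2e9), 1))
```

In a link-level simulation, an expression of this shape gives the femtocell-to-macrocell-user interference power directly from the transmit power minus the predicted loss.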

    Device-to-Device Communication and Multihop Transmission for Future Cellular Networks

    The next generation of wireless networks, i.e. 5G, aims to provide multi-Gbps data traffic in order to satisfy the increasing demand for high-definition video, among other high data-rate services, as well as the exponential growth in mobile subscribers. To achieve this dramatic increase in data rates, current research focuses on improving the capacity of the current 4G network standards, based on Long Term Evolution (LTE), before radical changes such as acquiring additional/new spectrum are exploited. The LTE network has a reuse factor of one; hence neighbouring cells/sectors use the same spectrum, making cell-edge users vulnerable to inter-cell interference. In addition, wireless transmission is commonly hindered by fading and path loss. In this direction, this thesis focuses on improving the performance of cell-edge users in LTE and LTE-Advanced (LTE-A) networks, initially by implementing a new Coordinated Multi-Point (CoMP) algorithm to mitigate cell-edge user interference. Subsequently, Device-to-Device (D2D) communication is investigated as the enabling technology for maximising Resource Block (RB) utilisation in current 4G and emerging 5G networks. It is demonstrated that applying, as an extension to the above, novel power control algorithms to reduce the required D2D TX power, together with multihop transmission for relaying D2D traffic, can further enhance network performance. To develop the aforementioned technologies and evaluate the performance of new algorithms in emerging network scenarios, a beyond-the-state-of-the-art LTE system-level simulator (SLS) was implemented. The new simulator includes Multiple-Input Multiple-Output (MIMO) antenna functionalities, comprehensive channel models (such as the Wireless World Initiative New Radio II, i.e. WINNER II, model) and adaptive modulation and coding schemes to accurately emulate the LTE and LTE-A network standards.
Additionally, a novel interference modelling scheme using the ‘wrap around’ technique was proposed and implemented. It maintains the topology of flat-surfaced maps, allowing the use of cell planning tools while obtaining accurate and timely results in the SLS compared with the few existing platforms. For the proposed CoMP algorithm, the adaptive beamforming technique was employed to reduce interference on cell-edge UEs by applying Coordinated Scheduling (CoSH) between cooperating cells. Simulation results show up to a 2-fold improvement in throughput, as well as an SINR gain for the cell-edge UEs in the cooperating cells. Furthermore, D2D communication underlaying the LTE network (and future generations of wireless networks) was investigated. The technology exploits the proximity of users in a network to achieve higher data rates with maximum RB utilisation (as the technology reuses the cellular RBs simultaneously), while taking some load off the Evolved Node B (eNB) through direct communication between items of User Equipment (UE). Simulation results show that the proximity and transmission power of D2D transmission yield high performance gains for a D2D receiver, demonstrated to be better than those of cellular UEs with better channel conditions or in close proximity to the eNB. The interference from the simultaneous transmission, however, impedes the achievable data rates of cellular UEs in the network, especially at the cell edge. Thus, a power control algorithm was proposed to mitigate the impact of interference in the hybrid network (a network consisting of both cellular and D2D UEs). It was implemented by setting a minimum SINR threshold so that the cellular UEs achieve a minimum performance, and equally a maximum SINR threshold to establish fairness for the D2D transmission as well.
Simulation results show an increase in cell-edge throughput and a notable improvement in the overall SINR distribution of UEs in the hybrid network. Additionally, multihop transmission for D2D UEs was investigated in the hybrid network: traditionally, the scheme is implemented to relay cellular traffic in a homogeneous network. Contrary to most current studies, where D2D UEs are employed to relay cellular traffic, this thesis uniquely uses idle nodes to relay D2D traffic. Simulation results show an improvement in D2D receiver throughput with multihop transmission, which was significantly better than the same UEs' performance over the equivalent D2D-pair distance when using single-hop transmission.
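The abstract states only that a minimum and a maximum SINR threshold bound the D2D transmission; the update rule is not given. A minimal sketch of one plausible threshold-based control step, with assumed threshold, step-size, and power-limit values (the function name and all numbers are illustrative), could be:

```python
def adjust_d2d_power(p_tx_dbm, sinr_db,
                     sinr_min_db=0.0, sinr_max_db=20.0,
                     step_db=1.0, p_min_dbm=-40.0, p_max_dbm=23.0):
    """One iteration of threshold-based D2D power control: raise the
    transmit power if the D2D link is below its minimum SINR, lower it
    if the link exceeds the maximum (limiting interference to cellular
    UEs), otherwise leave it unchanged. Output is clamped to the
    device's power range."""
    if sinr_db < sinr_min_db:
        p_tx_dbm += step_db
    elif sinr_db > sinr_max_db:
        p_tx_dbm -= step_db
    return min(max(p_tx_dbm, p_min_dbm), p_max_dbm)

print(adjust_d2d_power(10.0, -5.0))  # 11.0: below min SINR, power up
print(adjust_d2d_power(10.0, 25.0))  # 9.0: above max SINR, power down
```

Iterating such a step per scheduling interval drives each D2D link into the [min, max] SINR band, which is the fairness-versus-protection trade-off the abstract describes.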