
    A survey of online data-driven proactive 5G network optimisation using machine learning

    In fifth-generation (5G) mobile networks, proactive network optimisation plays an important role in meeting exponential traffic growth and more stringent service requirements, and in reducing capital and operational expenditure. Proactive network optimisation is widely acknowledged as one of the most promising ways to transform the 5G network based on big data analysis and cloud-fog-edge computing, but many challenges remain. Proactive algorithms require accurate forecasting of highly contextualised traffic demand and quantification of the uncertainty to drive decision making with performance guarantees. Context in Cyber-Physical-Social Systems (CPSS) is often challenging to uncover, unfolds over time, and is even more difficult to quantify and integrate into decision making. The first part of the review focuses on mining and inferring CPSS context from heterogeneous data sources, such as online user-generated content. It examines the state-of-the-art methods currently employed to infer location, social behaviour, and traffic demand through a cloud-edge computing framework, combining them to form the input to proactive algorithms. The second part of the review focuses on exploiting and integrating the demand knowledge in a range of proactive optimisation techniques, including the key aspects of load balancing, mobile edge caching, and interference management. In both parts, appropriate state-of-the-art machine learning techniques (including probabilistic uncertainty cascades in proactive optimisation), complexity-performance trade-offs, and demonstrative examples are presented to inspire readers. This survey couples the potential of online big data analytics, cloud-edge computing, statistical machine learning, and proactive network optimisation in a common cross-layer wireless framework. The wider impact of this survey includes better cross-fertilising the academic fields of data analytics, mobile edge computing, AI, CPSS, and wireless communications, as well as informing industry of the promising potential in this area.
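
    As a minimal illustration of the demand-forecasting-with-uncertainty step described above, the sketch below (not taken from the survey; the synthetic traffic trace, the hour-of-day and lag features, and the 0.9 provisioning quantile are all assumptions) fits quantile regressors so a proactive controller can provision capacity at an upper quantile of forecast demand and check the empirical coverage of that guarantee.

```python
# Illustrative sketch: quantile forecasts of cell traffic drive a proactive
# capacity decision with a probabilistic guarantee. Data and parameters are
# synthetic assumptions, not the survey's.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)                      # 60 days of hourly samples
demand = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.gamma(2.0, 5.0, hours.size)

# Context features: hour of day and previous-hour load (a stand-in for richer CPSS context).
X = np.column_stack([hours % 24, np.roll(demand, 1)])[1:]
y = demand[1:]
X_train, X_test, y_train, y_test = X[:-168], X[-168:], y[:-168], y[-168:]

# One model per quantile: the median as the point forecast, the 0.9 quantile as
# the provisioning level that should cover demand roughly 90% of the time.
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X_train, y_train)
          for q in (0.5, 0.9)}
point = models[0.5].predict(X_test)
provision = models[0.9].predict(X_test)

coverage = np.mean(y_test <= provision)         # empirical check of the guarantee
print(f"median MAE: {np.mean(np.abs(point - y_test)):.2f}, "
      f"0.9-quantile coverage: {coverage:.2f}")
```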

    Spatio-temporal crime predictions by leveraging artificial intelligence for citizens security in smart cities

    Smart city infrastructure has a significant impact on improving the quality of human life. However, a substantial increase in the urban population over the last few years poses challenges related to resource management, safety, and security. To ensure safety and security in the smart city environment, this paper presents a novel approach that empowers the authorities to better visualize threats by identifying and predicting the highly-reported crime zones in the smart city. To this end, it first investigates Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) to detect the hot-spots that have a higher risk of crime occurrence. Second, for crime prediction, a Seasonal Auto-Regressive Integrated Moving Average (SARIMA) model is applied in each dense crime region to predict the number of future crime incidents with spatial and temporal information. The proposed HDBSCAN- and SARIMA-based crime prediction model is evaluated on ten years of crime data (2008-2017) for New York City (NYC). The accuracy of the model is measured for different time scenarios, both year-wise (i.e., for each year) and over the full ten-year period, using an 80:20 split in which 80% of the data is used for training and 20% for testing. The proposed approach outperforms the best-scoring DBSCAN-based method, achieving an average Mean Absolute Error (MAE) of 11.47 compared with 27.03.
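
    The sketch below illustrates the two-stage pipeline described above on synthetic incident data: HDBSCAN groups coordinates into dense hot-spots, then a SARIMA model is fit per hot-spot on monthly counts with an 80:20 split. The min_cluster_size, the SARIMA orders, and the synthetic coordinates are illustrative assumptions, not the authors' settings.

```python
# Illustrative HDBSCAN + SARIMA pipeline on synthetic crime incidents.
import numpy as np
import pandas as pd
from sklearn.cluster import HDBSCAN                        # scikit-learn >= 1.3
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)
# Synthetic incidents: two spatial clusters, each tagged with a month index (0..119).
coords = np.vstack([rng.normal([40.75, -73.99], 0.01, (600, 2)),
                    rng.normal([40.65, -73.95], 0.01, (400, 2))])
month = rng.integers(0, 120, size=len(coords))             # ten years of monthly data

labels = HDBSCAN(min_cluster_size=50).fit_predict(coords)  # -1 marks noise points
df = pd.DataFrame({"cluster": labels, "month": month})

for c in sorted(set(labels) - {-1}):
    counts = (df[df.cluster == c].groupby("month").size()
              .reindex(range(120), fill_value=0).astype(float))
    split = int(len(counts) * 0.8)                          # 80:20 train/test split
    train, test = counts.iloc[:split], counts.iloc[split:]
    fit = SARIMAX(train, order=(1, 1, 1),
                  seasonal_order=(1, 0, 1, 12)).fit(disp=False)
    mae = np.abs(fit.forecast(len(test)).values - test.values).mean()
    print(f"hot-spot {c}: test MAE = {mae:.2f}")
```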

    An intelligent framework using disruptive technologies for COVID-19 analysis

    This paper describes a framework that uses disruptive technologies for COVID-19 analysis. Disruptive technologies include high-tech and emerging technologies such as AI, Industry 4.0, the Internet of Things (IoT), the Internet of Medical Things (IoMT), big data, virtual reality (VR), drone technology, autonomous robots, 5G, and blockchain to support digital transformation, research and development, and service delivery. Disruptive technologies are essential for Industry 4.0 development and can be applied to many disciplines. The proposed framework restricts the spread of COVID-19 outbreaks, ensures the safety of healthcare teams, and maintains patients' physical and psychological healthcare conditions. The framework is designed to deal with the severe shortage of personal protective equipment (PPE) for the medical team, reduce the massive pressure on hospitals, and track recovered patients so that their plasma can be used to treat COVID-19 patients. The study provides guidance for governments on how to adopt technologies to reduce the impact of unprecedented outbreaks such as COVID-19. Our work illustrates an empirical case study on the analysis of real COVID-19 patients and shows the importance of the proposed intelligent framework in limiting the current COVID-19 outbreak. The aim is to help the healthcare team make rapid decisions to treat COVID-19 patients in hospitals or home quarantine, and to identify and treat patients with a common cold or flu.

    Coordinated Multi-Point Clustering Schemes: A Survey


    Optimal sensor placement for sewer capacity risk management

    Complex linear assets, such as those found in transportation and utilities, are vital to economies and, in some cases, to public health. Wastewater collection systems in the United States are vital to both. Yet effective approaches to remediating failures in these systems remain an unresolved shortfall for system operators. This shortfall is evident in the estimated 850 billion gallons of untreated sewage that escape combined sewer pipes each year (US EPA 2004a) and the estimated 40,000 sanitary sewer overflows and 400,000 backups of untreated sewage into basements (US EPA 2001). Failures in wastewater collection systems can be prevented if they are detected in time to apply intervention strategies such as pipe maintenance, repair, or rehabilitation. This is the essence of a risk management process. The International Council on Systems Engineering recommends that risks be prioritized as a function of severity and occurrence and that criteria be established for acceptable and unacceptable risks (INCOSE 2007). A significant impediment to applying generally accepted risk models to wastewater collection systems is the difficulty of quantifying risk likelihoods. These difficulties stem from the size and complexity of the systems, the lack of data and statistics characterizing the distribution of risk, the high cost of evaluating even a small number of components, and the lack of methods to quantify risk. This research investigates new methods to assess the likelihood of failure through a novel approach to the placement of sensors in wastewater collection systems. The hypothesis is that iterative movement of water level sensors, directed by a specialized metaheuristic search technique, can improve the efficiency of discovering locations of unacceptable risk. An agent-based simulation is constructed to validate the performance of this technique and to test its sensitivity to varying environments. The results demonstrate that a multi-phase search strategy, with a varying number of sensors deployed in each phase, can efficiently discover locations of unacceptable risk that can then be managed via a perpetual monitoring, analysis, and remediation process. A number of promising, well-defined future research opportunities also emerged from this research.
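
    The following is a minimal, hypothetical sketch of metaheuristic-guided sensor placement in general (here plain simulated annealing on a synthetic network), not the dissertation's specialized search technique or its agent-based simulation; the risk scores, network size, sensor count, and cooling schedule are all assumptions.

```python
# Illustrative simulated-annealing relocation of a fixed number of water level
# sensors so the monitored nodes cover as much failure risk as possible.
import numpy as np

rng = np.random.default_rng(2)
n_nodes, n_sensors = 200, 8
risk = rng.pareto(2.0, n_nodes)                 # heavy-tailed synthetic risk per node

def covered_risk(placement):
    """Objective: total risk at nodes currently monitored by sensors."""
    return risk[list(placement)].sum()

placement = set(rng.choice(n_nodes, n_sensors, replace=False))
best, best_score = set(placement), covered_risk(placement)

for step in range(2000):
    temp = 1.0 * (0.995 ** step)                # geometric cooling schedule
    # Propose moving one sensor to a random unmonitored node.
    candidate = set(placement)
    candidate.remove(rng.choice(list(candidate)))
    candidate.add(int(rng.choice([n for n in range(n_nodes) if n not in candidate])))
    delta = covered_risk(candidate) - covered_risk(placement)
    if delta > 0 or rng.random() < np.exp(delta / max(temp, 1e-9)):
        placement = candidate
        if covered_risk(placement) > best_score:
            best, best_score = set(placement), covered_risk(placement)

print(f"monitored risk: {best_score:.2f} of {risk.sum():.2f} total")
```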

    Intelligent and Efficient Ultra-Dense Heterogeneous Networks for 5G and Beyond

    Ultra-dense heterogeneous networks (HetNets), in which densified small cells overlay the conventional macro-cells, are a promising technique for the fifth-generation (5G) mobile network. The dense and multi-tier network architecture is able to support extensive data traffic and diverse quality of service (QoS), but meanwhile raises several challenges, especially in interference coordination and resource management. In this thesis, three novel network schemes are proposed to achieve intelligent and efficient operation based on deep learning-enabled network awareness. Both optimization and deep learning methods are developed to achieve intelligent and efficient resource allocation in these proposed network schemes. To improve the cost and energy efficiency of ultra-dense HetNets, a hotspot prediction based virtual small cell (VSC) network is proposed. A VSC is formed only when the traffic volume and user density are extremely high. We leverage the feature extraction capabilities of deep learning techniques and exploit a long short-term memory (LSTM) neural network to predict potential hotspots and form VSCs. Large-scale antenna array enabled hybrid beamforming is also adaptively adjusted for highly directional transmission to cover these VSCs. Within each VSC, one user equipment (UE) is selected as a cell head (CH), which collects the intra-cell traffic using the unlicensed band and relays the aggregated traffic to the macro-cell base station (MBS) in the licensed band. The inter-cell interference can thus be reduced, and the spectrum efficiency can be improved. Numerical results show that the proposed VSCs can reduce power consumption by 55% in comparison with traditional small cells. In addition to smart VSC deployment, a novel multi-dimensional intelligent multiple access (MD-IMA) scheme is also proposed to meet the stringent and diverse QoS of emerging 5G applications with disparate resource constraints. Multiple access (MA) schemes over multi-dimensional resources are adaptively scheduled to accommodate dynamic QoS requirements and network states. The MD-IMA scheme learns the integrated quality of system experience (I-QoSE) by monitoring and predicting QoS through the LSTM neural network. Resource allocation in the MD-IMA scheme is formulated as an optimization problem that maximizes the I-QoSE while minimizing non-orthogonality (NO) in view of implementation constraints. To solve this problem, both model-based optimization algorithms and model-free deep reinforcement learning (DRL) approaches are utilized. Simulation results demonstrate that the achievable I-QoSE gain of MD-IMA over traditional MA is 15% - 18%. In the final part of the thesis, a Software-Defined Networking (SDN) enabled 5G vehicular ad hoc network (VANET) is designed to support the growing volume of vehicle-generated data traffic. In this integrated architecture, to reduce the signaling overhead, vehicles are clustered under the coordination of SDN and one vehicle in each cluster is selected as a gateway to aggregate intra-cluster traffic. To ensure the capacity of the trunk link between the gateway and the macro base station, a Non-orthogonal Multiplexed Modulation (NOMM) scheme is proposed to split the aggregated data stream into multiple layers and use sparse spreading codes to partially superpose the modulated symbols on several resource blocks. Simulation results show that the energy efficiency of the proposed NOMM is around 1.5-2 times that of a typical orthogonal transmission scheme.
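
    As a rough illustration of the LSTM-based hotspot prediction that drives VSC formation, the sketch below trains a small LSTM to forecast next-slot load for one candidate region and forms a VSC only when the prediction exceeds a threshold. The synthetic traffic trace, the 24-slot window, and the 0.8 hotspot threshold are assumptions, not the thesis implementation.

```python
# Illustrative LSTM forecaster for a single region's traffic; a VSC is formed
# only if the predicted next-slot load exceeds an assumed hotspot threshold.
import torch
import torch.nn as nn

torch.manual_seed(0)
t = torch.arange(0, 500, dtype=torch.float32)
traffic = 0.5 + 0.4 * torch.sin(2 * torch.pi * t / 24) + 0.05 * torch.randn_like(t)

window = 24
X = torch.stack([traffic[i:i + window] for i in range(len(traffic) - window)]).unsqueeze(-1)
y = traffic[window:].unsqueeze(-1)

class HotspotLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, x):
        out, _ = self.lstm(x)               # (batch, window, hidden)
        return self.head(out[:, -1, :])     # predict the next time slot

model = HotspotLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for epoch in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()

pred = model(X[-1:]).item()
print(f"predicted load {pred:.2f} -> form VSC: {pred > 0.8}")  # 0.8 is an assumed threshold
```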

    Spatio-temporal crime HotSpot detection and prediction: a systematic literature review

    The primary objective of this study is to accumulate, summarize, and evaluate the state of the art in spatio-temporal crime hotspot detection and prediction techniques by conducting a systematic literature review (SLR). The authors were unable to find a comprehensive study on crime hotspot detection and prediction while conducting this SLR; therefore, to the best of the authors' knowledge, this study is the first attempt to critically analyze the existing literature and to present the challenges faced by current crime hotspot detection and prediction systems. The SLR was conducted by thoroughly consulting five major scientific databases (IEEE, Science Direct, Springer, Scopus, and ACM) and synthesized 49 different studies on crime hotspot detection and prediction after critical review. This study unfolds the following major aspects: 1) the impact of data mining and machine learning approaches, especially clustering techniques, on crime hotspot detection; 2) the utility of time series analysis and deep learning techniques in crime trend prediction; 3) the inclusion of spatial and temporal information in crime datasets, which makes crime prediction systems more accurate and reliable; and 4) the potential challenges faced by state-of-the-art techniques and future research directions. Moreover, the SLR aims to provide a core foundation for research on spatio-temporal crime prediction applications while highlighting several challenges related to the accuracy of crime hotspot detection and prediction applications.