
    DFCV: A Novel Approach for Message Dissemination in Connected Vehicles using Dynamic Fog

    Vehicular Ad-hoc Networks (VANETs) have emerged as a promising solution for enhancing road safety. Routing of messages in a VANET is challenging: the high mobility of vehicles, frequently changing topology, and high vehicle density cause packet delays, frequent route breakages, and packet losses. Previous researchers have used either vehicular fog computing or cloud computing to address the routing issue, but both suffer from large packet delays and frequent packet losses. We propose Dynamic Fog for Connected Vehicles (DFCV), a fog computing-based scheme that dynamically creates, increments, and destroys fog nodes depending on the communication needs. The novelty of DFCV lies in providing lower delays and guaranteed message delivery at high vehicular densities. Simulations were conducted using a hybrid setup consisting of ns-2, SUMO, and CloudSim. Results show that DFCV ensures efficient resource utilization and lower packet delays and losses at high vehicle densities.
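The create/increment/destroy lifecycle described above can be pictured with a minimal sketch; the class, the per-node capacity threshold, and the load-balancing rule below are our illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of DFCV's dynamic fog lifecycle: the pool of fog
# nodes grows and shrinks with the current message load.
class DynamicFogManager:
    def __init__(self, capacity_per_node=50):
        self.capacity = capacity_per_node   # messages one fog node can serve
        self.fog_nodes = []                 # active fog nodes (message queues)

    def dispatch(self, pending_messages):
        """Create, grow, or shrink the fog pool to match demand."""
        needed = max(1, -(-len(pending_messages) // self.capacity))  # ceil div
        while len(self.fog_nodes) < needed:          # create / increment
            self.fog_nodes.append([])
        while len(self.fog_nodes) > needed:          # destroy idle nodes
            self.fog_nodes.pop()
        for i, msg in enumerate(pending_messages):   # spread the load
            self.fog_nodes[i % len(self.fog_nodes)].append(msg)
        return len(self.fog_nodes)
```

With 120 pending messages and a capacity of 50, the manager holds three fog nodes; when demand drops to 10 messages, it tears two of them down.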

    Hybrid-Vehcloud: An Obstacle Shadowing Approach for VANETs in Urban Environment

    Routing of messages in Vehicular Ad-hoc Networks (VANETs) is challenging in obstacle shadowing regions with high vehicle densities, where obstacles block radio wave propagation between vehicles and lead to frequent disconnections. Previous researchers used multi-hop, vehicular cloud, or roadside infrastructures to solve the routing issue, but these suffer from significant packet delays and frequent packet losses arising from obstacle shadowing. We propose a vehicular cloud-based hybrid technique called Hybrid-Vehcloud that disseminates messages via the cloud in obstacle shadowing regions and via multi-hop forwarding in non-obstacle shadowing regions. The novelty of our approach lies in the fact that it adapts dynamically between obstacle shadowing and non-obstacle shadowing regions. Simulation-based performance analysis showed that Hybrid-Vehcloud outperforms the Cloud-assisted Message Downlink Dissemination Scheme (CMDS), Cross-Layer Broadcast Protocol (CLBP), and Cloud-VANET schemes at high vehicle densities.
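The dynamic adaptation between the two regimes amounts to a mode switch per message; a minimal sketch follows, in which the region test and both dissemination paths are placeholders we invented, not the paper's actual mechanisms.

```python
# Hypothetical sketch of Hybrid-Vehcloud's mode switch between cloud
# dissemination (in shadowed regions) and multi-hop forwarding (elsewhere).
def in_shadowing_region(vehicle_pos, obstacle_zones):
    """True if the vehicle lies inside any known obstacle shadowing zone,
    modelled here as 1-D road intervals for simplicity."""
    return any(lo <= vehicle_pos <= hi for lo, hi in obstacle_zones)

def disseminate(message, vehicle_pos, obstacle_zones):
    if in_shadowing_region(vehicle_pos, obstacle_zones):
        return ("cloud", message)      # relay via the vehicular cloud
    return ("multi-hop", message)      # direct vehicle-to-vehicle relay
```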

    Stable Dynamic Predictive Clustering (SDPC) Protocol for Vehicular Ad hoc Network

    Vehicular communication is an essential part of a smart city. Scalability is a major issue for vehicular communication, especially when the number of vehicles increases; vehicles also suffer from other problems such as the broadcast problem. Clustering can address these issues in vehicular ad hoc networks (VANETs); however, due to the high mobility of vehicles, clustering in a VANET suffers from stability issues. Previously proposed clustering algorithms for VANETs are optimized either for straight roads or for intersections. Moreover, the absence of intelligent use of a combination of mobility parameters, such as direction, movement, position, velocity, vehicle degree, and movement at intersections, results in cluster stability issues. A dynamic clustering algorithm that makes efficient use of all these mobility parameters can solve the stability problem. To achieve higher stability, this paper proposes a novel, robust, and dynamic clustering algorithm for VANETs, Stable Dynamic Predictive Clustering (SDPC). In contrast to previous studies, relative velocity, vehicle position, inter-vehicle distance, transmission range, and vehicle density are considered in creating a cluster, whereas relative distance, movement at the intersection, and vehicle degree are considered in selecting the cluster head. From the mobility parameters, the future road scenario is constructed; the cluster is created and the cluster head is selected based on this predicted view of the road. The performance of SDPC is compared in terms of the average cluster head change rate, the average cluster head duration, the average cluster member duration, and the clustering overhead as a ratio of total packet transmissions. Simulation results show that SDPC outperforms the existing algorithms and achieves better clustering stability.
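Cluster head selection from the listed parameters (relative distance, movement at the intersection, vehicle degree) can be sketched as a weighted score; the weights and the linear scoring form below are our assumptions, not taken from the paper.

```python
# Illustrative cluster-head scoring in the spirit of SDPC.
def cluster_head_score(rel_distance, at_intersection, degree,
                       w_dist=0.4, w_int=0.3, w_deg=0.3):
    """Lower relative distance, no intersection manoeuvre, and a higher
    vehicle degree (more one-hop neighbours) favour a vehicle as head."""
    dist_term = 1.0 / (1.0 + rel_distance)       # closer to cluster centre
    int_term = 0.0 if at_intersection else 1.0   # stable trajectory
    deg_term = degree / (1.0 + degree)           # saturating connectivity
    return w_dist * dist_term + w_int * int_term + w_deg * deg_term

def elect_head(vehicles):
    # vehicles: list of (vehicle_id, rel_distance, at_intersection, degree)
    return max(vehicles, key=lambda v: cluster_head_score(*v[1:]))[0]
```

A nearby, well-connected vehicle driving straight beats a distant one turning at an intersection, which is the intuition behind predictive head selection.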

    A Survey on Congestion Control and Scheduling for Multipath TCP: Machine Learning vs Classical Approaches

    Multipath TCP (MPTCP) has been widely used as an efficient way for communication in many applications: data centers, smartphones, and network operators use MPTCP to balance traffic in a network efficiently. MPTCP is an extension of the Transmission Control Protocol (TCP) that provides multiple paths, leading to higher throughput and lower latency. Although MPTCP has shown better performance than TCP in many applications, it has its own challenges. The network can become congested due to heavy traffic on the multiple paths (subflows) if the subflow rates are not determined correctly, and communication latency can occur if packets are not scheduled correctly between the subflows. This paper reviews techniques to solve the above-mentioned problems based on two main approaches: non-data-driven (classical) and data-driven (machine learning) approaches. It compares these two approaches and highlights their strengths and weaknesses with a view to motivating future researchers in this exciting area of machine learning for communications. The paper also provides details on the simulation of MPTCP and its implementations in real environments.
    Comment: 13 pages, 7 figures
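As a concrete instance of the classical scheduling line of work, a minimum-RTT scheduler picks, for each packet, the lowest-latency subflow whose congestion window still has room; the sketch below is a simplified illustration, and the field names are ours rather than any implementation's.

```python
# Minimal sketch of a classical minRTT-style MPTCP packet scheduler.
from dataclasses import dataclass

@dataclass
class Subflow:
    rtt_ms: float      # smoothed round-trip time of this path
    cwnd: int          # congestion window, in packets
    in_flight: int     # packets currently unacknowledged

    def has_space(self):
        return self.in_flight < self.cwnd

def schedule_packet(subflows):
    """Send the next packet on the lowest-RTT subflow with window space."""
    available = [s for s in subflows if s.has_space()]
    if not available:
        return None                        # every subflow is cwnd-limited
    best = min(available, key=lambda s: s.rtt_ms)
    best.in_flight += 1
    return best
```

When the fast path is window-limited, the packet falls back to a slower subflow, which is exactly the situation where poor scheduling inflates latency.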

    Improving Reliability of Hydrological Flow Estimation using Hydroinformatics Approach

    University of Technology Sydney. Faculty of Engineering and Information Technology.
    The application of hydroinformatics tools in water resources has become very common in the water industry due to the rapid advancement of digital computers. Over the last few decades, several tools have been developed and applied with success. The most commonly used Artificial Intelligence (AI) based hydroinformatics tools in hydrology are Genetic Programming (GP), Artificial Neural Networks (ANN), Fuzzy Logic (FL), the standard chaos technique, the inverse approach, Support Vector Machines (SVM), and evolutionary computation techniques (Genetic Algorithm (GA), Shuffled Complex Evolution (SCE), Particle Swarm Optimization (PSO), Ant Colony Optimization Algorithm (ACOA)), including evolutionary computation based SVM (EC-SVM). These tools, including GP, have proven efficient in predicting flows from event-based rainfall series. The driving factor behind the application of hydroinformatics tools is to ease the complex numerical modelling process. In principle, both conceptual and physically based distributed models require a large number of parameters such as catchment characteristics, losses, flow paths, and meteorological and flow data. The values of some of these parameters are evaluated through calibration. Calibrating complex models can be cumbersome and requires considerable effort and experience, particularly when the number of calibration parameters is large. Even when a model is calibrated, its parameters are catchment specific: model parameters from one catchment may not be representative of another. In this case, hydroinformatics tools like GP and/or ANN can be used, as they require no parameters associated with catchment and soil characteristics. GP has been successfully applied for the calibration of numerous event-based rainfall-runoff models.
    However, the application of GP to the prediction of long-term time series is limited. The application of GP for long-term runoff prediction from a dam catchment is demonstrated here. The model is developed and calibrated for a dam catchment located in New South Wales, Australia. The calibration shows excellent agreement between observed and simulated flows recorded over thirty years, and the results are better than those of the traditional Sacramento model and ANN. GP is also linked to MIKE11-NAM to build a hybrid model, whose purpose is to fill data gaps and generate long-term (100-year) predictions. The calibrated GP model is then applied to the assessment of two future rainfall scenarios, in which the next hundred years of flows are predicted using rainfall input generated from different assumed climatic conditions. The results provide a basis for future water management plans, including water supply from alternative sources. While the application was successful and produced better results, GP was found to suffer from computational overhead when learning from the input data. To improve prediction accuracy, a relatively new AI technique called the Extreme Learning Machine (ELM) is proposed. ELM is applied to partly overcome the slow learning of GP and ANN and to predict hydrological time series very quickly. ELM, a form of single-hidden-layer feed-forward neural network (SLFN), generalizes well even on extremely complex problems: it randomly chooses a single hidden layer and analytically determines the output weights. The ELM method was applied to predict hydrological flow series for the Tryggevælde catchment, Denmark, and for the Mississippi River at Vicksburg, USA.
    The results confirmed that ELM's performance was similar to or better than that of ANN and other previously published techniques, namely evolutionary computation based Support Vector Machine (EC-SVM), the standard chaotic approach, and the inverse approach, in terms of Root Mean Square Error (RMSE) and Normalized Root Mean Square Error (NRMSE). That analysis did not investigate the sensitivity of prediction accuracy to ELM's input parameters, so the influence of the input parameters was then analysed to further improve the model results. The robustness of ELM's performance is demonstrated, using four goodness-of-fit measures, with respect to the number of lagged input variables, the number of hidden nodes, higher lead-day predictions, and extrapolation capability. The results show that (1) ELM yields reasonable results with all combinations of lagged input variables (flows) for 1-day-lead prediction, with minimum errors obtained when 4-day lagged flows were used as input; (2) ELM produces satisfactory results very rapidly for any number of hidden nodes from ten to six thousand; the training time varies from less than a second to two minutes, as only a single iteration is required, and a larger number of hidden nodes generally gives slightly better results; (3) ELM generates reasonable results for higher lead days (second and third); (4) ELM is able to extrapolate when the highest-magnitude input variables are excluded from the training dataset; and (5) ELM is computationally much faster than, and produces better results than, GP and EC-SVM for predicting flow series from the same catchment. This demonstrates ELM's potential for forecasting real-time hydrological time series. This analysis was based on the node-based ELM (NELM) method; in the subsequent analysis, ELM's performance is further improved by introducing a kernel function (KELM) in the learning process.
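The one-iteration training that makes ELM so fast follows directly from its structure: the hidden layer is random and only the output weights are solved, via a pseudoinverse. A minimal sketch, assuming a sigmoid activation (the thesis does not specify one here):

```python
# Minimal node-based ELM (NELM) regressor: random single hidden layer,
# output weights solved analytically in one step with a pseudoinverse.
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer outputs
    beta = np.linalg.pinv(H) @ y                     # analytic output weights
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

For flow forecasting, each row of `X` would hold the lagged flows (e.g. the previous four days) and `y` the next day's flow; there is no iterative weight tuning, which is why training takes seconds even with thousands of hidden nodes.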
    In addition to node-based ELM, kernel-based ELM (KELM) is also applied, and its performance compared against hidden-node-based ELM (NELM). The predictive capabilities of both NELM and KELM were investigated using data from three catchments in three different climatic regions (the Tryggevælde catchment, Denmark; the Mississippi River at Vicksburg, USA; and the Duckmaloi Weir catchment, Australia). The results were compared with those obtained with Genetic Programming (GP) and evolutionary computation based Support Vector Machine (EC-SVM), the latter taken from the literature. The results show that KELM's predictions were better than those of NELM, GP, and EC-SVM, and that KELM ran faster than any other model. ELM's fast learning from a training dataset means that it is well suited to on-line and real-time applications where quick processing time is important or vital. The study demonstrates ELM's ability for rapid prediction and its potential application in real-time forecasting and in water resources planning and management.
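In KELM, the random hidden layer is replaced by a kernel matrix and the output weights again have a closed form, alpha = (K + I/C)^(-1) y in the standard formulation. The RBF kernel and the regularization constant C below are illustrative choices, not the thesis's settings.

```python
# Sketch of kernel-based ELM (KELM) regression with an RBF kernel.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_train(X, y, C=100.0, gamma=1.0):
    K = rbf_kernel(X, X, gamma)
    # Closed-form output weights: (K + I/C)^-1 y
    alpha = np.linalg.solve(K + np.eye(len(X)) / C, y)
    return X, alpha, gamma

def kelm_predict(model, Xnew):
    Xtrain, alpha, gamma = model
    return rbf_kernel(Xnew, Xtrain, gamma) @ alpha
```

Unlike NELM, no hidden-node count has to be chosen, which removes one source of tuning and helps explain the more stable accuracy reported above.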

    Blockchain-based Security Framework for Critical Industry 4.0 Cyber-physical System

    There has been intense concern for security alternatives because of the recent rise of cyber attacks, mainly targeting critical systems such as industrial, medical, or energy ecosystems. Although the latest industry infrastructures depend largely on AI-driven maintenance, prediction based on corrupted data can result in loss of life and capital, and an inadequate data-protection mechanism can readily compromise the security and reliability of the network. The shortcomings of conventional cloud- or trusted-certificate-driven techniques have motivated us to present a unique Blockchain-based framework for a secure and efficient Industry 4.0 system. The demonstrated framework obviates the long-established certificate authority by enhancing a consortium Blockchain, which reduces data processing delay and increases cost-effective throughput. The distributed Industry 4.0 security model entails cooperative trust rather than dependence on a single party, which would incur the cost and threat of a single point of failure. The multi-signature technique of the proposed framework therefore accomplishes multi-party authentication, confirming its applicability to real-time and collaborative cyber-physical systems.
    Comment: 7 pages, 4 figures, IEEE Communications Magazine
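The multi-party authentication idea can be illustrated with a toy threshold check: a block is accepted only once enough consortium members have signed its hash. HMAC stands in for real digital signatures here, and every name and threshold is our illustrative assumption, not the paper's protocol.

```python
# Toy multi-signature acceptance check for a consortium blockchain.
import hashlib
import hmac

def sign(member_key: bytes, block_hash: bytes) -> bytes:
    # HMAC is a placeholder for an asymmetric signature scheme.
    return hmac.new(member_key, block_hash, hashlib.sha256).digest()

def verify_multisig(block_hash, signatures, member_keys, threshold):
    """signatures: {member_id: sig}; accept if >= threshold are valid."""
    valid = sum(
        1 for member, sig in signatures.items()
        if member in member_keys
        and hmac.compare_digest(sig, sign(member_keys[member], block_hash))
    )
    return valid >= threshold
```

Because acceptance needs several independent signers, compromising any single member neither forges a block nor halts the system, which is the single-point-of-failure argument made above.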