1,865 research outputs found

    Influence of multilayer traffic engineering timing parameters on network performance

    Recent advances in optical networking technology have moved the state of the art from manually installed fiber connections to fully automatic switched lightpaths. Multilayer Traffic Engineering (MTE) in an IP-over-optical network makes it possible to leverage rapid lightpath setup and teardown as a cross-layer traffic engineering technique. It enables on-the-fly reconfiguration of the IP layer logical topology and upgrades or downgrades of IP link capacity. Together with classical IP layer routing techniques, MTE can relieve problems such as IP layer congestion and packet loss, and it may optimize optical layer capacity usage and total network throughput. However, the rate at which MTE can make adjustments to the network is limited by technology and stability concerns. We present some example MTE techniques and discuss how the timing parameters of these mechanisms impact perceived network performance.
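As an illustration of how such timing parameters interact, the sketch below simulates a single IP link whose capacity is raised or lowered in lightpath-sized steps in response to sustained load changes. The thresholds, hold time, and setup delay are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of an MTE-style capacity controller: an extra lightpath is
# set up after utilization stays high for `hold_time` steps (taking effect
# after `setup_delay`), and torn down after a sustained idle period.
# All parameter values are illustrative assumptions.

def simulate_mte(load, capacity=10, step=10, setup_delay=2, hold_time=3):
    """Return (final capacity, total traffic dropped) for a demand trace."""
    dropped = 0
    high_since = idle_since = None
    pending = None  # (activation_time, new_capacity) for an in-flight setup
    for t, demand in enumerate(load):
        if pending and t >= pending[0]:
            capacity = pending[1]          # lightpath setup completes
            pending = None
        if demand > capacity:
            dropped += demand - capacity   # congestion: excess traffic lost
        util = demand / capacity
        if util > 0.8:                     # sustained overload timer
            idle_since = None
            if high_since is None:
                high_since = t
            if pending is None and t - high_since >= hold_time:
                pending = (t + setup_delay, capacity + step)
                high_since = None
        elif util < 0.3 and capacity > step:   # sustained idle timer
            high_since = None
            if idle_since is None:
                idle_since = t
            if t - idle_since >= hold_time:
                capacity -= step           # tear down the extra lightpath
                idle_since = None
        else:
            high_since = idle_since = None
    return capacity, dropped
```

Shorter hold times react faster to congestion (less traffic dropped) but risk oscillation; the setup delay bounds how quickly any reaction can take effect.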

    Development and evaluation of cooperative intersection management algorithm under connected vehicles environment

    Recent technological advancements in the automotive and transportation industry have established a firm foundation for the development and implementation of various automated and connected vehicle (C/AV) solutions around the globe. Wireless communication technologies such as the dedicated short-range communication (DSRC) protocol enable instantaneous information exchange between vehicles and infrastructure. Such information exchange promises tremendous benefits, from automating conventional traffic streams to enhancing existing signal control strategies. While many promising studies in the area of signal control in a connected vehicle (CV) environment have been introduced, they mainly offer solutions designed to operate a single isolated intersection, or they require high technology penetration rates to operate safely and efficiently. Applications designed to operate on a signalized corridor with imperfect market penetration rates of connected vehicle technology represent a bridge between the conventional traffic control paradigm and the fully automated corridors of the future. Assuming utilization of the connected vehicle environment and vehicle-to-infrastructure (V2I) technology, all vehicular and signal-related parameters are known and can be shared with the control agent, which controls automated vehicles while improving the mobility of the signalized corridor. This dissertation research introduces an intersection management strategy for a corridor with automated vehicles that uses a vehicular trajectory-driven optimization method. Trajectory-driven Optimization for Automated Driving (TOAD) provides an optimal trajectory for automated vehicles while maintaining safe and uninterrupted movement of general traffic, consisting of regular unequipped vehicles. Signal status parameters such as cycle length and splits are continuously captured, and at the same time vehicles share their position information with the control agent. Both inputs are then used by the control algorithm to provide optimal trajectories for automated vehicles, reducing vehicle delay along a signalized corridor with fixed-time signal control. To determine the most efficient trajectory for automated vehicles, an evolutionary-based optimization is utilized. The influence of prevailing traffic conditions is incorporated into the control algorithm using conventional data collection methods such as loop detectors and Bluetooth or Wi-Fi sensors to collect vehicle counts, travel time on corridor segments, and spot speed. Moreover, a short-term artificial intelligence prediction model is developed to allow reasonable deployment of data collection devices and provide accurate vehicle delay predictions, producing realistic and highly efficient longitudinal vehicle trajectories. Concept evaluation through microsimulation reveals significant mobility improvements compared to contemporary corridor management approaches. The results for selected test-bed locations on signalized arterials in New Jersey reveal up to a 19.5% reduction in overall corridor travel time, depending on the market penetration and lane configuration scenario. Operational scenarios that allow reserved lanes for the movement of automated vehicles further increase the effectiveness of the proposed algorithm. In addition, the proposed control algorithm is feasible under imperfect C/AV market penetrations, showing mobility improvements even at low market penetration rates.
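The evolutionary trajectory search described above can be sketched, in highly simplified form, as a genetic algorithm that picks an approach speed letting a vehicle reach a fixed-time signal on green instead of stopping. The signal timing, speed bounds, and stop penalty below are illustrative assumptions, not TOAD's actual model.

```python
import random

# Simplified stand-in for a trajectory cost: extra travel time versus free
# flow, where arriving on red means waiting for the next green plus an
# assumed stop/restart penalty. All numbers are illustrative.

def signal_cost(speed, dist=480.0, cycle=60.0, green_end=30.0,
                free_speed=15.0, stop_penalty=20.0):
    arrival = dist / speed
    phase = arrival % cycle
    if phase < green_end:                  # green starts each cycle at t = 0
        return arrival - dist / free_speed
    return arrival + (cycle - phase) + stop_penalty - dist / free_speed

def evolve(pop_size=30, gens=40, seed=1):
    """Tiny genetic algorithm over cruise speeds in [5, 15] m/s."""
    rng = random.Random(seed)
    pop = [rng.uniform(5.0, 15.0) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=signal_cost)
        elite = pop[: pop_size // 3]       # selection: keep the best third
        pop = elite + [                    # mutation: jitter an elite parent
            min(15.0, max(5.0, rng.choice(elite) + rng.gauss(0.0, 0.5)))
            for _ in range(pop_size - len(elite))
        ]
    return min(pop, key=signal_cost)
```

In this toy landscape, driving at free speed arrives on red and pays the stop penalty, so the search settles on a slower cruise speed that glides through on green, which is the intuition behind trajectory-driven delay reduction.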

    Using Machine Learning for Handover Optimization in Vehicular Fog Computing

    Smart mobility management will be an important prerequisite for future fog computing systems. In this research, we propose a learning-based handover optimization for the Internet of Vehicles that assists the smooth transition of device connections and offloaded tasks between fog nodes. To accomplish this, we use machine learning algorithms to learn from vehicle interactions with fog nodes. Our approach uses a three-layer feed-forward neural network to predict the correct fog node at a given location and time with 99.2% accuracy on a test set. We also implement a dual stacked recurrent neural network (RNN) with long short-term memory (LSTM) cells capable of learning the latency, or cost, associated with these service requests. We build a simulation in JAMScript that replays a dataset of real-world vehicle movements to generate the data used to train these networks. Through a series of experiments on a held-out test set, we further show how this predictive system can support a smarter request routing mechanism that minimizes service interruption during handovers between fog nodes and anticipates areas of low coverage.
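A minimal sketch of the kind of three-layer feed-forward classifier described above, trained here on synthetic positions to pick the nearer of two hypothetical fog nodes. The data, node locations, layer sizes, and training setup are assumptions for illustration, not the authors' network or dataset.

```python
import numpy as np

# Synthetic task: given a 2-D vehicle position, predict the nearer of two
# assumed fog node locations. "Three-layer" = input, one hidden, output.
rng = np.random.default_rng(0)
NODES = np.array([[0.0, 0.0], [1.0, 1.0]])         # hypothetical fog nodes

X = rng.uniform(0, 1, size=(400, 2))               # vehicle positions
y = np.linalg.norm(X[:, None] - NODES, axis=2).argmin(axis=1)

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

for _ in range(600):                               # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)                       # hidden layer
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(1, keepdims=True))
    p /= p.sum(1, keepdims=True)                   # softmax probabilities
    grad = p.copy()                                # d(cross-entropy)/d(logits)
    grad[np.arange(len(y)), y] -= 1
    grad /= len(y)
    dW2 = h.T @ grad; db2 = grad.sum(0)
    dh = grad @ W2.T * (1 - h ** 2)                # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    for P, G in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        P -= 0.5 * G

acc = ((np.tanh(X @ W1 + b1) @ W2 + b2).argmax(1) == y).mean()
```

On this linearly separable toy problem the network fits almost perfectly; the paper's 99.2% figure concerns its own, far richer, location-and-time dataset.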

    Intrusion detection in IoT networks using machine learning

    The exponential growth of Internet of Things (IoT) infrastructure has introduced significant security challenges due to the large-scale deployment of interconnected devices. IoT devices are present in every aspect of modern life; they are essential components of Industry 4.0, smart cities, and critical infrastructure. Detecting attacks on this platform therefore requires Intrusion Detection Systems (IDS): dedicated hardware devices or software that monitor a network and automatically raise alerts on malicious activity. This study assessed the viability of machine learning models for IDS within IoT infrastructures. Five classifiers were analysed, spanning linear models (Logistic Regression), tree algorithms (Decision Trees), probabilistic models (Gaussian Naïve Bayes), ensemble methods (Random Forest), and artificial neural networks (Multi-Layer Perceptron). These models were trained with supervised methods on a public IoT attack dataset on three tasks: binary classification (determining whether a sample was part of an attack), multiclass classification into 8 attack categories, and multiclass classification into 33 individual attacks. Various metrics were considered, from detection performance to execution times, and all models were trained and tuned using 10-fold cross-validation. On all three classification tasks, Random Forest was the best-performing model, at the expense of longer execution times. Gaussian Naïve Bayes was the fastest algorithm on all classification tasks, but with lower attack detection performance, whereas Decision Trees showed a good balance between performance and processing speed. When classifying among the 8 attack categories, most models showed vulnerabilities to specific attack types, especially those in minority classes, due to dataset imbalance. In the more granular classification into 33 attack types, all models faced challenges, but Random Forest remained the most reliable despite its vulnerabilities. In conclusion, machine learning algorithms prove effective for IDS in IoT infrastructure, with Random Forest being the most robust model and Decision Trees offering a good balance between speed and performance.
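As a concrete example of one of the compared classifiers, the following is a from-scratch Gaussian Naïve Bayes sketch on synthetic two-feature "benign vs. attack" data. The features and data are invented for illustration and are not drawn from the study's IoT dataset.

```python
import numpy as np

# Synthetic binary IDS data: two assumed features (e.g. packet rate, size),
# with attack traffic shifted away from benign traffic.
rng = np.random.default_rng(42)
benign = rng.normal([0, 0], 1.0, size=(300, 2))
attack = rng.normal([3, 3], 1.0, size=(300, 2))
X = np.vstack([benign, attack])
y = np.array([0] * 300 + [1] * 300)

def fit(X, y):
    """Per-class feature means, variances, and priors (the whole GNB model)."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(0), Xc.var(0) + 1e-9, len(Xc) / len(X))
    return stats

def predict(stats, X):
    """Pick the class with the highest Gaussian log-likelihood + log-prior."""
    scores = []
    for mu, var, prior in stats.values():
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var).sum(1)
        scores.append(ll + np.log(prior))
    return np.array(list(stats)).take(np.argmax(scores, axis=0))

idx = rng.permutation(len(X))
split = len(X) // 2
tr, te = idx[:split], idx[split:]
acc = (predict(fit(X[tr], y[tr]), X[te]) == y[te]).mean()
```

The model reduces to a handful of per-class means and variances, which is why Gaussian Naïve Bayes was the fastest algorithm in the study, and its independence assumption is one reason its detection performance trailed the tree-based models.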