
    Realistic Traffic Generation for Web Robots

    Realistic web traffic generators are critical to evaluating the capacity, scalability, and availability of web systems. Although web traffic generation is a classic research problem, no existing generator accounts for the characteristics of web robots or crawlers, which are now the dominant source of traffic to a web server. Administrators are thus unable to test, stress, and evaluate how their systems perform in the face of ever-increasing levels of web robot traffic. To resolve this problem, this paper introduces a novel approach to generating synthetic web robot traffic with high fidelity. It generates traffic that accounts for both the temporal and behavioral qualities of robot traffic through statistical and Bayesian models fitted to the properties of robot traffic observed in web logs from North America and Europe. We evaluate our traffic generator by comparing the characteristics of generated traffic to those of the original data, examining session arrival rates, inter-arrival times, and session lengths in both generated and real traffic. Finally, we show that our generated traffic affects cache performance similarly to actual traffic under the common LRU and LFU eviction policies.
    Comment: 8 pages
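The cache-based evaluation the abstract describes can be illustrated with a small sketch: replay a request stream through an LRU cache and compare hit ratios between generated and real traces. This is a minimal illustration, not the paper's implementation; the request stream and capacity are made up.

```python
from collections import OrderedDict

def lru_hit_ratio(requests, capacity):
    """Replay a request stream through an LRU cache and return the hit ratio."""
    cache = OrderedDict()
    hits = 0
    for key in requests:
        if key in cache:
            hits += 1
            cache.move_to_end(key)  # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used entry
            cache[key] = True
    return hits / len(requests)

# Hypothetical skewed stream: popular resources repeat, as in robot traffic.
stream = ["a", "b", "a", "c", "a", "b", "d", "a", "e", "b"]
print(lru_hit_ratio(stream, capacity=3))  # → 0.4
```

Running the same function over a synthetic trace and a real log, across a range of capacities, gives hit-ratio curves whose closeness is one fidelity measure for the generator.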

    Statistical Traffic State Analysis in Large-scale Transportation Networks Using Locality-Preserving Non-negative Matrix Factorization

    Statistical traffic data analysis is a hot topic in traffic management and control. Current research in this field focuses on analyzing traffic flows of individual links or local regions in a transportation network; less attention is paid to the global view of traffic states over the entire network, which is important for modeling large-scale traffic scenes. Our aim is to propose a new methodology for extracting spatio-temporal traffic patterns, ultimately for modeling large-scale traffic dynamics and long-term traffic forecasting. We attack this issue by utilizing Locality-Preserving Non-negative Matrix Factorization (LPNMF) to derive a low-dimensional representation of network-level traffic states. Clustering is performed on the compact LPNMF projections to unveil typical spatial patterns and temporal dynamics of network-level traffic states. We have tested the proposed method on simulated traffic data generated for a large-scale road network, and the reported experimental results validate the ability of our approach to extract meaningful large-scale space-time traffic patterns. Furthermore, the derived clustering results provide an intuitive understanding of the spatial-temporal characteristics of traffic flows in the large-scale network, and a basis for potential long-term forecasting.
    Comment: IET Intelligent Transport Systems (2013)
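The pipeline described above (factorize network-level traffic snapshots, then group the low-dimensional projections) can be sketched with plain multiplicative-update NMF; this omits the locality-preserving penalty that distinguishes LPNMF, and the toy data, dimensions, and dominant-component labeling are all assumptions, not the paper's setup.

```python
import numpy as np

def nmf(X, k, iters=200, eps=1e-9):
    """Plain multiplicative-update NMF: X ≈ W @ H with W, H >= 0.
    (Stands in for LPNMF; it lacks the locality-preserving term.)"""
    rng = np.random.default_rng(0)
    W = rng.random((X.shape[0], k))
    H = rng.random((k, X.shape[1]))
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy data: rows = time snapshots, columns = flows on a 50-link network.
rng = np.random.default_rng(1)
X = rng.random((200, 50))

W, H = nmf(X, k=5)
# Crude stand-in for clustering the compact projections: label each
# snapshot by its dominant latent traffic pattern.
labels = W.argmax(axis=1)
```

Each row of `H` is a spatial pattern over links, and `labels` groups time snapshots into recurring network-level states, the structure the clustering step is meant to reveal.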

    Analyzing Network Traffic for Malicious Hacker Activity

    Since the Internet came into being in the 1970s, it has grown by more than 100% every year. Solutions for detecting network intrusion, on the other hand, have been far outpaced. The economic impact of malicious attacks, in lost revenue to a single e-commerce company, can range from 66 thousand to 53 million US dollars. At the same time, no effective mathematical model is widely available to distinguish anomalous network behaviours such as port scanning, system exploring, and virus and worm propagation from normal traffic. PDS, proposed by Random Knowledge Inc., detects and localizes traffic patterns consistent with attacks hidden within large amounts of legitimate traffic. With the network's packet traffic stream as its input, PDS relies on high-fidelity models of normal traffic against which it can critically judge the legitimacy of any substream of packet traffic. Because of this reliance on an accurate baseline model of normal network traffic, in this workshop we concentrate on modelling normal network traffic with a Poisson process.
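A minimal sketch of how a homogeneous-Poisson baseline can flag anomalous substreams: if normal traffic arrives at roughly a constant rate, the packet count in a fixed window is Poisson-distributed, and a window whose count has a tiny tail probability under that model is suspicious. The counts, rate, and threshold below are hypothetical, and this is far simpler than the PDS system the abstract describes.

```python
import math

def poisson_sf(k, lam):
    """P(N >= k) for N ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

def flag_windows(counts, rate_per_window, alpha=1e-3):
    """Flag windows whose packet count is improbably high under the
    homogeneous-Poisson baseline (rate_per_window expected packets)."""
    return [i for i, c in enumerate(counts)
            if poisson_sf(c, rate_per_window) < alpha]

# Hypothetical per-second packet counts with a baseline of 10 packets/s;
# the 55 models a scan-like burst.
counts = [9, 11, 10, 12, 8, 55, 10, 9]
print(flag_windows(counts, rate_per_window=10.0))  # → [5]
```

Ordinary fluctuations (8–12 packets against a mean of 10) survive the test, while the burst is flagged; in practice the rate itself would be estimated from observed inter-arrival times.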

    CARBOTRAF: A decision Support system for reducing pollutant emissions by adaptive traffic management

    Traffic congestion with frequent “stop & go” situations causes substantial pollutant emissions. Black carbon (BC) is a good indicator of combustion-related air pollution and has negative health effects. Both BC and CO2 emissions are also known to contribute significantly to global warming. Current traffic control systems are designed to improve traffic flow and reduce congestion. The CARBOTRAF system combines real-time monitoring of traffic and air pollution with simulation models for emission and local air-quality prediction in order to deliver on-line recommendations for alternative adaptive traffic management. The aim of introducing a CARBOTRAF system is to reduce BC and CO2 emissions and improve air quality by optimizing traffic flows. The system is implemented and evaluated in two pilot cities, Graz and Glasgow. Model simulations link traffic states to emission and air-quality levels: a chain of models combines micro-scale traffic simulations, traffic volumes, emission models, and air-quality simulations. This process is completed for several ITS scenarios and a range of traffic boundary conditions. The real-time DSS uses all these model simulations to select optimal traffic and air-quality scenarios. Traffic and BC concentrations are simultaneously monitored. In this paper the effects of ITS measures on air quality are analysed, with a focus on BC.
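The decision-support step can be sketched as a lookup over precomputed model-chain results: for the currently observed traffic condition, pick the ITS scenario with the best predicted emissions. The scenario names, predicted values, and score weights below are entirely invented for illustration and are not CARBOTRAF's actual scenarios or model outputs.

```python
# Hypothetical precomputed model-chain results:
# (ITS scenario, traffic condition) -> predicted (BC, CO2) levels.
PREDICTIONS = {
    ("baseline",   "congested"): (8.5, 1200.0),
    ("green_wave", "congested"): (6.1, 1050.0),
    ("gating",     "congested"): (7.0,  980.0),
    ("baseline",   "free_flow"): (3.2,  700.0),
    ("green_wave", "free_flow"): (3.0,  690.0),
    ("gating",     "free_flow"): (3.4,  720.0),
}

def recommend(condition, w_bc=0.7, w_co2=0.3):
    """Pick the ITS scenario minimising a weighted BC/CO2 score
    for the currently observed traffic condition."""
    scores = {s: w_bc * bc + w_co2 * co2 / 100.0
              for (s, c), (bc, co2) in PREDICTIONS.items()
              if c == condition}
    return min(scores, key=scores.get)

print(recommend("congested"))  # → green_wave
```

Precomputing the simulation chain off-line and reducing the on-line step to a table lookup is what makes real-time recommendation feasible, since the micro-scale traffic and air-quality models are too slow to run per decision.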