8 research outputs found

    How to Validate Traffic Generators?

    Network traffic generators are widely used in networking research, and they are validated against a very broad range of metrics (mainly traffic characteristics). In this paper we review the state of the art of these metrics and show that there is no consensus in the research community on how to validate traffic generators or which metrics to choose for validation. This situation makes it extremely difficult to assess validation results and to compare different traffic generators. We advocate research toward a common set of metrics for the validation and comparative evaluation of traffic generators.
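    As a concrete illustration of the kind of metric such a validation typically involves, the sketch below compares a synthetic trace against a reference capture with a two-sample Kolmogorov-Smirnov test on packet inter-arrival times. This metric is not taken from the paper; the trace format and the choice of the KS test are assumptions made purely for illustration.

        # Hedged illustration: compare a synthetic trace against a reference
        # capture using a two-sample Kolmogorov-Smirnov test on packet
        # inter-arrival times (one of many possible validation metrics).
        import numpy as np
        from scipy.stats import ks_2samp

        def interarrival_times(timestamps):
            """Inter-arrival times from packet arrival timestamps (seconds)."""
            return np.diff(np.sort(np.asarray(timestamps, dtype=float)))

        def validate_interarrivals(real_ts, synthetic_ts, alpha=0.05):
            """Return (KS statistic, p-value, pass/fail at level alpha)."""
            stat, p = ks_2samp(interarrival_times(real_ts),
                               interarrival_times(synthetic_ts))
            return stat, p, p > alpha

        # Toy usage with exponential inter-arrivals; a real validation would
        # use captured traces and several complementary metrics.
        rng = np.random.default_rng(0)
        real = np.cumsum(rng.exponential(0.010, 10_000))
        synth = np.cumsum(rng.exponential(0.011, 10_000))
        print(validate_interarrivals(real, synth))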

    On the Statistical Characterization of Flows in Internet Traffic with Application to Sampling

    This paper develops a new method for estimating statistical characteristics of TCP flows in the Internet. For this purpose, a new set of random variables (referred to as observables) is defined. When dealing with sampled traffic, these observables can easily be computed from sampled data. By adopting a convenient, traffic-dependent mouse/elephant dichotomy, it is shown how these variables give a reliable statistical representation of the number of packets transmitted by large flows during successive time intervals of appropriate duration. A mathematical framework is developed to estimate the accuracy of the method. As an application, it is shown how the number of large TCP flows can be estimated when only sampled traffic is available. The proposed algorithm is tested against experimental data collected from different types of IP networks.
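    To make the sampling setting concrete, the sketch below applies independent per-packet sampling to a toy trace and counts flows with at least two sampled packets as candidate elephants. The observables, the estimator, and the accuracy bounds developed in the paper are not reproduced here; the threshold rule and the flow-identifier format are assumptions for illustration only.

        # Hedged sketch of the setting only: independent packet sampling at
        # rate p, followed by a crude "at least k sampled packets" rule for
        # large flows. This is NOT the paper's estimator.
        import random
        from collections import Counter

        def sample_packets(flow_ids, p, seed=0):
            """Keep each packet (identified by its flow id) with probability p."""
            rng = random.Random(seed)
            return [f for f in flow_ids if rng.random() < p]

        def naive_elephant_count(sampled_flow_ids, min_sampled=2):
            """Count flows seen at least `min_sampled` times in the sample."""
            counts = Counter(sampled_flow_ids)
            return sum(1 for c in counts.values() if c >= min_sampled)

        # Toy trace: 20 elephants of 1000 packets, 5000 mice of 3 packets.
        packets = [f"e{i}" for i in range(20) for _ in range(1000)]
        packets += [f"m{i}" for i in range(5000) for _ in range(3)]
        random.Random(1).shuffle(packets)

        sampled = sample_packets(packets, p=0.01)
        print("flows with >= 2 sampled packets:", naive_elephant_count(sampled))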

    Who wrote this scientific text?

    The IEEE bibliographic database contains a number of proven duplications, with an indication of the original paper(s) copied. This corpus is used to test a method for the detection of hidden intertextuality (commonly called "plagiarism"). Intertextual distance, combined with a sliding window and various classification techniques, identifies these duplications with a very low risk of error. These experiments also show that several factors blur the identity of the scientific author, including variable group authorship and the high level of intertextuality that is accepted, and sometimes desired, in scientific papers on the same topic.
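    The sketch below shows a simplified word-frequency distance computed over sliding windows, the general mechanism the abstract refers to. It is not the paper's exact intertextual-distance definition, and the classification stage is omitted; window size, step, and normalisation are assumptions for illustration.

        # Hedged sketch: a simplified lexical distance between two equal-size
        # word windows, d(A, B) = sum_w |f_A(w) - f_B(w)| / (|A| + |B|),
        # scanned over a suspect text with a sliding window. Values near 0
        # indicate near-identical vocabulary use (possible duplication).
        from collections import Counter

        def window_distance(window_a, window_b):
            fa, fb = Counter(window_a), Counter(window_b)
            total = len(window_a) + len(window_b)
            return sum(abs(fa[w] - fb[w]) for w in fa.keys() | fb.keys()) / total

        def sliding_scan(suspect_words, reference_window, step=100):
            """Distance of every suspect window (same size as the reference)."""
            w = len(reference_window)
            return [(i, window_distance(suspect_words[i:i + w], reference_window))
                    for i in range(0, len(suspect_words) - w + 1, step)]

        # Toy usage: identical windows score 0.0, disjoint vocabularies 1.0.
        print(window_distance("a b c a".split(), "a b c a".split()))
        print(window_distance("a b c d".split(), "e f g h".split()))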

    L'intertextualité dans les publications scientifiques [Intertextuality in scientific publications]

    The IEEE bibliographic database contains a number of proven duplications, with an indication of the originals copied. This corpus is used to test an authorship-attribution method. Combining intertextual distance with a sliding window and various classification techniques identifies these duplications with a very low risk of error. The experiment also shows that several factors blur the identity of the scientific author, notably research collectives of variable composition and a strong dose of intertextuality that is accepted, and even sought after.

    Workload Modeling for Computer Systems Performance Evaluation


    Using LiTGen, a realistic IP traffic model, to evaluate the impact of burstiness on performance

    For practical reasons, network simulators have to rely on traffic models that are as realistic as possible. This paper presents the evaluation of LiTGen, a realistic IP traffic model, for the generation of IP traffic with accurate time-scale properties and performance. We confront LiTGen with real data traces using two evaluation methods, which allow us to observe, respectively, the causes and the consequences of traffic burstiness. Using a wavelet spectrum analysis, we first highlight the intrinsic characteristics of the traffic and show LiTGen's ability to accurately reproduce the correlation structures of the captured traffic over a wide range of timescales. Then, a performance analysis based on simulations quantifies the impact of these characteristics on a simple queuing system and demonstrates LiTGen's ability to generate synthetic traffic leading to realistic performance. Finally, we investigate a possible model reduction based on memoryless assumptions.
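    The queuing experiment the abstract describes can be mimicked, in a much reduced form, with the Lindley recursion for waiting times in a single FIFO server: W_{n+1} = max(0, W_n + S_n - T_{n+1}), with S_n the service time and T_{n+1} the next inter-arrival time. The sketch below feeds the recursion with two arrival processes of equal mean rate, one memoryless and one crudely bursty; it is not LiTGen, and the ON/OFF parameters are assumptions chosen only to illustrate the impact of burstiness.

        # Hedged illustration (not LiTGen): mean waiting time of a single FIFO
        # server under memoryless vs. bursty arrivals with the same mean rate,
        # via the Lindley recursion W_{n+1} = max(0, W_n + S_n - T_{n+1}).
        import numpy as np

        def mean_wait(interarrivals, service_time):
            w, total = 0.0, 0.0
            for t in interarrivals:
                w = max(0.0, w + service_time - t)
                total += w
            return total / len(interarrivals)

        rng = np.random.default_rng(42)
        n, service = 100_000, 0.8                     # mean rate 1.0, load 0.8

        poisson = rng.exponential(1.0, n)             # memoryless arrivals
        # Crude ON/OFF burstiness: short and long gaps with the same mean of 1.0.
        bursty = np.where(rng.random(n) < 0.9, 0.2, 8.2)

        print("memoryless mean wait:", round(mean_wait(poisson, service), 2))
        print("bursty     mean wait:", round(mean_wait(bursty, service), 2))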

    Dezentrale, Anomalie-basierte Erkennung verteilter Angriffe im Internet [Decentralized, anomaly-based detection of distributed attacks in the Internet]

    The availability of the Internet, which has become indispensable, is increasingly disrupted by financially motivated, distributed attacks. The goal of this work is their fast and wide-area detection, a necessary prerequisite for effective countermeasures. To this end, new mechanisms are designed for identifying attacks and for decentralized, cross-domain cooperation among distributed detection systems. In addition, the tools required for a realistic evaluation are developed.
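    As a minimal, generic illustration of anomaly-based detection (the thesis's own identification mechanisms and its cross-domain cooperation scheme are not reproduced here), the sketch below flags measurement intervals whose packet count exceeds a multiple of an exponentially smoothed baseline; the smoothing factor and threshold are assumptions.

        # Hedged, generic sketch of anomaly-based detection: flag intervals
        # whose packet count exceeds `factor` times an EWMA baseline that is
        # updated only on traffic considered normal.
        def ewma_detector(counts, alpha=0.1, factor=3.0):
            baseline = None
            for i, c in enumerate(counts):
                if baseline is None:
                    baseline = float(c)               # initialise on first interval
                    continue
                if c > factor * baseline:
                    yield i, c, baseline              # possible attack interval
                else:
                    baseline = (1 - alpha) * baseline + alpha * c

        # Toy usage: steady traffic around 1000 packets/interval, burst at t=60.
        traffic = [1000 + (i % 7) * 5 for i in range(100)]
        traffic[60] = 5000
        print(list(ewma_detector(traffic)))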