Mathematics in Internet Traffic Data Analysis
Internet traffic data have been found to possess extreme variability and bursty structure across a wide range of time-scales, so that there is no definite duration of busy or silent periods. There is, however, a self-similarity through which the data can be characterized. This self-similar nature was first proposed by Leland et al. [1] and subsequently established by others in a flood of research on the subject [2]-[5]. It was then a new concept, running against the long-held idea of Poisson traffic. The traditional Poisson model, a short-range-dependent process, assumed the variance of the data flow to be finite, but observations of Internet traffic proved otherwise. It is this large variance that leads to the self-similar nature of the data at almost all scales of resolution. Such a feature is always associated with a fractal structure in the data, and the fractal characteristics can exist on both temporal and spatial scales; Willinger and Paxson [6] attributed this to the extreme variability and long-range dependence in the process. Presently, one of the main research interests in the field of Internet traffic is prediction of the data, which will help a network manager deliver a satisfactory quality of service. Before building a prediction model, one important task is to determine the statistics of the traffic; any model that predicts future values will have to preserve these characteristics.
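The self-similarity described above is commonly quantified by the Hurst exponent H (H = 0.5 for short-range-dependent traffic such as Poisson arrivals, 0.5 < H < 1 for self-similar, long-range-dependent traffic). A minimal sketch of the classic aggregated-variance estimator, run here on synthetic white noise rather than on a real packet trace:

```python
import numpy as np

def hurst_aggregated_variance(x, scales):
    """Estimate the Hurst exponent H via the aggregated-variance
    method: for each aggregation level m, average x over
    non-overlapping blocks of size m and record the variance of the
    aggregated series.  For a self-similar process,
    Var(X^(m)) ~ m^(2H - 2), so a log-log regression of variance
    against m has slope 2H - 2."""
    x = np.asarray(x, dtype=float)
    variances = []
    for m in scales:
        n_blocks = len(x) // m
        blocks = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(blocks.var())
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

# White noise has no long-range dependence, so H should come out near 0.5;
# self-similar traffic traces would give a markedly higher estimate.
rng = np.random.default_rng(0)
noise = rng.standard_normal(100_000)
h = hurst_aggregated_variance(noise, scales=[10, 20, 50, 100, 200, 500])
```

Applied to real trace data (packet or byte counts per time bin), an estimate of H well above 0.5 across scales is the usual evidence for the self-similarity the abstract describes.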
A critical look at power law modelling of the Internet
This paper takes a critical look at the usefulness of power law models of the
Internet. The twin focuses of the paper are Internet traffic and topology
generation. The aim of the paper is twofold. Firstly it summarises the state of
the art in power law modelling particularly giving attention to existing open
research questions. Secondly it provides insight into the failings of such
models and where progress needs to be made for power law research to feed
through to actual improvements in network performance.
Comment: To appear in Computer Communication
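As background to the modelling this paper critiques: a power-law tail P(X > x) ~ x^(-alpha) is often fitted, as a first pass, by least squares on the log-log empirical CCDF. A minimal sketch on synthetic Pareto data (log-log regression is a known-biased estimator and maximum-likelihood fitting is preferred in practice; this only illustrates the idea):

```python
import numpy as np

def fit_powerlaw_tail(samples, xmin):
    """Fit the tail exponent alpha of P(X > x) ~ x^(-alpha) by
    ordinary least squares on the log-log empirical CCDF of the
    samples at or above xmin."""
    tail = np.sort(samples[samples >= xmin])
    # Empirical CCDF: fraction of tail samples exceeding each value.
    ccdf = 1.0 - np.arange(len(tail)) / len(tail)
    slope = np.polyfit(np.log(tail), np.log(ccdf), 1)[0]
    return -slope

# Synthetic Pareto samples with true tail exponent alpha = 1.5,
# generated by inverse-transform sampling: X = U^(-1/alpha).
rng = np.random.default_rng(1)
samples = (1.0 / rng.random(50_000)) ** (1.0 / 1.5)
alpha = fit_powerlaw_tail(samples, xmin=1.0)
```

The choice of xmin, the estimator, and the goodness-of-fit test all materially affect the conclusion, which is part of why power-law claims about Internet data attract the scrutiny this paper gives them.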
Traffic measurement and analysis
Measurement and analysis of real traffic is important to gain knowledge
about the characteristics of the traffic. Without measurement, it is
impossible to build realistic traffic models. Only recently was data
traffic found to have self-similar properties. In this thesis work,
traffic captured on the network at SICS and on the Supernet is shown to
have this fractal-like behaviour. The traffic is also examined with
respect to which protocols and packet sizes are present and in what
proportions. In the SICS trace most packets are small, TCP is shown to be
the predominant transport protocol and NNTP the most common application.
In contrast to this, large UDP packets sent between non-well-known ports
dominate the Supernet traffic. Finally, characteristics of the client
side of the WWW traffic are examined more closely. In order to extract
useful information from the packet trace, web browsers' use of TCP and HTTP
is investigated including new features in HTTP/1.1 such as persistent
connections and pipelining. Empirical probability distributions are
derived describing session lengths, time between user clicks and the
amount of data transferred due to a single user click. These probability
distributions make up a simple model of WWW-sessions.
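Empirical distributions of the kind derived here can be turned into a generative session model by inverse-transform sampling from the empirical CDF. A minimal sketch; the inter-click gaps below are made-up illustrative values, not the thesis's measurements:

```python
import numpy as np

# Hypothetical measured inter-click gaps in seconds (the thesis derives
# the real distributions from packet traces; these are placeholders).
observed_gaps = np.array([1.2, 3.4, 0.8, 15.0, 2.1, 7.5, 4.4, 0.5, 22.0, 3.3])

def sample_empirical(observed, n, rng):
    """Inverse-transform sampling from the empirical CDF: sort the
    observations, assign them evenly spaced quantiles, then draw
    uniform random quantiles and interpolate linearly between
    adjacent order statistics."""
    sorted_obs = np.sort(observed)
    quantiles = np.arange(len(sorted_obs)) / (len(sorted_obs) - 1)
    u = rng.random(n)
    return np.interp(u, quantiles, sorted_obs)

rng = np.random.default_rng(2)
synthetic = sample_empirical(observed_gaps, 1000, rng)
```

The same mechanism works for the session-length and bytes-per-click distributions, which together suffice to drive a synthetic WWW-traffic generator.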
The Methods to Improve Quality of Service by Accounting Secure Parameters
A solution to the problem of ensuring quality of service, providing a greater
number of services with higher efficiency while taking network security into
account, is proposed. In this paper, experiments were conducted to analyze the
effect of self-similarity and attacks on quality-of-service parameters. A
method of buffering and channel-capacity control and a method of calculating
routing cost in the network, both of which take into account the parameters of
traffic multifractality and the probability of detecting attacks in
telecommunications networks, are proposed. Both proposed methods account for
the given restrictions on delay time and the number of lost packets for every
quality-of-service traffic type. During simulation, the parameters of the
transmitted traffic (self-similarity, intensity) and the parameters of the
network (current channel load, node buffer size) were varied and the maximum
allowable network load was determined. The results of the analysis show that
the overload occurring when traffic is transmitted over a switched channel is
associated with the multifractal characteristics of the traffic and the
presence of attacks. It was shown that the proposed methods can reduce data
loss and improve the efficiency of network resources.
Comment: 10 pages, 1 figure, 1 equation, 1 table. arXiv admin note: text
overlap with arXiv:1904.0520
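The load/buffer experiment described above can be caricatured with a discrete-time finite-buffer queue: once offered load exceeds the service rate, the loss fraction rises sharply. A deliberately simplified sketch (Bernoulli arrivals stand in for the paper's multifractal traffic, and there is no attack model):

```python
import random

def simulate_finite_buffer(arrival_prob, service_prob, buffer_size,
                           steps, seed=3):
    """Discrete-time single-server queue with a finite buffer: in each
    step one packet may arrive (with probability arrival_prob) and one
    may depart (with probability service_prob); arrivals to a full
    buffer are dropped.  Returns the fraction of arrivals lost."""
    rng = random.Random(seed)
    queue, arrived, lost = 0, 0, 0
    for _ in range(steps):
        if rng.random() < arrival_prob:
            arrived += 1
            if queue < buffer_size:
                queue += 1
            else:
                lost += 1
        if queue > 0 and rng.random() < service_prob:
            queue -= 1
    return lost / max(arrived, 1)

# Below the service rate, losses are rare; above it, they grow quickly.
light = simulate_finite_buffer(0.3, 0.5, buffer_size=8, steps=200_000)
heavy = simulate_finite_buffer(0.6, 0.5, buffer_size=8, steps=200_000)
```

Bursty, multifractal traffic makes this picture worse than the Bernoulli case: losses appear well below the mean-rate threshold, which is why the paper's methods condition buffering and routing cost on the multifractality parameters.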