Unsupervised two-class and multi-class support vector machines for abnormal traffic characterization
Although measurement-based real-time traffic classification has received considerable research attention, the timing constraints imposed by high accuracy requirements and by the learning phase of the algorithms employed remain a challenge. In this paper we propose a measurement-based classification framework that exploits unsupervised learning to accurately categorise network anomalies into specific classes. We introduce the combinatorial use of two-class and multi-class unsupervised Support Vector Machines (SVMs) to first distinguish normal from anomalous traffic, and to further classify the latter category into individual groups depending on the nature of the anomaly.
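The two-stage pipeline this abstract describes can be sketched without the SVM machinery. In the toy sketch below, a centroid-plus-radius boundary learned from attack-free traffic stands in for the two-class (one-class-style) unsupervised SVM, and a tiny k-means stands in for the multi-class SVM that subdivides the anomalies; the 2-D feature vectors, the boundary rule, and both stand-in algorithms are illustrative assumptions, not the paper's method.

```python
import math
import random

def fit_normal_profile(samples):
    # Stand-in for the two-class (one-class-style) SVM: learn a centroid
    # and radius from attack-free training traffic.
    dim = len(samples[0])
    centroid = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
    radius = max(math.dist(s, centroid) for s in samples)
    return centroid, radius

def split_traffic(flows, centroid, radius):
    # Stage 1: flows outside the learned boundary are labelled anomalous.
    normal, anomalous = [], []
    for f in flows:
        (normal if math.dist(f, centroid) <= radius else anomalous).append(f)
    return normal, anomalous

def cluster_anomalies(points, k, iters=20, seed=0):
    # Stage 2: subdivide anomalous flows into k groups; plain k-means
    # stands in here for the multi-class unsupervised SVM.
    rng = random.Random(seed)
    dim = len(points[0])
    centers = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[j].append(p)
        for j, g in enumerate(groups):
            if g:
                centers[j] = [sum(p[i] for p in g) / len(g) for i in range(dim)]
    return groups

# Hypothetical 2-D feature vectors (e.g., packet rate, mean packet size).
training = [(-1.0, 0.0), (1.0, 0.0), (0.0, -1.0), (0.0, 1.0)]
centroid, radius = fit_normal_profile(training)
flows = [(0.2, 0.1), (0.0, -0.3),        # normal-looking
         (10.0, 10.0), (10.0, 11.0),     # anomaly type 1
         (-10.0, 5.0), (-10.0, 6.0)]     # anomaly type 2
normal, anomalous = split_traffic(flows, centroid, radius)
groups = cluster_anomalies(anomalous, k=2)
```

The split keeps the two flows near the training centroid as normal and separates the four outliers into their two natural groups.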
Dynamical Patterns of Cattle Trade Movements
Despite their importance for the spread of zoonotic diseases, our
understanding of the dynamical aspects characterizing the movements of farmed
animal populations remains limited as these systems are traditionally studied
as static objects and through simplified approximations. By leveraging the
network science approach, here we are able for the first time to fully analyze
the longitudinal dataset of Italian cattle movements that reports the mobility
of individual animals among farms on a daily basis. The complexity and
inter-relations between topology, function and dynamical nature of the system
are characterized at different spatial and time resolutions, in order to
uncover patterns and vulnerabilities fundamental for the definition of targeted
prevention and control measures for zoonotic diseases. Results show how the
stationarity of statistical distributions coexists with a strong and
non-trivial evolutionary dynamics at the node and link levels, on all
timescales. Traditional static views of the displacement network hide important
patterns of structural changes affecting nodes' centrality and farms' spreading
potential, thus limiting the efficiency of interventions based on partial
longitudinal information. By fully taking into account the longitudinal
dimension, we propose a novel definition of dynamical motifs that is able to
uncover the presence of a temporal arrow describing the evolution of the system
and the causality patterns of its displacements, shedding light on mechanisms
that may play a crucial role in the definition of preventive actions.
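As a toy illustration of the motif idea (the paper's actual dynamical-motif definition is not given in the abstract), the sketch below counts ordered two-step movement chains A → B → C within a bounded time gap. Comparing counts of asymmetric motifs (e.g., divergence versus convergence patterns) on original and time-reversed data is one way the presence of a temporal arrow could be probed; the data and gap are invented.

```python
from collections import defaultdict

def count_chain_motifs(movements, max_gap):
    # movements: (source_farm, dest_farm, day) records. Counts ordered
    # two-step chains A -> B -> C where the second displacement follows
    # the first within max_gap days -- a minimal "dynamical motif".
    by_source = defaultdict(list)
    for src, dst, day in movements:
        by_source[src].append((dst, day))
    chains = 0
    for src, dst, day in movements:
        for dst2, day2 in by_source[dst]:
            # Require strict temporal order and exclude A -> B -> A returns.
            if 0 < day2 - day <= max_gap and dst2 != src:
                chains += 1
    return chains

# Hypothetical daily movement records.
movements = [("A", "B", 1), ("B", "C", 2), ("B", "D", 5), ("C", "A", 3)]
tight = count_chain_motifs(movements, max_gap=2)
loose = count_chain_motifs(movements, max_gap=10)
```

With a two-day gap only the chains A→B→C and B→C→A qualify; widening the gap also admits A→B→D.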
SENATUS: An Approach to Joint Traffic Anomaly Detection and Root Cause Analysis
In this paper, we propose a novel approach, called SENATUS, for joint traffic
anomaly detection and root-cause analysis. Inspired by the concept of a
senate, the key idea of the proposed approach is divided into three stages:
election, voting and decision. At the election stage, a small number of
senator flows are chosen to represent approximately the total (usually huge) set of
traffic flows. In the voting stage, anomaly detection is applied on the senator
flows and the detected anomalies are correlated to identify the most possible
anomalous time bins. Finally in the decision stage, a machine learning
technique is applied to the senator flows of each anomalous time bin to find
the root cause of the anomalies. We evaluate SENATUS using traffic traces
collected from the Pan European network, GEANT, and compare against another
approach which detects anomalies using lossless compression of traffic
histograms. We show the effectiveness of SENATUS in diagnosing anomaly types:
network scans and DoS/DDoS attacks.
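The election and voting stages can be sketched in a few lines. In the sketch below, the volume-based election criterion, the per-flow z-score detector, and the quorum threshold are all illustrative assumptions rather than the paper's actual choices, and the final decision stage (machine-learning root-cause classification) is omitted.

```python
from statistics import mean, stdev

def elect_senators(flows, k):
    # Election: keep the k highest-volume flows as "senator flows".
    # (Volume-based election is an assumption; SENATUS may use another rule.)
    return sorted(flows, key=lambda f: sum(flows[f]), reverse=True)[:k]

def vote_on_time_bins(flows, senators, z=3.0, quorum=2):
    # Voting: each senator flags time bins deviating more than z standard
    # deviations from its own mean; bins reaching the quorum are declared
    # anomalous and would be passed to the decision stage.
    votes = {}
    for f in senators:
        series = flows[f]
        mu, sd = mean(series), stdev(series)
        for t, v in enumerate(series):
            if sd > 0 and abs(v - mu) > z * sd:
                votes[t] = votes.get(t, 0) + 1
    return sorted(t for t, n in votes.items() if n >= quorum)

# Hypothetical per-flow volume time series over 20 time bins.
flows = {"f1": [1.0] * 20, "f2": [1.0] * 20, "f3": [1.0] * 20}
flows["f1"][5] = 100.0  # coordinated spike in time bin 5
flows["f2"][5] = 100.0
senators = elect_senators(flows, k=2)
anomalous_bins = vote_on_time_bins(flows, senators)
```

Here the two high-volume flows are elected, both vote for bin 5, and the quorum singles out that bin for root-cause analysis.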
The Dynamics of Internet Traffic: Self-Similarity, Self-Organization, and Complex Phenomena
The Internet is the most complex system ever created in human history.
Therefore, its dynamics and traffic unsurprisingly take on a rich variety of
complex dynamics, self-organization, and other phenomena that have been
researched for years. This paper is a review of the complex dynamics of
Internet traffic. Departing from normal treatises, we will take a view from
both the network engineering and physics perspectives showing the strengths and
weaknesses as well as insights of both. In addition, many less covered
phenomena such as traffic oscillations, large-scale effects of worm traffic,
and comparisons of the Internet and biological models will be covered.
Comment: 63 pages, 7 figures, 7 tables; submitted to Advances in Complex Systems
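One of the central phenomena such a review covers is self-similarity, usually summarized by the Hurst exponent H. The sketch below implements the classical aggregated-variance estimator (a standard textbook method, not something proposed by this paper): for a self-similar series the variance of block means scales as m^(2H-2), so H can be read off a log-log slope. The synthetic input is illustrative.

```python
import math
import random
from statistics import mean, variance

def hurst_aggregated_variance(series, block_sizes):
    # Aggregated-variance method: compute the variance of non-overlapping
    # block means for several block sizes m, then fit the log-log slope;
    # the slope equals 2H - 2 for a self-similar process.
    xs, ys = [], []
    for m in block_sizes:
        blocks = [mean(series[i:i + m])
                  for i in range(0, len(series) - m + 1, m)]
        xs.append(math.log(m))
        ys.append(math.log(variance(blocks)))
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    slope = (n * sum(x * y for x, y in zip(xs, ys)) - sx * sy) / \
            (n * sum(x * x for x in xs) - sx ** 2)
    return 1 + slope / 2

# For i.i.d. noise H should be close to 0.5; long-range-dependent
# traffic, as reported in the self-similarity literature, gives H > 0.5.
rng = random.Random(42)
iid = [rng.gauss(0.0, 1.0) for _ in range(20000)]
h = hurst_aggregated_variance(iid, [10, 20, 50, 100, 200])
```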
Building an Emulation Environment for Cyber Security Analyses of Complex Networked Systems
Computer networks are undergoing a phenomenal growth, driven by the rapidly
increasing number of nodes constituting the networks. At the same time, the
number of security threats on Internet and intranet networks is constantly
growing, and the testing and experimentation of cyber defense solutions
requires the availability of separate test environments that best emulate the
complexity of a real system. Such environments support the deployment and
monitoring of complex mission-driven network scenarios, thus enabling the study
of cyber defense strategies under real and controllable traffic and attack
scenarios. In this paper, we propose a methodology that makes use of a
combination of techniques of network and security assessment, and the use of
cloud technologies to build an emulation environment with adjustable degree of
affinity with respect to actual reference networks or planned systems. As a
byproduct, starting from a specific case study, we collected a dataset
consisting of complete network traces comprising benign and malicious traffic,
which is feature-rich and publicly available.
ELASTICITY: Topological Characterization of Robustness in Complex Networks
Just as a herd of animals relies on its robust social structure to survive in
the wild, similarly robustness is a crucial characteristic for the survival of
a complex network under attack. The capacity to measure robustness in complex
networks defines the resolve of a network to maintain functionality in the
event of classical component failures and at the onset of cryptic malicious
attacks. To date, robustness metrics are deficient and unfortunately the
following dilemmas exist: accurate models necessitate complex analysis while
conversely, simple models lack applicability to our definition of robustness.
In this paper, we define robustness and present a novel metric, elasticity: a
bridge between accuracy and complexity, and a link in the chain of network
robustness. Additionally, we explore the performance of elasticity on Internet
topologies and online social networks, and discuss the results.
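The abstract does not give elasticity's formula, so the sketch below uses a related, simpler robustness measure as a stand-in: the giant-component fraction averaged over a highest-degree-first (targeted-attack) removal sequence. The metric, the attack order, and the example graphs are all illustrative assumptions.

```python
def giant_component_fraction(adj, removed):
    # Largest connected component among surviving nodes, as a fraction
    # of the original network size. adj: {node: set(neighbours)}.
    seen, best = set(), 0
    for start in adj:
        if start in removed or start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best / len(adj)

def robustness_index(adj):
    # Average giant-component fraction along a highest-degree-first
    # removal sequence; higher values mean a more attack-tolerant topology.
    order = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
    removed, total = set(), 0.0
    for u in order:
        removed.add(u)
        total += giant_component_fraction(adj, removed)
    return total / len(adj)

# A star collapses once its hub is attacked; a cycle degrades gracefully.
star = {"c": {"1", "2", "3", "4", "5"},
        "1": {"c"}, "2": {"c"}, "3": {"c"}, "4": {"c"}, "5": {"c"}}
cycle = {str(i): {str((i - 1) % 6), str((i + 1) % 6)} for i in range(6)}
r_star, r_cycle = robustness_index(star), robustness_index(cycle)
```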
A New Method for Assessing the Resiliency of Large, Complex Networks
Designing resilient and reliable networks is a principal concern of planners and private firms. Traffic congestion, whether recurring or the result of some aperiodic event, is extremely costly. This paper describes an alternative process and a model for analyzing the resiliency of networks that address some of the shortcomings of more traditional approaches, e.g., the four-step modeling process used in transportation planning. It should be noted that the authors do not view this as a replacement for current approaches but rather as a complementary tool designed to augment analysis capabilities. The process described in this paper for analyzing the resiliency of a network involves at least three steps: (1) assessment or identification of important nodes and links according to different criteria; (2) verification of critical nodes and links based on failure simulations; and (3) consequence assessment. Raster analysis, graph-theory principles, and GIS are used to develop a model for carrying out each of these steps. The methods are demonstrated using two large, interdependent networks for a metropolitan area in the United States.
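The identification and verification steps can be sketched with a plain connectivity criterion. In the sketch below, nodes are scored by the loss of reachable origin-destination pairs when they fail, which is an illustrative choice standing in for the authors' raster/GIS model; the consequence step would weight these losses by demand or cost data, which is omitted here.

```python
def component_sizes(adj, removed=frozenset()):
    # Connected-component sizes of the network after `removed` nodes fail.
    seen, sizes = set(), []
    for start in adj:
        if start in removed or start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in removed and v not in seen:
                    seen.add(v)
                    stack.append(v)
        sizes.append(size)
    return sizes

def reachable_pairs(adj, removed=frozenset()):
    # Ordered origin-destination pairs that can still reach each other.
    return sum(s * (s - 1) for s in component_sizes(adj, removed))

def rank_by_failure_impact(adj):
    # Steps 1-2: score each node by the connectivity lost when it fails
    # (a failure simulation), then rank nodes by that impact.
    base = reachable_pairs(adj)
    impact = {u: base - reachable_pairs(adj, {u}) for u in adj}
    return sorted(impact, key=impact.get, reverse=True)

# Hypothetical network: two clusters joined only through bridge node "G".
net = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "G"},
       "D": {"E", "F", "G"}, "E": {"D", "F"}, "F": {"D", "E"},
       "G": {"C", "D"}}
ranking = rank_by_failure_impact(net)
```

The bridge node ranks first because its failure splits the network in two, exactly the kind of critical link the verification step is meant to confirm.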