5,284 research outputs found
Sharing Computer Network Logs for Security and Privacy: A Motivation for New Methodologies of Anonymization
Logs are one of the most fundamental resources to any security professional.
It is widely recognized by the government and industry that it is both
beneficial and desirable to share logs for the purpose of security research.
However, the sharing is not happening or not to the degree or magnitude that is
desired. Organizations are reluctant to share logs because of the risk of
exposing sensitive information to potential attackers. We believe this
reluctance remains high because current anonymization techniques are weak and
one-size-fits-all--or better put, one size tries to fit all. We must develop
standards and make anonymization available at varying levels, striking a
balance between privacy and utility. Organizations have different needs and
trust other organizations to different degrees. They must be able to map
multiple anonymization levels with defined risks to the trust levels they share
with (would-be) receivers. It is not until there are industry standards for
multiple levels of anonymization that we will be able to move forward and
achieve the goal of widespread sharing of logs for security researchers.
Comment: 17 pages, 1 figure
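The "varying levels" of anonymization the abstract calls for can be illustrated with a small sketch. This is a hypothetical example, not a method from the paper: the level names, the `/24` truncation choice, and the keyed-hash scheme are all assumptions made for illustration.

```python
import hashlib
import ipaddress

def anonymize_ip(ip: str, level: str, secret: bytes = b"site-key") -> str:
    """Anonymize an IPv4 address at one of three illustrative levels,
    trading utility for privacy (names and schemes are hypothetical)."""
    addr = ipaddress.IPv4Address(ip)
    if level == "none":        # full utility, no privacy
        return ip
    if level == "truncate":    # keep the /24 prefix, drop the host octet
        return str(ipaddress.IPv4Address(int(addr) & 0xFFFFFF00))
    if level == "hash":        # keyed hash: consistent across a log, but unlinkable
        digest = hashlib.sha256(secret + addr.packed).digest()
        return str(ipaddress.IPv4Address(int.from_bytes(digest[:4], "big")))
    raise ValueError(f"unknown level: {level}")
```

A sharing organization could map each trust level to one of these levels, e.g. `anonymize_ip("192.0.2.57", "truncate")` yields `192.0.2.0` for a semi-trusted receiver, while a fully trusted receiver gets the raw address.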
Cyber Threat Intelligence: Challenges and Opportunities
The ever-increasing number of cyber attacks requires cyber security and
forensic specialists to detect, analyse and defend against cyber threats in
near real time. In practice, dealing with such a large number of attacks in a
timely manner is not possible without deeply examining the attack features and
taking corresponding intelligent defensive actions; this, in essence, defines
the notion of cyber threat intelligence. However, such intelligence would not
be possible without the aid of artificial intelligence, machine learning and
advanced data mining techniques to collect, analyse and interpret cyber attack
evidence. In this introductory chapter we first discuss the notion of cyber
threat intelligence and its main challenges and opportunities, and then briefly
introduce the chapters of the book, which either address the identified
challenges or present solutions that exploit the opportunities to provide
threat intelligence.
Comment: 5 pages
Parameters of the attenuated schistosome vaccine evaluated in the olive baboon
Five exposures of baboons to the attenuated schistosome vaccine gave greater protection than three exposures, but this advantage was not sustained when challenge was delayed. Within the scope of the data collected, fecal egg counts and circulating antigen levels did not accurately predict the observed worm burdens. Levels of immunoglobulin G at challenge correlated best with protection, but there was little evidence of a recall response.
Inferring malicious network events in commercial ISP networks using traffic summarisation
With the recent increases in bandwidth available to home users, traffic rates
on commercial national networks have also been rising rapidly. This presents
a problem for any network monitoring tool, as the traffic rate it is expected
to monitor rises on a monthly basis. Security within these networks is
paramount, as they are now an accepted home of trade and commerce. Core
networks have been demonstrably and repeatedly open to attack; these events
have had significant material costs for high-profile targets.
Network monitoring is an important part of network security, providing
information about potential security breaches and helping in understanding
their impact. Monitoring at high data rates is a significant problem, both in
terms of processing the information at line rates and in terms of presenting
the relevant information to the appropriate persons or systems.
This thesis suggests that the use of summary statistics, gathered over a
number of packets, is a sensible and effective way of coping with high data
rates. A methodology for discovering which metrics are appropriate for
classifying significant network events using statistical summaries is
presented. It is shown that the statistical measures found with this
methodology can be used effectively as a metric for defining periods of
significant anomaly, and for further classifying these anomalies as legitimate
or otherwise. In a laboratory environment, these metrics were used to detect
DoS traffic representing as little as 0.1% of the overall network traffic.
The metrics discovered were then analysed to demonstrate that they are
appropriate and rational metrics for the detection of network-level anomalies.
These metrics were shown to have distinctive characteristics during DoS
attacks through the analysis of live network observations taken during DoS
events.
This work was implemented and operated within a live system, at multiple
sites within the core of a commercial ISP network. The statistical summaries
are generated at city-based points of presence and gathered centrally to allow
for spatial and topological correlation of security events. The architecture
chosen was shown to be flexible in its application. The system was used to
detect the level of VoIP traffic present on the network through the
implementation of packet size distribution analysis in a multi-gigabit
environment. It was also used to detect unsolicited SMTP generators injecting
messages into the core.
Monitoring in a commercial network environment is subject to data protection
legislation. Accordingly, the system presented processed only network and
transport layer headers, with all other data discarded at the capture
interface. The system described in this thesis was operational for a period of
six months, during which a set of over 140 network anomalies, both malicious
and benign, were observed over a range of localities. The system design,
example anomalies and metric analysis form the majority of this thesis.
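The per-window summary-statistics idea the abstract describes can be sketched as follows. This is a hypothetical illustration, not the thesis's actual metric set: the choice of mean packet size and destination-port entropy as metrics, and the z-score threshold, are assumptions.

```python
import math
from collections import Counter

def window_metrics(packets):
    """Compute illustrative summary statistics over one window of packets.

    packets: list of (size_bytes, dst_port) tuples. Returns the mean packet
    size and the Shannon entropy of the destination-port distribution."""
    sizes = [size for size, _ in packets]
    ports = Counter(port for _, port in packets)
    n = len(packets)
    entropy = -sum((c / n) * math.log2(c / n) for c in ports.values())
    return {"mean_size": sum(sizes) / n, "port_entropy": entropy}

def is_anomalous(metric, baseline_mean, baseline_std, threshold=3.0):
    """Flag a window whose metric deviates more than `threshold` standard
    deviations from a previously learned baseline (z-score test)."""
    return abs(metric - baseline_mean) > threshold * baseline_std
```

Because only aggregates per window cross the wire, a collector can gather these from many points of presence at a fraction of the line rate, which is the property the thesis exploits.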
Command & Control: Understanding, Denying and Detecting - A review of malware C2 techniques, detection and defences
In this survey, we first briefly review the current state of cyber attacks,
highlighting significant recent changes in how and why such attacks are
performed. We then investigate the mechanics of malware command and control
(C2) establishment: we provide a comprehensive review of the techniques used by
attackers to set up such a channel and to hide its presence from the attacked
parties and the security tools they use. We then switch to the defensive side
of the problem, and review approaches that have been proposed for the detection
and disruption of C2 channels. We also map such techniques to widely-adopted
security controls, emphasizing gaps or limitations (and success stories) in
current best practices.
Comment: Work commissioned by CPNI, available at c2report.org. 38 pages.
Listing abstract compressed from version appearing in report.
An extended Kalman filtering approach to modeling nonlinear dynamic gene regulatory networks via short gene expression time series
Copyright [2009] IEEE. This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of Brunel University's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to [email protected]. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
In this paper, the extended Kalman filter (EKF) algorithm is applied to model the gene regulatory network from gene time series data. The gene regulatory network is considered as a nonlinear dynamic stochastic model that consists of the gene measurement equation and the gene regulation equation. After specifying the model structure, we apply the EKF algorithm to identify both the model parameters and the actual values of gene expression levels. It is shown that the EKF algorithm is an online estimation algorithm that can identify a large number of parameters (including parameters of nonlinear functions) through an iterative procedure using a small number of observations. Four real-world gene expression data sets are employed to demonstrate the effectiveness of the EKF algorithm, and the obtained models are evaluated from the viewpoint of bioinformatics.
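The joint parameter/state estimation the abstract describes can be sketched on a toy scalar model. This is not the paper's gene-network model: the dynamics x' = a·tanh(x), the initial guesses, and the noise covariances are all assumptions chosen to keep the example self-contained. The unknown parameter a is appended to the state vector and estimated online, which is the core EKF trick the abstract refers to.

```python
import math

def mm(A, B):
    """2x2 matrix product (kept dependency-free for the sketch)."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def ekf_estimate(ys, q=1e-4, r=0.01):
    """EKF over toy dynamics x' = a*tanh(x), measurement y = x + noise.

    The augmented state is [x, a]; both are estimated from observations ys."""
    x, a = ys[0], 0.5                   # initial guesses (a is unknown)
    P = [[1.0, 0.0], [0.0, 1.0]]        # covariance of the augmented state
    for y in ys[1:]:
        # Predict: linearize around the current estimate (Jacobian F).
        t = math.tanh(x)
        xp = a * t
        F = [[a * (1 - t * t), t], [0.0, 1.0]]
        Ft = [[F[0][0], F[1][0]], [F[0][1], F[1][1]]]
        P = mm(mm(F, P), Ft)
        P[0][0] += q
        P[1][1] += q
        # Update with measurement model H = [1, 0].
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]
        innov = y - xp
        x = xp + K[0] * innov
        a = a + K[1] * innov
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
    return x, a
```

Run on a short noise-free series generated with a = 1.5, the filter recovers the parameter from a handful of observations, which mirrors the abstract's claim of identifying parameters of nonlinear functions from short time series.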
Analysis of Worm and Trojan Detection on the Universitas Semarang Internet Network Using the C4.5 Data Mining Classification Method and Bayesian Networks
Worm attacks have become a dangerous threat and cause damage in Internet networks. When worms and trojans attack an Internet network, data traffic is severely disrupted and bandwidth is consumed and wasted, making the Internet connection slow. Detecting worms and trojans on the Internet network, especially new variants and hidden worms and trojans, is still a challenging problem. Worm and trojan attacks generally occur in computer networks or Internet networks that have a low level of security and are vulnerable to infection. The detection and analysis of worm and trojan attacks in the Internet network can be done by looking at anomalies in Internet traffic and in the Internet protocol addresses that are accessed. This research used an experimental design, applying the C4.5 and Bayesian Network methods to accurately classify anomalies in Internet network traffic. The classification analysis is applied to Internet addresses, Internet protocols and Internet bandwidth allegedly subjected to worm and trojan attacks. The result of this research is the analysis and classification of Internet addresses, Internet protocols and Internet bandwidth to identify worm and trojan attacks.
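The attribute-selection step at the heart of C4.5-style classification can be sketched briefly. This is a generic illustration, not the paper's implementation: the traffic records, feature names, and labels below are hypothetical.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, feature, label="label"):
    """Information gain H(label) - H(label | feature) for a discrete feature.

    records: list of dicts, e.g. traffic rows labelled normal vs. worm/trojan.
    C4.5 picks the split attribute by a (gain-ratio-normalized) version of this."""
    base = entropy([r[label] for r in records])
    n = len(records)
    groups = Counter(r[feature] for r in records)
    conditional = sum(
        (count / n) * entropy([r[label] for r in records if r[feature] == value])
        for value, count in groups.items())
    return base - conditional
```

A feature that perfectly separates worm traffic from normal traffic (e.g. a destination-port class, in this made-up example) gets the maximum gain and becomes the root split of the decision tree.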