    An Evaluation of Trading Bands as Indicators for Network Telescope Datasets

    Large scale viral outbreaks such as Conficker, the Code Red worm and the Witty worm illustrate the importance of monitoring malevolent activity on the Internet. Careful monitoring of anomalous traffic allows organizations to react appropriately and in a timely fashion to minimize economic damage. Network telescopes, a type of Internet monitor, provide analysts with a way of decoupling anomalous traffic from legitimate traffic. Data from network telescopes is used by analysts to identify potential incidents by comparing recent trends with historical data. Analysis of network telescope datasets is complicated by the large quantity of data present, the number of subdivisions within the data and the uncertainty associated with received traffic. While there is considerable research being performed in the field of network telescopes, little of this work is concerned with the analysis of alternative methods of incident identification. This paper considers trading bands, a subfield of technical analysis, as an approach to identifying potential Internet incidents such as worms. Trading bands construct boundaries that are used for measuring when certain quantities are high or low relative to recent values. This paper considers Bollinger Bands and associated Bollinger Indicators, Price Channels and Keltner Channels. These techniques are evaluated as indicators of malevolent activity by considering how they react to incidents identified in the captured data from a network telescope.
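
    As a concrete illustration of the band construction described above, the sketch below computes Bollinger Bands over a per-interval packet-count series and flags intervals that break out of the upper band. The 20-interval window, the 2-standard-deviation width and the synthetic data are conventional, illustrative choices, not parameters from the paper.

```python
import numpy as np

def bollinger_bands(series, window=20, k=2.0):
    """Compute Bollinger Bands over a 1-D series of per-interval packet counts.

    Returns (middle, upper, lower); the first window-1 entries are NaN
    because a full look-back window is not yet available.
    """
    series = np.asarray(series, dtype=float)
    middle = np.full(series.shape, np.nan)
    upper = np.full(series.shape, np.nan)
    lower = np.full(series.shape, np.nan)
    for i in range(window - 1, len(series)):
        w = series[i - window + 1 : i + 1]
        m, s = w.mean(), w.std()
        middle[i] = m
        upper[i] = m + k * s   # counts above this are "high" vs. recent history
        lower[i] = m - k * s   # counts below this are "low"
    return middle, upper, lower

# Intervals that break out of the upper band are candidate incidents.
counts = np.random.poisson(100, 500)
counts[300:320] += 400                  # synthetic worm-like surge
_, upper, _ = bollinger_bands(counts)
print(np.where(counts > upper)[0])      # early surge intervals (~300) are flagged
```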

    Advances in modern botnet understanding and the accurate enumeration of infected hosts

    Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by recognizing network structures which distort measurement. This effort is motivated by the desire to reveal botnet behavior and trends for future mitigation, methods to discover infected hosts for remediation in real time and threat assessment, and the need to reveal the inaccuracy in population size estimation when only counting IP addresses. Following an explanation of theoretical enumeration techniques, the architectures, deployment methodologies, and malicious output for the Storm and Waledac botnets are presented. Several tools developed to enumerate these botnets are then assessed in terms of performance and yield. Finally, this study documents methods that were developed to discover the boundaries and impact of NAT and DHCP blocks in network populations, along with a footprint measurement based on relative entropy which better describes how uniformly infections communicate through their IP addresses. Population data from the Waledac botnet was used to evaluate these techniques.
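
    The relative-entropy footprint mentioned at the end of the abstract can be sketched briefly. In the illustration below, the binning of addresses by /8 prefix and the uniform reference distribution are assumptions chosen for clarity, not details taken from the thesis.

```python
import math
from collections import Counter

def footprint_relative_entropy(ip_addresses, prefix_bits=8):
    """Score how uniformly a botnet's infections spread across address blocks.

    Computes D(P || U), the relative entropy between the observed distribution
    P of infected hosts over address prefixes and a uniform distribution U
    over the prefixes that appear. 0.0 means perfectly uniform communication
    through the IP space; larger values mean concentration in a few blocks.
    """
    prefixes = Counter()
    for ip in ip_addresses:
        a, b, c, d = (int(x) for x in ip.split("."))
        value = (a << 24) | (b << 16) | (c << 8) | d
        prefixes[value >> (32 - prefix_bits)] += 1

    total = sum(prefixes.values())
    uniform = 1.0 / len(prefixes)
    return sum((k / total) * math.log2((k / total) / uniform)
               for k in prefixes.values())

# A population packed into one /8 scores far higher than a dispersed one.
concentrated = [f"10.0.0.{i}" for i in range(1, 100)] + ["192.168.1.1"]
dispersed = [f"{i}.0.0.1" for i in range(1, 101)]
print(footprint_relative_entropy(concentrated))  # ~0.92
print(footprint_relative_entropy(dispersed))     # 0.0
```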

    An exploratory study of techniques in passive network telescope data analysis

    Careful examination of the composition and concentration of malicious traffic in transit on the channels of the Internet provides network administrators with a means of understanding and predicting damaging attacks directed towards their networks. This allows for action to be taken to mitigate the effect that these attacks have on the performance of their networks and the Internet as a whole by readying network defences and providing early warning to Internet users. One approach to malicious traffic monitoring that has garnered some success in recent times, as exhibited by the study of fast spreading Internet worms, involves analysing data obtained from network telescopes. While some research has considered using measures derived from network telescope datasets to study large scale network incidents such as Code-Red, SQLSlammer and Conficker, there is very little documented discussion on the merits and weaknesses of approaches to analysing network telescope data. This thesis is an introductory study in network telescope analysis and aims to consider the variables associated with the data received by network telescopes and how these variables may be analysed. The core research of this thesis considers both novel and previously explored analysis techniques from the fields of security metrics, baseline analysis, statistical analysis and technical analysis as applied to analysing network telescope datasets. These techniques were evaluated as approaches to recognise unusual behaviour by observing the ability of these techniques to identify notable incidents in network telescope datasets.
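
    As one illustration of the baseline-analysis family of techniques the thesis surveys, the sketch below flags intervals whose packet count strays from a rolling baseline; the window length, z-score threshold and synthetic traffic are illustrative choices, not values from the thesis.

```python
import random
import statistics

def zscore_anomalies(counts, window=30, threshold=3.0):
    """Flag intervals whose packet count deviates from a rolling baseline.

    The baseline is the mean of the previous `window` intervals; an interval
    is anomalous when it lies more than `threshold` standard deviations away.
    """
    anomalies = []
    for i in range(window, len(counts)):
        baseline = counts[i - window : i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(counts[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A sudden surge, like the onset of a scanning worm, stands out from noise.
random.seed(1)
traffic = [100 + random.randint(-5, 5) for _ in range(60)] + [900] + [100] * 10
print(zscore_anomalies(traffic))  # the surge interval (index 60) is flagged
```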

    Hybrid epidemic spreading - from Internet worms to HIV infection

    Get PDF
    Epidemic phenomena are ubiquitous, ranging from infectious diseases and computer viruses to information dissemination. Epidemics have traditionally been studied as a single spreading process, either in a fully mixed population or on a network. Many epidemics, however, are hybrid, employing more than one spreading mechanism. For example, the Internet worm Conficker spreads locally, targeting neighbouring computers in local networks, as well as globally, by randomly probing any computer on the Internet. This thesis aims to investigate fundamental questions, such as whether a mix of spreading mechanisms gives hybrid epidemics any advantage, and what the implications are for promoting or mitigating such epidemics. We firstly propose a general and simple framework to model hybrid epidemics. Based on theoretical analysis and numerical simulations, we show that properties of hybrid epidemics are critically determined by a hybrid tradeoff, which defines the proportion of resource allocated to local and global spreading mechanisms. We then study two distinct examples of hybrid epidemics: the Internet worm Conficker and the Human Immunodeficiency Virus (HIV) infection within the human body. Using Internet measurement data, we reveal how Conficker combines ineffective spreading mechanisms to produce a serious outbreak on the Internet. We propose a mathematical model that can accurately recapitulate the entire HIV infection course as observed in clinical trials. Our study provides novel insights into the two parallel infection channels of HIV, i.e. cell-free spreading and cell-to-cell spreading, and their joint effect on HIV infection and treatment. In summary, this thesis has advanced our understanding of hybrid epidemics. It has provided mathematical frameworks for future analysis. It has demonstrated, with two successful case studies, that such research can have a significant impact on important issues such as cyberspace security and human health.
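
    The hybrid tradeoff described above can be made concrete with a toy simulation. In the sketch below, each infected host sends a fraction of its probes to nearby hosts (adjacent indices standing in for a local subnet) and the rest to uniformly random targets; the topology, parameter names and SI dynamics are illustrative assumptions, not the thesis's actual model.

```python
import random

def simulate_hybrid_epidemic(n=10_000, probes=3, alpha=0.5, radius=4,
                             steps=40, seed=0):
    """Toy discrete-time SI model with a local/global hybrid tradeoff.

    Each step, every infected host makes `probes` infection attempts: with
    probability `alpha` an attempt targets a nearby host (local spreading),
    otherwise a uniformly random host (global scanning). Returns the number
    of infected hosts after each step.
    """
    rng = random.Random(seed)
    infected = [False] * n
    infected[0] = True
    history = []
    for _ in range(steps):
        sources = [i for i, inf in enumerate(infected) if inf]
        for i in sources:
            for _ in range(probes):
                if rng.random() < alpha:
                    target = (i + rng.randint(-radius, radius)) % n  # local probe
                else:
                    target = rng.randrange(n)                        # global probe
                infected[target] = True
        history.append(sum(infected))
    return history

# The local/global resource split shapes the outbreak curve.
for alpha in (0.0, 0.5, 1.0):
    print(f"alpha={alpha}: infected after 40 steps =",
          simulate_hybrid_epidemic(alpha=alpha)[-1])
```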

    An efficient approach to online bot detection based on a reinforcement learning technique

    In recent years, Botnets have been adopted as a popular method used to carry and spread many malicious codes on the Internet. These codes pave the way to conducting many fraudulent activities, including spam mail, distributed denial of service (DDoS) attacks and click fraud. While many Botnets are set up using a centralized communication architecture such as Internet Relay Chat (IRC) and Hypertext Transfer Protocol (HTTP), peer-to-peer (P2P) Botnets can adopt a decentralized architecture, using an overlay network for exchanging command and control (C&C) messages, which is a more resilient and robust communication channel infrastructure. Without a centralized point for C&C servers, P2P Botnets are more flexible in defeating countermeasures and detection procedures than traditional centralized Botnets. Several Botnet detection techniques have been proposed, but Botnet detection is still a very challenging task for the Internet security community because Botnets execute attacks stealthily in the dramatically growing volumes of network traffic. Furthermore, current Botnet detection schemes face significant problems of efficiency and adaptability. The present study combined a traffic reduction approach with a reinforcement learning (RL) method in order to create an online Bot detection system. The proposed framework adopts the idea of RL to improve the system dynamically over time. In addition, the traffic reduction method is used to set up a lightweight and fast online detection method. Moreover, a host feature based on traffic at the connection level was designed, which can identify Bot host behaviour. The proposed technique can therefore be applied to any encrypted network traffic, since it depends only on information obtained from packet headers; it does not require Deep Packet Inspection (DPI) and is not defeated by payload encryption. The network traffic reduction technique reduces the packets input to the detection system, yet the proposed solution achieves a good detection rate of 98.3% as well as a low false positive rate (FPR) of 0.012% in the online evaluation. Comparison with other techniques on the same dataset shows that our strategy outperforms existing methods. The proposed solution was evaluated and tested using real network traffic datasets to increase the validity of the solution.
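
    A minimal sketch of the kind of pipeline the abstract outlines: header-only flow records are thinned by a traffic-reduction filter, connection-level features are computed per host, and a toy epsilon-greedy value-learning loop stands in for the RL component that adapts the detector over time. The record fields, thresholds and reward scheme are illustrative assumptions, not details from the paper.

```python
import random
from collections import defaultdict

def reduce_traffic(flows, max_flows_per_host=50):
    """Traffic reduction: keep only the first few flows per source host so the
    detector processes a bounded, lightweight sample of the input stream."""
    seen = defaultdict(int)
    for f in flows:
        if seen[f["src"]] < max_flows_per_host:
            seen[f["src"]] += 1
            yield f

def host_features(flows):
    """Connection-level features from packet headers only (no payload, no DPI):
    flow count, distinct destinations, distinct ports, mean packets per flow."""
    stats = defaultdict(lambda: {"flows": 0, "dsts": set(), "ports": set(), "pkts": 0})
    for f in flows:
        s = stats[f["src"]]
        s["flows"] += 1
        s["dsts"].add(f["dst"])
        s["ports"].add(f["dst_port"])
        s["pkts"] += f["packets"]
    return {h: (s["flows"], len(s["dsts"]), len(s["ports"]), s["pkts"] / s["flows"])
            for h, s in stats.items()}

def tune_threshold(rounds=200, thresholds=(5, 10, 20, 40), lr=0.1, epsilon=0.2, seed=0):
    """Toy epsilon-greedy value learning over candidate alert thresholds.
    Reward is simulated here; in a deployment it would come from analyst
    feedback (positive for a confirmed alert, negative for a false positive)."""
    rng = random.Random(seed)
    q = {t: 0.0 for t in thresholds}
    for _ in range(rounds):
        t = rng.choice(thresholds) if rng.random() < epsilon else max(q, key=q.get)
        reward = 1.0 if t == 20 else -0.2   # pretend 20 balances detections vs. FPR
        q[t] += lr * (reward - q[t])
    return max(q, key=q.get)

# A scanning bot touches many distinct destinations on one port.
flows = [{"src": "10.0.0.5", "dst": f"198.51.100.{i}", "dst_port": 445, "packets": 2}
         for i in range(30)]
print(host_features(reduce_traffic(flows)))  # {'10.0.0.5': (30, 30, 1, 2.0)}
print(tune_threshold())                      # converges on 20 under simulated feedback
```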