19 research outputs found

    Hybrid Approach for Botnet Detection Using K-Means and K-Medoids with Hopfield Neural Network

    Get PDF
    In the last few years, a number of attacks and malicious activities have been attributed to common channels between users. A botnet is an important carrier of malicious and undesirable activity. In this paper, we propose a support vector machine (SVM) to classify botnet activities based on clusters produced by k-means, k-medoids, and a Hopfield neural network. The proposed approach is based on features of Transmission Control Protocol (TCP) packets. System performance and accuracy are evaluated using a predefined data set. Results show the ability of the proposed approach to detect botnet activities with high accuracy and performance in a short execution time. The proposed system achieves a 95.7% accuracy rate with a false positive rate of 3% or less.
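
    A minimal sketch of the cluster-then-classify pipeline the abstract describes, assuming scikit-learn; the feature matrix, labels, and parameters below are illustrative stand-ins for the TCP packet features, not the authors' implementation:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Toy stand-ins for TCP packet features and benign/botnet labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 8))
    y = rng.integers(0, 2, size=1000)

    # Step 1: unsupervised clustering; the cluster id becomes an extra feature.
    kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
    X_aug = np.column_stack([X, kmeans.labels_])

    # Step 2: an SVM classifies flows using the cluster-augmented features.
    X_tr, X_te, y_tr, y_te = train_test_split(X_aug, y, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("accuracy:", clf.score(X_te, y_te))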

    On Understanding the Existence of a Deep Torrent

    Get PDF
    Nowadays, a large part of Internet content is not reachable through search engines. Studying the nature of this content from a cybersecurity perspective is of high interest, as it could be part of malware distribution processes, child pornography or copyrighted material exchange, botnet command and control messaging, etc. Although the research community has devoted considerable effort to this challenge, most existing work focuses on content hidden in Web sites. Yet other relevant services are used to keep and transmit hidden resources, such as P2P protocols. In the present work, we propose the concept of the Deep Torrent to refer to those torrents available in BitTorrent that cannot be found through public Web sites or search engines. We present an implementation of a complete system to crawl the Deep Torrent and evaluate its existence and size. We describe a basic experiment crawling the Deep Torrent for 39 days, yielding an initial estimate of its size of 67% of the total number of resources shared in the BitTorrent network.
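
    The headline estimate reduces to a simple set computation once the crawl data exists. A toy sketch, with the DHT crawl and search-engine index scraping stubbed out as given sets of infohashes (all names here are assumptions, not the paper's system):

    def deep_torrent_share(dht_infohashes: set, indexed_infohashes: set) -> float:
        """Fraction of DHT-observed torrents not reachable via public indexes."""
        if not dht_infohashes:
            return 0.0
        hidden = dht_infohashes - indexed_infohashes
        return len(hidden) / len(dht_infohashes)

    # Made-up 40-character hex infohashes; two of three are unindexed.
    dht = {"a" * 40, "b" * 40, "c" * 40}
    indexed = {"a" * 40}
    print(f"deep share: {deep_torrent_share(dht, indexed):.0%}")  # 67%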

    Analytical Lifecycle Modeling and Threat Analysis of Botnets

    Get PDF
    Botnet, an overlay network of compromised computers built by cybercriminals known as botmasters, is a phenomenon that has caused deep concern among security professionals responsible for governmental, academic, and private-sector networks. Botmasters use a plethora of methods to infect network-accessible devices (nodes). The initial malware residing on these nodes then either connects to a central Command & Control (C&C) server or joins a Peer-to-Peer (P2P) botnet. At this point, the nodes can receive the botmaster's commands and proceed to engage in illicit activities such as Distributed Denial-of-Service (DDoS) attacks and massive e-mail spam campaigns. Being able to reliably estimate the size of a botnet is an important task, as it allows the adequate deployment of mitigation strategies against the botnet. In this thesis, we develop analytical models that capture botnet expansion and size-evolution behaviour in sufficient detail to accomplish this crucial estimation/analysis task. We develop four Continuous-Time Markov Chain (CTMC) botnet models: the first two, SComI and SComF, allow the prediction of initial unhindered botnet expansion for infinite and finite population sizes, respectively. The third, the SIC model, is a botnet lifecycle model which accounts for all important node stages and allows botnet size estimates as well as evaluation of botnet mitigation strategies such as disinfection of nodes and attacks on the botnet's C&C mechanism. Finally, the fourth, the SIC-P2P model, extends the SIC model to P2P botnets, allowing fine-grained analysis of mitigation strategies such as index poisoning and Sybil attacks. As the convergence of Internet and traditional telecommunication services is underway, the threat of botnets is looming over essential basic communication services. As the last contribution of this thesis, we analyse the threat of botnets in 4G cellular wireless networks. We identify a vulnerability of the air interface, i.e. Long Term Evolution (LTE), which allows a successful botnet-launched DDoS attack against it. Through simulation using an LTE simulator, we determine the number of botnet nodes per cell that can significantly degrade the service availability of such cellular networks.
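
    A CTMC of this kind can be simulated directly with the Gillespie algorithm. The sketch below is a generic finite-population infection chain in the spirit of the SComF-style expansion models, not the thesis's actual models; the contact rate and population size are illustrative assumptions:

    import random

    def simulate(n_nodes=1000, beta=0.3, t_end=50.0, seed=1):
        """Gillespie simulation: exponential waiting times between infections."""
        random.seed(seed)
        t, infected = 0.0, 1
        trace = [(t, infected)]
        while t < t_end and infected < n_nodes:
            # Total rate: each bot contacts susceptible nodes at rate beta.
            rate = beta * infected * (n_nodes - infected) / n_nodes
            t += random.expovariate(rate)
            infected += 1
            trace.append((t, infected))
        return trace

    for t, bots in simulate()[::200]:
        print(f"t={t:6.2f}  bots={bots}")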

    Increasing chances of survival for malware using theory of natural selection and the selfish gene

    Get PDF
    Malware, short for malicious software, is a general term for computer viruses, Trojan horses, worms, and other harmful software or code. Malware authors try to obfuscate their code in order to evade antivirus programs, which in turn use different analysis techniques to detect the various encryption and obfuscation methods. Survivability is a main concern for an attacker, since malware should usually be able to spread to other computers, use the resources of the victim's computer, and create new copies of itself. In this thesis, inspired by Darwin's theory of natural selection and the selfish gene concept explained by Richard Dawkins, we propose novel methods which increase the chance of survival for malware. We implement selfishness, altruistic behaviour, mimicry, group selection, and similar behaviour models into our experimental malware, and we test our techniques against existing solutions. We develop tools to enhance existing malware with the features presented in this thesis. The effectiveness of the proposed techniques is evaluated in an experimental test carried out with a dataset containing more than 300,000 malware samples. Group behaviour models are also introduced, and methods are proposed for giving botnets better stability (an evolutionarily stable botnet).
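
    The "evolutionarily stable" notion invoked here comes from evolutionary game theory and can be illustrated abstractly with replicator dynamics; the hawk-dove payoffs below are a textbook example chosen for illustration, not the thesis's behaviour models:

    def replicator(p, payoff, steps=2000, dt=0.05):
        """Evolve the population share p of strategy A under a 2x2 payoff matrix."""
        for _ in range(steps):
            fa = payoff[0][0] * p + payoff[0][1] * (1 - p)  # fitness of A
            fb = payoff[1][0] * p + payoff[1][1] * (1 - p)  # fitness of B
            mean = p * fa + (1 - p) * fb
            p += dt * p * (fa - mean)
        return p

    # Hawk-dove payoffs with V=2, C=4: the mixed ESS sits at p = V/C = 0.5,
    # and the population converges there regardless of the starting share.
    payoff = [[-1.0, 2.0], [0.0, 1.0]]
    print(f"long-run share of strategy A: {replicator(0.9, payoff):.2f}")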

    A structured approach to malware detection and analysis in digital forensics investigation

    Get PDF
    A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of PhD. Within the World Wide Web (WWW), malware is considered one of the most serious threats to system security, with complex system issues caused by malware and spam. Networks and systems can be accessed and compromised by various types of malware, such as viruses, worms, Trojans, botnets and rootkits, which compromise systems through coordinated attacks. Malware often uses anti-forensic techniques to avoid detection and investigation. Moreover, the results of investigating such attacks are often ineffective and can create barriers to obtaining clear evidence, due to the lack of sufficient tools and the immaturity of forensic methodology. This research addressed various complexities faced by investigators in the detection and analysis of malware. In this thesis, the author identified the need for a new approach to malware detection that focuses on a robust framework, and proposed a solution based on an extensive literature review and market research analysis. The literature review focussed on the different trials and techniques in malware detection to identify the parameters for developing a solution design, while market research was carried out to understand the precise nature of the current problem. The author termed the new approach and framework the triple-tier centralised online real-time environment (tri-CORE) malware analysis (TCMA). The tiers come from three distinctive phases of detection and analysis, dividing the research into three domains: the malware acquisition function; detection and analysis; and the database operational function. This framework design will contribute to the field of computer forensics by making the investigative process more effective and efficient. By integrating a hybrid method for malware detection, the limitations associated with both static and dynamic methods are eliminated. This aids forensics experts in carrying out quick investigatory processes to detect the behaviour of the malware and its related elements. The proposed framework will help to ensure system confidentiality, integrity, availability and accountability. The research also produced a prototype (artefact) developed in support of this different approach to digital forensics and malware detection: a new toolkit, based on a simple architectural structure and built from open-source software, that can help investigators develop the skills to respond critically to current cyber incidents and analyses.
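
    A skeleton of the three tiers might look as follows; every function here is a stub written to illustrate the acquisition -> detection/analysis -> database flow, not code from the thesis or its toolkit:

    import hashlib
    import sqlite3

    def acquire(path: str) -> bytes:
        """Tier 1: malware acquisition - read a suspect sample from disk."""
        with open(path, "rb") as f:
            return f.read()

    def detect(sample: bytes) -> dict:
        """Tier 2: hybrid detection - a static heuristic plus a dynamic-analysis stub."""
        return {
            "sha256": hashlib.sha256(sample).hexdigest(),
            "static_suspicious": sample[:2] == b"MZ",  # toy static check
            "dynamic_verdict": "not-run",              # sandbox stub
        }

    def store(db: sqlite3.Connection, report: dict) -> None:
        """Tier 3: database operations - persist the report for the case record."""
        db.execute("CREATE TABLE IF NOT EXISTS reports (sha256 TEXT, static INT, dynamic TEXT)")
        db.execute("INSERT INTO reports VALUES (?, ?, ?)",
                   (report["sha256"], report["static_suspicious"], report["dynamic_verdict"]))
        db.commit()

    db = sqlite3.connect(":memory:")
    store(db, detect(b"MZ...toy sample bytes..."))
    print(db.execute("SELECT * FROM reports").fetchall())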

    Computer Science & Technology Series : XVIII Argentine Congress of Computer Science. Selected papers

    Get PDF
    CACIC'12 was the eighteenth Congress in the CACIC series. It was organised by the School of Computer Science and Engineering at the Universidad Nacional del Sur. The Congress included 13 workshops with 178 accepted papers, 5 conferences, 2 invited tutorials, different meetings related to Computer Science education (professors, PhD students, curricula) and an International School with 5 courses. CACIC 2012 followed the traditional Congress format, with 13 workshops covering a diversity of dimensions of Computer Science research. Each topic was supervised by a committee of 3-5 chairs from different universities. The call for papers attracted a total of 302 submissions. An average of 2.5 review reports were collected for each paper, for a grand total of 752 review reports involving about 410 different reviewers. A total of 178 full papers, involving 496 authors and 83 universities, were accepted, and 27 of them were selected for this book. Red de Universidades con Carreras en Informática (RedUNCI)

    Propagation, Detection and Containment of Mobile Malware.

    Full text link
    Today's enterprise systems and networks are frequent targets of malicious attacks, such as worms, viruses, spyware and intrusions, that can disrupt or even disable critical services. Recent trends suggest that by combining spyware as a malicious payload with worms as a delivery mechanism, malicious programs can potentially be used for industrial espionage and identity theft. The problem is compounded further by the increasing convergence of wired, wireless and cellular networks, since virus writers can now write malware that can cross over from one network segment to another, exploiting services and vulnerabilities specific to each network. This dissertation makes four primary contributions. First, it builds more accurate malware propagation models for emerging hybrid malware (i.e., malware that uses multiple propagation vectors such as Bluetooth, e-mail, peer-to-peer, instant messaging, etc.), addressing key propagation factors such as heterogeneity of nodes, services and user mobility within the network. Second, it develops a proactive containment framework, based on group behaviour of hosts, against such malicious agents in an enterprise setting. The majority of today's anti-virus solutions are reactive, i.e., they are activated only after a malicious activity has been detected at a node in the network. In contrast, proactive containment has the potential of closing vulnerable services ahead of infection, and thereby halting the spread of the malware. Third, we study (1) the current generation of mobile viruses and worms that target SMS/MMS messaging and Bluetooth on handsets, and the corresponding exploits, and (2) their potential impact in a large SMS provider network using real-life SMS network data. Finally, we propose a new behavioural approach for detecting emerging malware targeting mobile handsets. Our approach is based on the concept of generalised behavioural patterns instead of traditional signature-based detection. Signature-based methods are not scalable for deployment on mobile devices due to the limited resources available on today's typical handsets. Further, we demonstrate that the behavioural approach not only has a compact footprint, but can also detect new classes of malware that combine features from existing classes.
    Ph.D. Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/60849/1/abose_1.pd
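
    The hybrid-propagation idea can be sketched as a discrete-time SI process running over two overlaid contact graphs (say, short-range proximity and messaging contacts); the graph shapes, sizes, and per-vector infection probabilities below are assumptions for illustration, not the dissertation's models:

    import random

    def spread(layers, p_per_layer, steps=20, seed=7):
        """Discrete-time SI spread: each layer is a dict node -> neighbours."""
        random.seed(seed)
        infected = {0}
        for _ in range(steps):
            new = set()
            for layer, p in zip(layers, p_per_layer):
                for u in infected:
                    for v in layer.get(u, ()):
                        if v not in infected and random.random() < p:
                            new.add(v)
            infected |= new
        return len(infected)

    n = 200
    ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}      # "Bluetooth" proximity
    contacts = {i: random.sample(range(n), 3) for i in range(n)}  # "messaging" contacts
    print("infected after 20 steps:", spread([ring, contacts], [0.5, 0.1]))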

    Networks, complexity and internet regulation: scale-free law

    Get PDF
    This book starts with a general statement: that regulators should try, wherever possible, to use the methodological tools of the physical sciences presently available in order to draft better legislation. While such an assertion may be applied to law in general, this work concentrates on the much narrower area of Internet regulation and the science of complex networks. The Internet is the subject of this book not only because it is my main area of research, but also because, without over-emphasising the importance of the Internet to everyday life, one cannot deny that the growth and popularisation of the global communications network has had a tremendous impact on the way in which we interact with one another. The Internet is, however, just one of many interactive networks. One way of looking at the complex and chaotic nature of society is to see it as a collection of different nodes of interaction. Humans are constantly surrounded by networks: the social network, the financial network, the transport network, the telecommunications network and even the network of our own bodies. Understanding how these systems operate and interact with one another has been the realm of physicists, economists, biologists and mathematicians. Until recently, the study of networks was mainly theoretical and academic, because it is difficult to gather data about large and complex systems that is sufficiently reliable to support proper empirical application. In recent years, though, the Internet has given researchers the opportunity to study and test the mathematical descriptions of these vast complex systems. The growth rate and structure of cyberspace have allowed researchers to map and test several previously unproven theories about how links and hubs within networks interact with one another. The Web now provides the means with which to test the organisational structures, architecture and growth of networks, and even permits some limited prediction about their behaviour, strengths and vulnerabilities. The main objective of this book is first and foremost to introduce the wider legal audience to some of the theories of complexity and networks. The second objective is more ambitious. By looking at the application of complexity theory and network science in various areas of Internet regulation, it is hoped that there will be enough evidence to postulate a theory of Internet regulation based on network science. To achieve these two goals, Chapter 2 looks in detail at the science of complex networks to set the stage for the legal and regulatory arguments to follow. With the increase in reliability of the descriptive (and sometimes predictive) nature of network science, a logical next step for legal scholars is to look at the legal implications of the characteristics of networks. Chapter 3 highlights the efforts of academics and practitioners who have started to find potential uses for network science tools. Chapter 4 takes this idea further and explores how network theory can shape Internet regulation. The following chapters analyse the potential application of the tools described in the previous chapters, applying complexity theory to specific areas of study related to Internet law. Chapter 5 deals with the subject of copyright in the digital world. Chapter 6 explores the issue of peer production and user-generated content using network science as an analytical framework.
    Chapter 7 finishes the evidence section of the work by studying the impact of network architecture in the field of cybercrime, and asks whether the existing architecture hinders or assists efforts to tackle those problems. These are clearly very disparate areas of study. It is not the intention of this book to be overreaching in its scope, although I am mindful that it covers a lot of ground and attempts to study and describe some disciplines that fall outside my intellectual comfort zone. While the focus of the work is the Internet, its applications may extend beyond mere electronic bits. Without trying to be over-ambitious, it is my strong belief that legal scholarship has been neglectful in that it has been slow to respond to the wealth of research into complexity. That is not to say that there has been no legal research on the topic, but it would seem that lawyers, legislators and policy-makers are reluctant to consider technical solutions to legal problems. It is hoped, then, that this work will serve as a stepping stone that will lead to new interest in some of the theories described here.
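
    The "scale-free" structure the book builds on arises from preferential attachment: new nodes link to existing nodes in proportion to their degree, producing a few heavily connected hubs. A minimal sketch of that growth mechanism (sizes and parameters are arbitrary illustrations):

    import random
    from collections import Counter

    def preferential_attachment(n=2000, m=2, seed=42):
        """Grow a Barabasi-Albert style graph; returns its edge list."""
        random.seed(seed)
        targets = [0, 1]       # multiset: each node appears once per link end
        edges = [(0, 1)]
        for new in range(2, n):
            chosen = set()
            while len(chosen) < m:                   # pick m distinct targets,
                chosen.add(random.choice(targets))   # weighted by current degree
            for t in chosen:
                edges.append((new, t))
                targets += [new, t]
        return edges

    degree = Counter()
    for u, v in preferential_attachment():
        degree[u] += 1
        degree[v] += 1
    print("hub degrees:", degree.most_common(3))  # a handful of hubs dominate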

    Contributions to the Detection of Distributed Denial-of-Service (DDoS) Attacks at the Application Layer

    Get PDF
    Six aspects of DDoS attack detection were analysed: techniques, variables, tools, implementation location, point in time, and detection accuracy. This analysis made a useful contribution to the design of an adequate strategy for neutralising these attacks. In recent years, these attacks have shifted toward the application layer, a phenomenon mainly due to the large number of tools available for generating this type of attack. This work therefore also proposes a detection alternative based on the dynamism of the web user. To this end, user-dynamism features extracted from mouse and keyboard activity were evaluated. Finally, this work proposes a low-cost detection approach consisting of two steps: first, user features are extracted in real time while the web application is being browsed; second, each extracted feature is used by an order-one (O(1)) algorithm to differentiate a real user from a DDoS attack. Test results with the LOIC, OWASP, and GoldenEye attack tools show that the proposed method has a detection efficacy of 100% and that web-user dynamism features make it possible to differentiate between a real user and a bot.
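
    A toy version of the two-step idea, assuming event timestamps collected from the browser: extract a user-dynamism feature (timing jitter), then apply a constant-time threshold test per feature. The threshold and data below are illustrative assumptions, not the paper's parameters:

    from statistics import pstdev

    def is_human(event_times, min_jitter=0.015):
        """Flood tools fire events at near-constant intervals; humans jitter."""
        gaps = [b - a for a, b in zip(event_times, event_times[1:])]
        return pstdev(gaps) >= min_jitter  # O(1) decision once the feature exists

    bot = [i * 0.100 for i in range(20)]               # metronomic requests
    human = [0.0, 0.13, 0.21, 0.39, 0.44, 0.71, 0.90]  # irregular activity
    print("bot passes?  ", is_human(bot))    # False
    print("human passes?", is_human(human))  # True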