
    DoS and DDoS Attacks: Defense, Detection and Traceback Mechanisms - A Survey

    Denial of Service (DoS) or Distributed Denial of Service (DDoS) attacks are typically explicit attempts to exhaust a victim's bandwidth or disrupt legitimate users' access to services. The traditional architecture of the internet is vulnerable to DDoS attacks, and it gives an attacker the opportunity to gain access to a large number of compromised computers by exploiting their vulnerabilities to set up attack networks, or Botnets. Once an attack network or Botnet has been set up, an attacker can invoke a large-scale, coordinated attack against one or more targets. As a result of the continuous evolution of new attacks and the ever-increasing range of vulnerable hosts on the internet, many DDoS attack detection, prevention and traceback mechanisms have been proposed. In this paper, we survey the different types and techniques of DDoS attacks and their countermeasures. The significance of this paper is its coverage of many aspects of countering DDoS attacks, including detection, defence and mitigation, traceback approaches, open issues and research challenges

    Intelligent Inter and Intra Network Traffic Estimation Technique for DDoS Attack Detection using Fuzzy Rule Sets for QoS Improvement

    The quality of service of any network depends heavily on throughput, latency and service-completion strategies. In modern communication systems there are many loopholes that malicious users can exploit to perform network attacks and degrade the performance of the network. Many approaches to the problem of network threats have been discussed, but the quality of denial-of-service attack detection still suffers. We propose a service-constrained approach that learns the network traffic in various ways, covering both the traffic incurred within the network and the traffic arriving from external networks. The method uses features such as hop count, hop details, payload, TTL and time. It maintains a rule set with fuzzy values, where each rule specifies the features of a genuine packet; an incoming packet must match one of the rules, with each of its attributes lying within the range of values given by that rule. The proposed method estimates the inter- and intra-network traffic along the routes over which a packet is transferred in order to judge whether the received packet is genuine. In addition, the method maintains a set of logs in which packet features are stored to compute the legitimate weight of each received packet. Based on the computed inter- and intra-traffic values, the trustworthiness of each received packet is computed in order to allow or deny it. The proposed method increases the accuracy of DDoS attack detection and helps to improve the performance of the network. DOI: 10.17762/ijritcc2321-8169.15085
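    The following is a minimal sketch of the rule-matching idea described above: each fuzzy rule holds a range per packet feature, an incoming packet is scored by how well its attributes fit a rule, and the resulting trust score decides whether the packet is allowed. The feature names, ranges and threshold are illustrative assumptions, not values taken from the paper.

    ```python
    # Minimal sketch of fuzzy rule matching over packet features.
    # Feature names, ranges, and the trust threshold are illustrative assumptions,
    # not values taken from the paper.
    from dataclasses import dataclass

    @dataclass
    class FuzzyRule:
        ranges: dict          # feature -> (low, high) interval for a "genuine" packet
        weight: float = 1.0   # confidence assigned to the rule

    def membership(value, low, high):
        """Triangular membership: 1.0 at the interval centre, falling to 0 at the edges."""
        if value < low or value > high:
            return 0.0
        mid = (low + high) / 2.0
        half = (high - low) / 2.0 or 1.0
        return 1.0 - abs(value - mid) / half

    def trust_score(packet, rules):
        """Best fuzzy match of the packet's features against the genuine-traffic rules."""
        best = 0.0
        for rule in rules:
            degrees = [membership(packet[f], lo, hi) for f, (lo, hi) in rule.ranges.items()]
            best = max(best, rule.weight * min(degrees))   # AND over features = min
        return best

    rules = [FuzzyRule({"hop_count": (3, 18), "ttl": (32, 128), "payload": (40, 1500)})]
    packet = {"hop_count": 7, "ttl": 64, "payload": 512}
    print("allow" if trust_score(packet, rules) >= 0.5 else "deny")
    ```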

    Predicting Number of Zombies in DDoS Attacks Using Pace Regression Model

    A DDoS attacker attempts to disrupt a target by flooding it with illegitimate packets generated from a large number of zombies, usurping its bandwidth and overtaxing it so that legitimate inquiries cannot get through. This paper reports the evaluation results of the proposed approach for predicting the number of zombies using the Pace Regression Model. A relationship is established between the number of zombies and the observed deviation in sample entropy. Various statistical performance measures, such as R2, CC, SSE, MSE, RMSE, NMSE, η and MAE, are used to measure the performance of the regression model. Network topologies similar to the Internet are generated for simulation using the Transit-Stub model of the GT-ITM topology generator. The NS-2 network simulator on a Linux platform is used as the simulation test bed for launching DDoS attacks with a varied number of zombies. The simulation results are promising, as we are able to predict the number of zombies efficiently using the Pace Regression Model with a considerably low error rate
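    A small sketch of the regression idea is given below. Pace regression itself (a WEKA model) is not reproduced here, so ordinary least squares stands in, and the entropy-deviation data is synthetic; only the workflow (fit deviation in sample entropy against number of zombies, then report R2, SSE, MSE and RMSE) mirrors the paper.

    ```python
    # Sketch of the entropy-deviation -> zombie-count regression idea.
    # Ordinary least squares stands in for Pace regression; the data is synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    zombies = rng.integers(10, 100, size=50)                    # simulated attack sizes
    entropy_dev = 0.02 * zombies + rng.normal(0, 0.05, 50)      # assumed linear relation + noise

    # Fit deviation-in-sample-entropy -> number-of-zombies with least squares.
    slope, intercept = np.polyfit(entropy_dev, zombies, deg=1)
    pred = slope * entropy_dev + intercept

    ss_res = np.sum((zombies - pred) ** 2)                      # SSE
    mse = ss_res / len(zombies)
    rmse = np.sqrt(mse)
    r2 = 1 - ss_res / np.sum((zombies - zombies.mean()) ** 2)   # coefficient of determination
    print(f"R2={r2:.3f}  MSE={mse:.2f}  RMSE={rmse:.2f}")
    ```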

    Botnet-based Distributed Denial of Service (DDoS) Attacks on Web Servers: Classification and Art

    Botnets are prevailing mechanisms for facilitating distributed denial of service (DDoS) attacks on computer networks or applications. Currently, Botnet-based DDoS attacks on the application layer are among the latest and most problematic trends in network security threats. Botnet-based DDoS attacks on the application layer limit resources, curtail revenue and cause customer dissatisfaction, among other effects. DDoS attacks are among the most difficult problems to resolve online, especially when the target is the Web server. In this paper, we present a comprehensive study to show the danger of Botnet-based DDoS attacks on the application layer, especially on the Web server, and the incidence of such attacks, which has evidently increased recently. Botnet-based DDoS attack incidents and the revenue losses of famous companies and government websites are also described. This provides a better understanding of the problem, the current solution space, and the future research scope for defending against such attacks efficiently

    DDoS: DeepDefence and Machine Learning for identifying attacks

    Distributed Denial of Service (DDoS) attacks are a very common type of computer attack on the internet today. Automatically detecting such DDoS attack packets and dropping them before they pass through the network is the best prevention method. Conventional solutions only monitor and provide a feedforward response instead of feedback-based machine learning. A deep neural network design has been suggested in this work, and developments have been made towards proactive detection of attacks. In this approach, high-level features are extracted for representation and inference over the dataset. Experiments have been conducted on the ISCX datasets published in 2017 and 2018 and on CICDDoS2019, and the program has been developed in Matlab R17b, utilizing Wireshark for feature extraction from the datasets. Network intrusion attacks on critical oil and gas industrial installations have become common nowadays, bringing giant industrial sites to a standstill and causing financial impacts. This has led production companies to start investing millions of dollars of revenue in protecting their critical infrastructure against such attacks with the active and passive solutions available. Our thesis constitutes a contribution to this domain, focusing mainly on the security of industrial networks, impersonation, and DDoS attacks
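    As a rough sketch of the detection step, the snippet below trains a small feed-forward network on flow-level features such as those extractable from a dataset like CICDDoS2019. The layer sizes, feature count and synthetic data are assumptions for illustration; the paper's own model was developed in Matlab rather than Python.

    ```python
    # Minimal feed-forward detector over flow features (synthetic stand-in data).
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    n_features = 8                      # e.g. duration, packet count, byte count, rates...
    X = torch.randn(1024, n_features)   # placeholder for normalised flow features
    y = (X[:, 0] + X[:, 3] > 1).float().unsqueeze(1)   # placeholder attack labels

    model = nn.Sequential(
        nn.Linear(n_features, 64), nn.ReLU(),
        nn.Linear(64, 32), nn.ReLU(),
        nn.Linear(32, 1),               # logit: attack vs. benign
    )
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    for epoch in range(200):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

    with torch.no_grad():
        acc = ((model(X) > 0) == y.bool()).float().mean()
    print(f"training accuracy on the synthetic set: {acc:.2f}")
    ```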

    Multi-layer Perceptron Model for Mitigating Distributed Denial of Service Flood Attack in Internet Kiosk Based Electronic Voting

    A Distributed Denial-of-Service (DDoS) flood attack targeting an Internet Kiosk voting environment can prevent voters from casting their ballots in a timely manner. The goal of the DDoS flood attack is to make the voting server unavailable to voters during the election process. In this paper, we present a Multilayer Perceptron (MLP) algorithm to mitigate DDoS flood attacks in an e-voting environment and prevent such attacks from disrupting the availability of the vulnerable voting server. The developed intelligent DDoS flood mitigation model based on the MLP technique was simulated in MATLAB R2017a. The mitigation model was evaluated using server-utilization performance metrics in e-voting. Introducing the developed mitigation model into the DDoS attack model reduced server utilization from 1 to 0.4, indicating normal traffic. MLP showed an accuracy of 95% in mitigating DDoS flood attacks, providing availability of voting server resources for convenient and timely casting of ballots as well as credible delivery of electronic democratic decision making
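    The sketch below illustrates the mitigation idea under stated assumptions: an MLP classifies incoming requests, predicted attack traffic is dropped before it reaches the voting server, and server utilization (arrival rate over service rate) is compared before and after filtering. The traffic model, feature choices and rates are hypothetical and not taken from the paper.

    ```python
    # Sketch of MLP-based filtering in front of a voting server, with utilisation
    # rho = arrivals / service capacity as the metric. All numbers are illustrative.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    n = 5000
    is_attack = rng.random(n) < 0.8                          # flood dominates arrivals
    # Two toy request features: inter-arrival time and request size.
    features = np.column_stack([
        np.where(is_attack, rng.exponential(0.2, n), rng.exponential(1.0, n)),
        np.where(is_attack, rng.normal(1500, 50, n), rng.normal(400, 150, n)),
    ])

    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=1)
    clf.fit(features, is_attack)                             # in-sample fit, for brevity
    admitted = features[~clf.predict(features).astype(bool)] # drop predicted attack requests

    service_capacity = 2500.0                                # requests the server can handle
    rho_before = min(1.0, n / service_capacity)
    rho_after = min(1.0, len(admitted) / service_capacity)
    print(f"utilisation before filtering: {rho_before:.2f}, after: {rho_after:.2f}")
    ```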

    Detection of unsolicited web browsing with clustering and statistical analysis

    Unsolicited web browsing denotes illegitimately accessing or processing web content. The harmful activity varies from extracting e-mail information to downloading an entire website for duplication. In addition, computer criminals prevent legitimate users from gaining access to websites by implementing denial of service attacks with high volumes of legitimate traffic. These offences are accomplished by preprogrammed machines that evade rate-dependent intrusion detection systems. Therefore, it is assumed in this thesis that the only difference between a legitimate and a malicious web session lies in the intention rather than in physical characteristics or network-layer information. As a result, the main aim of this research has been to provide a method of malicious-intention detection. This has been accomplished by a two-fold process. Initially, to discover the most recent and popular transitions of lawful users, a clustering method based on entropy minimisation has been introduced. In principle, by following popular transitions among the web objects, legitimate users are placed in low-entropy clusters, as opposed to undesired hosts whose transitions are uncommon and lead to placement in high-entropy clusters. In addition, by comparing the distributions of request sequences generated by actual and malicious users across the clusters, it is possible to discover whether or not a website is under attack. Secondly, a set of statistical measurements has been tested to detect the actual intention of browsing hosts. Intention classification based on Bayes factors and likelihood analysis has provided the best results. The combined approach has been validated against actual web traces (i.e. datasets) and has generated promising results
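    A minimal sketch of the entropy idea follows, under the assumption that a session is summarised by its page-to-page transitions: sessions that repeat popular transitions obtain low entropy, while hosts that sweep the site uniformly obtain high entropy. The sample sessions and the threshold separating the two groups are illustrative.

    ```python
    # Entropy of a session's transition distribution: low for human-like browsing
    # along popular paths, high for bots that wander uniformly. Values are illustrative.
    from collections import Counter
    from math import log2

    def transition_entropy(session):
        """Shannon entropy of the empirical distribution of (page -> page) transitions."""
        transitions = Counter(zip(session, session[1:]))
        total = sum(transitions.values())
        return -sum((c / total) * log2(c / total) for c in transitions.values())

    # A human-like session repeats a few popular transitions; a bot sweeps every page.
    human = ["home", "news", "article", "home", "news", "article", "home", "news"]
    bot = ["home", "a", "b", "c", "d", "e", "f", "g", "h", "i"]

    ENTROPY_THRESHOLD = 2.0   # assumed split between low- and high-entropy clusters
    for name, s in [("human", human), ("bot", bot)]:
        h = transition_entropy(s)
        print(f"{name}: entropy={h:.2f} -> {'legitimate' if h < ENTROPY_THRESHOLD else 'suspicious'}")
    ```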

    After the Gold Rush: The Boom of the Internet of Things, and the Busts of Data-Security and Privacy

    This Article addresses the impact that the lack of oversight of the Internet of Things has on digital privacy. While the Internet of Things is but one vehicle for technological innovation, it has created a broad glimpse into domestic life, thus triggering several privacy issues that the law is attempting to keep pace with. What the Internet of Things can reveal is beyond the control of the individual, as it collects information about every practical aspect of an individual’s life, and provides essentially unfettered access into the mind of its users. This Article proposes that the federal government and the state governments bend toward consumer protection while creating a cogent and predictable body of law surrounding the Internet of Things. Through privacy-by-design or self-help, it is imperative that the Internet of Things—and any of its unforeseen progeny—develop with an eye toward safeguarding individual privacy while allowing technological development