22 research outputs found

    Engineering Model-Based Adaptive Software Systems

    Adaptive software systems are able to cope with changes in their environment by self-adjusting their structure and behavior. Robustness refers to the ability of such systems to deal with uncertainty, i.e., perturbations (e.g., Denial of Service attacks) or unmodeled system dynamics (e.g., independent cloud applications hosted on the same physical machine) that can affect the quality of the adaptation. To build robust adaptive systems we need models that accurately describe the managed system, as well as methods for reacting to different types of change. In this thesis we introduce techniques that help an engineer design adaptive systems for web applications. We describe methods to model web applications deployed in the cloud in a way that accounts for cloud variability, and to keep the model synchronized with the actual system at runtime. Using this model, we present methods to optimize the deployed architecture at design time and runtime, uncover bottlenecks and the workloads that saturate them, and maintain the service level objective by changing the quantity of available resources (under regular operating conditions or during a Denial of Service attack). We validate the proposed contributions through experiments performed on Amazon EC2 and in simulators. The types of applications that benefit the most from our contributions are web-based information systems deployed in the cloud.
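The SLO-maintenance idea described above can be sketched as a simple planning step in an adaptation loop. This is an illustrative sketch, not the thesis's actual model: the function name, thresholds, and proportional-scaling rule are all assumptions made for the example.

```python
# Hypothetical sketch of an SLO-driven adaptation step: compare measured
# response time against the service level objective (SLO) and decide how
# many instances the managed system should converge to. All thresholds
# and the proportional rule are illustrative, not the thesis's model.

def plan_capacity(current_instances, measured_latency_ms, slo_ms,
                  min_instances=1, max_instances=20):
    """Return the instance count the adaptation loop should converge to."""
    if measured_latency_ms > slo_ms:
        # SLO violated: add capacity proportionally to the overshoot.
        overshoot = measured_latency_ms / slo_ms
        target = int(current_instances * overshoot + 0.5)
    elif measured_latency_ms < 0.5 * slo_ms:
        # Large headroom: release one instance to save cost.
        target = current_instances - 1
    else:
        # Within the target band: keep the current capacity.
        target = current_instances
    return max(min_instances, min(max_instances, target))
```

In a full system this decision would be driven by the runtime model of the application rather than raw latency alone, so the model can distinguish a genuine load increase from cloud variability.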

    Security Configuration Management in Intrusion Detection and Prevention Systems

    Intrusion Detection and/or Prevention Systems (IDPS) represent an important line of defense against a variety of attacks that can compromise the security and proper functioning of an enterprise information system. IDPSs can be network- or host-based and can collaborate in order to provide better detection of malicious traffic. Although several IDPS systems have been proposed, their appropriate configuration and control for effective detection/prevention of attacks and efficient resource consumption is still far from trivial. Another concern is the slowdown of system performance when maximum security is applied, hence the need to trade off between security enforcement levels and the performance and usability of an enterprise information system. In this dissertation, we present a security management framework for the configuration and control of the security enforcement mechanisms of an enterprise information system. The approach leverages the dynamic adaptation of security measures based on the assessment of system vulnerability and threat prediction, and provides several levels of attack containment. Furthermore, we study the impact of security enforcement levels on the performance and usability of an enterprise information system. In particular, we analyze the impact of an IDPS configuration on the resulting security of the network and on the network performance. We also analyze the performance of the IDPS for different configurations and under different traffic characteristics. This analysis can then be used to predict the impact of a given security configuration on network performance.
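The dynamic-adaptation idea above can be illustrated with a toy policy that maps a vulnerability assessment and a threat prediction onto an enforcement level. The level names, the weighted risk score, and the quartile mapping are all hypothetical, chosen only to make the trade-off between security and performance concrete.

```python
# Illustrative sketch (all names and weights hypothetical) of dynamic
# security adaptation: pick an IDPS enforcement level from a vulnerability
# score and a predicted threat level. Stricter levels detect more but
# cost more performance, which is the trade-off studied in the thesis.

ENFORCEMENT_LEVELS = ["monitor_only", "selective_inspection",
                      "full_inspection", "containment"]

def select_enforcement(vulnerability, threat):
    """vulnerability and threat are scores in [0, 1]."""
    risk = 0.5 * vulnerability + 0.5 * threat   # simple weighted risk score
    # Map risk quartiles onto increasingly strict enforcement levels.
    index = min(int(risk * 4), 3)
    return ENFORCEMENT_LEVELS[index]
```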

    A Novel Puzzle-Based Framework for Mitigating Distributed Denial of Service Attacks Against Internet Applications

    Cryptographic puzzles are promising techniques for mitigating DDoS attacks by decreasing the incoming rate of service-eligible requests. However, existing cryptographic puzzle techniques have several shortcomings that make them less appealing as a tool of choice for DDoS defense. These shortcomings include: (1) the lack of accurate models for dynamically determining puzzle hardness; (2) the lack of an efficient and effective counter mechanism for puzzle solution replay attacks; and (3) the wastefulness of the puzzle computations in terms of the clients' computational resources. In this thesis, we provide a puzzle-based DDoS defense framework that addresses these shortcomings. Our puzzle framework includes three novel puzzle mechanisms. The first mechanism, called Puzzle+, provides a mathematical model of per-request puzzle hardness. Through extensive experimental study, we show that this model optimizes the effectiveness of puzzle-based DDoS mitigation while enabling tight control over the server utilization. In addition, Puzzle+ disables puzzle solution replay attacks by utilizing a novel cache algorithm to detect replays. The second puzzle mechanism, called Productive Puzzles, alleviates the wastefulness of computational puzzles by transforming the puzzle computations into computations of meaningful tasks that provide utility. Our third puzzle mechanism, called Guided Tour Puzzles, eliminates the wasteful puzzle computations altogether and adopts a novel delay-based puzzle construction. In addition, it is not affected by the disparity in the computational resources of the client machines that perform the puzzle computations. Through measurement analysis on real network testbeds as well as extensive simulation study, we show that both Productive Puzzles and Guided Tour Puzzles achieve effective mitigation of DDoS attacks while satisfying the requirement of no wasteful computation.
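To make the "puzzle hardness" knob concrete, here is the standard hash-reversal client puzzle: the server issues a nonce, and the client must find a value whose hash has k leading zero bits, so expected work doubles with each increment of k. This is the generic construction only; the thesis's Puzzle+ model for choosing k per request, its replay cache, and the Productive/Guided Tour variants are not reproduced here.

```python
import hashlib

# Standard hash-reversal client puzzle (generic construction, shown only
# to illustrate the tunable-hardness idea; not the thesis's Puzzle+ model).
# The client must find x such that SHA-256(nonce || x) has at least k
# leading zero bits; expected client work is ~2**k hash evaluations,
# while server-side verification is a single hash.

def leading_zero_bits(digest: bytes) -> int:
    """Count the number of leading zero bits in a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def solve_puzzle(nonce: bytes, k: int) -> int:
    """Brute-force a solution x for hardness level k (client side)."""
    x = 0
    while True:
        digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= k:
            return x
        x += 1

def verify(nonce: bytes, k: int, x: int) -> bool:
    """Check a claimed solution with one hash (server side)."""
    digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= k
```

Note the asymmetry this construction gives the defender: the server spends one hash per verification regardless of k, while the attacker's cost grows exponentially in k.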
Lastly, we introduce a novel queue management algorithm, called Stochastic Fair Drop Queue (SFDQ), to further strengthen the DDoS protection provided by the puzzle framework. SFDQ is not only effective against DDoS attacks at multiple layers of the protocol stack, but is also simple to configure and deploy. SFDQ is implemented over a novel data structure, called Indexed Linked List, which provides enqueue, dequeue, and remove operations with O(1) time complexity.
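The O(1) claim for all three operations can be sketched with the usual combination of a doubly linked list and a key-to-node hash map: the map makes removal by key constant-time, while the list preserves FIFO order. This is a minimal sketch of that general idea; the actual Indexed Linked List and SFDQ details from the thesis are not reproduced.

```python
# Sketch of an "indexed linked list": a doubly linked list plus a
# key -> node dictionary, so enqueue (append at tail), dequeue (pop head),
# and remove-by-key all run in O(1). Illustrative only; not the thesis's
# actual SFDQ implementation.

class _Node:
    __slots__ = ("key", "prev", "next")
    def __init__(self, key):
        self.key, self.prev, self.next = key, None, None

class IndexedLinkedList:
    def __init__(self):
        self.head = self.tail = None
        self.index = {}                 # key -> _Node, O(1) lookup

    def enqueue(self, key):
        node = _Node(key)
        self.index[key] = node
        if self.tail is None:           # empty list
            self.head = self.tail = node
        else:
            node.prev, self.tail.next = self.tail, node
            self.tail = node

    def dequeue(self):
        if self.head is None:
            return None
        return self.remove(self.head.key)

    def remove(self, key):
        # Unlink the node in O(1) using the index, fixing head/tail links.
        node = self.index.pop(key)
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev
        else:
            self.tail = node.prev
        return node.key
```

A queue with O(1) remove-by-key is exactly what a fair-drop policy needs: when a flow exceeds its share, its queued requests can be evicted without scanning the queue.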

    Feature Extraction Using Random Matrix Theory

    Representing complex data in a concise and accurate way is a key stage in the data mining methodology. Redundant and noisy data affect the generalization power of any classification algorithm, undermine the results of any clustering algorithm, and encumber the monitoring of large dynamic systems. This work provides several efficient approaches to all the aforementioned aspects of the analysis. We established that a notable difference can be made if results from the theory of ensembles of random matrices are employed. A particularly important result of our study is a discovered family of methods based on projecting the data set onto different subsets of the correlation spectrum. Generally, we start with the traditional correlation matrix of a given data set. We perform singular value decomposition and establish boundaries between the essential and unimportant eigen-components of the spectrum. Then, depending on the nature of the problem at hand, we use either the former or the latter part for the projection. Projecting the spectrum of interest is a common technique in linear and non-linear spectral methods such as Principal Component Analysis, Independent Component Analysis, and Kernel Principal Component Analysis. Usually the part of the spectrum to project is defined by the amount of variance of the overall data, or of the feature space in the non-linear case. The applicability of these spectral methods is limited by the assumption that larger variance carries the important dynamics, i.e., that the data has a high signal-to-noise ratio. If this is true, projection of principal components targets two problems in data mining: reduction in the number of features and selection of the more important features. Our methodology does not make the assumption of a high signal-to-noise ratio; instead, using the rigorous instruments of Random Matrix Theory (RMT), it identifies the presence of noise and establishes its boundaries.
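The noise-boundary step can be sketched with the standard Marchenko-Pastur result from RMT: for an N x T matrix of i.i.d. noise, the eigenvalues of the empirical correlation matrix fall inside [(1 - sqrt(N/T))^2, (1 + sqrt(N/T))^2] as N, T grow, so eigenvalues above the upper edge are treated as signal. The function below is a minimal sketch of that separation, not the thesis's full procedure.

```python
import numpy as np

# Sketch of the RMT noise boundary: for an N x T data matrix of pure
# i.i.d. noise, the Marchenko-Pastur law bounds the correlation
# eigenvalues by [(1 - sqrt(N/T))**2, (1 + sqrt(N/T))**2]. Eigenvalues
# above the upper edge are flagged as signal (non-random) components.

def split_spectrum(data):
    """data: N x T array (N variables as rows, T observations as columns).
    Returns eigenvalues (ascending), eigenvectors, and a boolean mask
    marking eigenvalues above the Marchenko-Pastur upper edge."""
    n, t = data.shape
    corr = np.corrcoef(data)                 # N x N correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)  # ascending eigenvalues
    q = n / t
    upper = (1 + np.sqrt(q)) ** 2            # Marchenko-Pastur upper edge
    signal = eigvals > upper
    return eigvals, eigvecs, signal
```

Projecting the data onto the eigenvectors where the mask is True (or False) then gives the signal (or noise) part of the spectrum, matching the two uses described in the abstract.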
Knowledge of the structure of the spectrum gives us the possibility to make more insightful projections. For instance, in the application to router network traffic, the reconstruction-error procedure for anomaly detection is based on the projection of the noisy part of the spectrum, whereas in the bioinformatics application of clustering different types of leukemia, implicit denoising of the correlation matrix is achieved by decomposing the spectrum into random and non-random parts. For temporal high-dimensional data, the spectrum and eigenvectors of its correlation matrix are another representation of the data. Thus, eigenvalues, components of the eigenvectors, the inverse participation ratio of eigenvector components, and other operators of eigen-analysis are spectral features of a dynamic system. In our work we proposed to extract spectral features using RMT. We demonstrated that with the extracted spectral features we can monitor the changing dynamics of network traffic. Experimenting with the delayed correlation matrices of network traffic and extracting their spectral features, we visualized the delayed processes in the system. We demonstrated in our work that a broad range of applications in feature extraction can benefit from the novel RMT-based approach to the spectral representation of the data.
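One of the spectral features named above, the inverse participation ratio (IPR), has a standard definition: for a unit-norm eigenvector v, IPR = sum_i v_i^4. It measures how localized an eigenvector is: a vector spread uniformly over N components gives 1/N, while a vector concentrated on a single component gives 1.

```python
import numpy as np

# Inverse participation ratio (standard definition) for an eigenvector v:
#   IPR(v) = sum_i v_i**4   (with v normalized to unit length).
# Uniformly spread over N components -> 1/N; fully localized -> 1.

def inverse_participation_ratio(v):
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)        # ensure unit norm
    return float(np.sum(v ** 4))
```

Tracking the IPR of the leading eigenvectors over time is one way such spectral features can reveal whether a few variables (e.g., a handful of traffic flows) dominate the system's dynamics.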

    Identifying Malicious Activities in Honeynets using Clustering
