    Anomalous Payload-Based Network Intrusion Detection

    We present a payload-based anomaly detector for intrusion detection, which we call PAYL. PAYL models the normal application payload of network traffic in a fully automatic, unsupervised and very efficient fashion. During a training phase, we first compute a profile of the byte frequency distribution and its standard deviation for the application payload flowing to a single host and port. During the detection phase, we then use the Mahalanobis distance to measure the similarity of new data to the pre-computed profile. The detector compares this measure against a threshold and generates an alert when the distance of the new input exceeds it. We demonstrate the surprising effectiveness of the method on the 1999 DARPA IDS dataset and on a live dataset we collected on the Columbia CS department network. In one case, nearly 100% accuracy is achieved with a 0.1% false positive rate for port 80 traffic.
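
    The detection step lends itself to a compact illustration. The following Python sketch is our own minimal reconstruction, not the authors' code (the smoothing constant and alert threshold are invented): it trains a per-byte frequency profile and scores new payloads with the simplified Mahalanobis distance typically used in this setting.

        import numpy as np

        def byte_frequency(payload: bytes) -> np.ndarray:
            """Relative frequency of each of the 256 byte values in a payload."""
            counts = np.bincount(np.frombuffer(payload, dtype=np.uint8), minlength=256)
            return counts / max(len(payload), 1)

        def train_profile(payloads):
            """Per-byte mean and standard deviation over normal training traffic."""
            freqs = np.stack([byte_frequency(p) for p in payloads])
            return freqs.mean(axis=0), freqs.std(axis=0)

        def payl_distance(payload, mean, std, smoothing=0.001):
            """Simplified Mahalanobis distance: sum_i |x_i - mean_i| / (std_i + alpha)."""
            return float((np.abs(byte_frequency(payload) - mean) / (std + smoothing)).sum())

        # Train on normal port-80 payloads, then alert on anything past a threshold.
        mean, std = train_profile([b"GET /index.html HTTP/1.0\r\n",
                                   b"GET /about.html HTTP/1.0\r\n"])
        alert = payl_distance(b"\x90\x90\x90\x90\x90\x90", mean, std) > 256.0  # threshold is illustrative

    One profile per (host, port) pair keeps the model small, and scoring costs a single pass over the payload, which is what makes this kind of detector efficient enough for live traffic.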

    Ray-optical refraction with confocal lenslet arrays

    Two parallel lenslet arrays with focal lengths f1 and f2 that share a common focal plane (that is, which are separated by a distance f1 + f2) can refract transmitted light rays according to Snell's law, but with the sines replaced by tangents. This holds for a limited range of input angles and under certain other conditions. Such confocal lenslet arrays can therefore simulate the interface between optical media with different refractive indices, n1 and n2, whereby the ratio η = −f2/f1 plays the role of the refractive-index ratio n2/n1. Suitable choices of focal lengths enable both positive and negative refraction. In contrast to Snell's law, which leads to nontrivial geometric imaging by a planar refractive-index interface only in the special case n1 = ±n2, the modified refraction law leads to geometric imaging by planar confocal lenslet arrays for any value of η. We illustrate some of the properties of confocal lenslet arrays with images rendered using ray-tracing software.
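
    Written out, the modified refraction law described above is Snell's law with tangents (a sketch of the relation as stated in the abstract; the sign convention for the refracted angle is our assumption):

        n_1 \tan\theta_1 = n_2 \tan\theta_2
        \quad\Longrightarrow\quad
        \tan\theta_2 = \frac{\tan\theta_1}{\eta},
        \qquad \eta = -\frac{f_2}{f_1} \;\leftrightarrow\; \frac{n_2}{n_1}

    A negative η, obtained when the two focal lengths have the same sign, flips the sign of tan θ2 and thus yields negative refraction.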

    Evolving high-speed, easy-to-understand network intrusion detection rules with genetic programming

    Proceeding of: EvoWorkshops 2009: EvoCOMNET, EvoENVIRONMENT, EvoFIN, EvoGAMES, EvoHOT, EvoIASP, EvoINTERACTION, EvoMUSART, EvoNUM, EvoSTOC, EvoTRANSLOG, Tübingen, Germany, April 15-17, 2009.

    An ever-present problem in intrusion detection technology is how to construct the patterns of (good, bad or anomalous) behaviour upon which an engine has to make decisions regarding the nature of the activity observed in a system. This has traditionally been one of the central areas of research in the field, and most of the solutions proposed so far have relied in one way or another upon some form of data mining, with the exception, of course, of human-constructed patterns. In this paper, we explore the use of Genetic Programming (GP) for this purpose. Our approach is not entirely new, as GP has already been partially explored in the past. Here we show that GP can offer at least two advantages over other classical mechanisms: it can produce very lightweight detection rules (something of extreme importance for high-speed networks or resource-constrained applications), and the simplicity of the patterns generated makes it easy to understand the semantics of the underlying attack.
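
    To make concrete why GP-evolved rules can be both fast and readable, here is a minimal Python sketch (invented for this summary, not taken from the paper) of the kind of boolean expression tree GP typically evolves over per-connection features; the feature names and thresholds are hypothetical.

        # A GP individual is an expression tree; this hand-written example stands in
        # for one that evolution might produce over NetFlow-style features.
        rule = ("and",
                ("gt", "src_bytes", 1480),
                ("lt", "duration", 1.0))

        def evaluate(node, features):
            """Recursively evaluate a rule tree against one connection record."""
            op = node[0]
            if op == "and":
                return evaluate(node[1], features) and evaluate(node[2], features)
            if op == "or":
                return evaluate(node[1], features) or evaluate(node[2], features)
            if op == "gt":
                return features[node[1]] > node[2]
            if op == "lt":
                return features[node[1]] < node[2]
            raise ValueError(f"unknown operator: {op}")

        print(evaluate(rule, {"src_bytes": 4096, "duration": 0.2}))  # True -> alert

    Evaluating such a rule costs only a handful of comparisons per connection, and the tree reads directly as a description of the suspicious behaviour: precisely the two advantages the paper highlights.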

    Effectiveness evaluation of data mining based IDS

    Proceeding of: 6th Industrial Conference on Data Mining, ICDM 2006, Leipzig, Germany, July 14-15, 2006.

    Data mining has been widely applied to the problem of intrusion detection in computer networks. However, misconceptions about the underlying problem have led to results that are out of context. This paper shows that factors such as the probability of intrusion and the costs of responding to detected intrusions must be taken into account in order to compare the effectiveness of machine learning algorithms over the intrusion detection domain. Furthermore, we show the advantages of combining different detection techniques. Results on the well-known 1999 KDD dataset are presented.
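
    The paper's central point can be made concrete with a small calculation. The Python sketch below is our illustration (all rates and costs are invented): it computes the expected per-event cost of operating a detector, the kind of comparison the authors argue should replace raw accuracy.

        def expected_cost(p_intrusion, tpr, fpr, c_miss, c_response):
            """Expected per-event cost of operating a detector.

            p_intrusion: prior probability that an event is an intrusion
            tpr/fpr:     the detector's true/false positive rates
            c_miss:      damage caused by an undetected intrusion
            c_response:  cost of responding to an alert (justified or not)
            """
            misses = p_intrusion * (1 - tpr) * c_miss
            responses = (p_intrusion * tpr + (1 - p_intrusion) * fpr) * c_response
            return misses + responses

        # All numbers invented for illustration.
        print(expected_cost(p_intrusion=0.001, tpr=0.99, fpr=0.05, c_miss=1000, c_response=10))
        print(expected_cost(p_intrusion=0.001, tpr=0.90, fpr=0.001, c_miss=1000, c_response=10))

    With intrusions rare, the second detector is cheaper to operate despite missing more attacks, because false-alarm handling dominates the cost; an accuracy-only comparison cannot see this.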

    Network Anomaly Classification by Support Vector Classifiers Ensemble and Non-linear Projection Techniques

    Network anomaly detection is currently a challenge due to the growth of the Internet and the number of different attacks and potential attackers. Intrusion detection systems aim to detect misuses or network anomalies in order to block ports or connections, whereas firewalls act according to a predefined set of rules. However, identifying the specific anomaly provides valuable information about the attacker that may be used to further protect the system, or to react accordingly. In this paper we present an intrusion detection technique that uses an ensemble of support vector classifiers and dimensionality reduction techniques to generate a set of discriminant features. The results obtained on the NSL-KDD dataset outperform previously reported classification rates.
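
    A minimal Python sketch of this kind of pipeline with scikit-learn follows. It is a stand-in, since the abstract does not name the exact projection or ensemble scheme: kernel PCA serves as the non-linear projection and a bagged set of SVCs as the ensemble, and all hyperparameters are illustrative.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import KernelPCA
        from sklearn.ensemble import BaggingClassifier
        from sklearn.svm import SVC

        # Non-linear projection to a discriminant feature space, then an SVC ensemble.
        model = make_pipeline(
            StandardScaler(),
            KernelPCA(n_components=20, kernel="rbf"),
            BaggingClassifier(estimator=SVC(kernel="rbf"), n_estimators=10),
        )

        # Synthetic stand-in for NSL-KDD-style records (41 numeric features per record).
        X = np.random.rand(200, 41)
        y = np.random.randint(0, 2, 200)
        model.fit(X, y)
        print(model.predict(X[:5]))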

    Inhalation exposure methodology.

    Modern man is being confronted with an ever-increasing inventory of potentially toxic airborne substances. Exposures to these atmospheric contaminants occur in residential and commercial settings, as well as in the workplace. In order to study the toxicity of such materials, a special technology relating to inhalation exposure systems has evolved. The purpose of this paper is to provide a description of the techniques used to expose laboratory subjects to airborne particles and gases. The various modes of inhalation exposure (whole body, head only, nose or mouth only, etc.) are described at length, including the advantages and disadvantages inherent in each mode. Numerous literature citations are included for further reading. Among the topics briefly discussed are the selection of appropriate animal species for toxicological testing and the types of inhalation studies performed (acute, chronic, etc.).

    Dissolved noble gases and stable isotopes as tracers of preferential fluid flow along faults in the Lower Rhine Embayment, Germany

    Groundwater in shallow unconsolidated sedimentary aquifers close to the Bornheim fault in the Lower Rhine Embayment (LRE), Germany, has relatively low δ²H and δ¹⁸O values in comparison to regional modern groundwater recharge, and ⁴He concentrations up to 1.7 × 10⁻⁴ cm³ (STP) g⁻¹ ± 2.2%, which is approximately four orders of magnitude higher than expected from solubility equilibrium with the atmosphere. Groundwater age dating based on estimated in situ production and terrigenic flux of helium provides a groundwater residence time of ∼10⁷ years. Although fluid exchange between the deep basal aquifer system and the upper aquifer layers is generally impeded by confining clay layers and lignite, this study's geochemical data suggest, for the first time, that deep circulating fluids penetrate shallow aquifers in the vicinity of fault zones, implying that sub-vertical fluid flow occurs along faults in the LRE. However, large hydraulic-head gradients observed across many faults suggest that they act as barriers to lateral groundwater flow. Therefore, the geochemical data reported here also substantiate a conduit-barrier model of fault-zone hydrogeology in unconsolidated sedimentary deposits, as well as corroborating the concept that faults in unconsolidated aquifer systems can act as loci for hydraulic connectivity between deep and shallow aquifers. The implications of fluid flow along faults in sedimentary basins worldwide are far-reaching and of particular concern for carbon capture and storage (CCS) programmes, impacts of deep shale gas recovery on shallow groundwater aquifers, and nuclear waste storage sites, where fault zones could act as potential leakage pathways for hazardous fluids.
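
    As a rough check on the quoted residence time, the helium accumulation age follows from dividing the terrigenic ⁴He excess by an accumulation rate; assuming, purely for illustration, a combined in situ production plus basal flux of order 10⁻¹¹ cm³ STP g⁻¹ yr⁻¹ (a typical crustal value, not a figure taken from the paper):

        t \;\approx\; \frac{[{}^{4}\mathrm{He}]_{\mathrm{terr}}}{J_{\mathrm{He}}}
          \;\approx\; \frac{1.7 \times 10^{-4}\ \mathrm{cm^{3}\,STP\,g^{-1}}}
                           {10^{-11}\ \mathrm{cm^{3}\,STP\,g^{-1}\,yr^{-1}}}
          \;\approx\; 2 \times 10^{7}\ \mathrm{yr}

    which reproduces the ∼10⁷-year order of magnitude reported above.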