
    An enhancement of classification technique based on rough set theory for intrusion detection system application

    An Intrusion Detection System (IDS) detects unauthorized intrusions into computer systems and networks by looking for signatures of known attacks or deviations from normal activity. However, accuracy remains one of the issues in IDS applications, and classification is one of the data mining techniques employed to improve IDS performance. To address the classification performance problem, feature selection and discretization algorithms are crucial for selecting relevant attributes. Several discretization algorithms have been proposed recently; however, these discretizers can only handle categorical attributes and cannot deal with numerical ones, and it is difficult to determine the required number of intervals and their widths. Thus, to deal with huge datasets, the data mining technique can be improved by introducing a discretization algorithm that increases classification performance. Rule generation is also considered a crucial process in data mining, and the rules generated are numerous; determining which rules are important and relevant for the next process is therefore difficult. The aim of this study is to improve classification performance for IDS applications in terms of accuracy and detection rate while decreasing the false positive alarm rate. To achieve this aim, the current work proposes an enhancement of the Binning Discretization algorithm in Rough Set Theory (RST) and an enhanced rule-generation strategy in RST, both intended to improve classification performance. Both enhancements were evaluated in terms of accuracy, false positive alarm rate and detection rate against a state-of-the-practice dataset (the KDD Cup 99 dataset). Several discretization algorithms, such as Equal Frequency Binning, Entropy/MDL, Naïve and the proposed discretization, were analysed and compared in the study.
    Experimental results show that the proposed technique increases classification accuracy up to 99.95%, and that a minimal number of bins characterizes a good discretization algorithm. Consequently, the attack detection rate increases and the false positive alarm rate is minimized. In particular, the proposed algorithm achieves a satisfactory compromise between the number of cuts and classification accuracy.
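    As an illustration of the binning approach the abstract compares, the following is a minimal Equal Frequency Binning sketch in Python. It is not the authors' enhanced algorithm; the function names and the plain list-of-values input are assumptions.

```python
def equal_frequency_bins(values, n_bins):
    """Choose cut points so each bin holds roughly the same number
    of observed values (Equal Frequency Binning)."""
    ordered = sorted(float(v) for v in values)
    # Take the value at each quantile boundary between consecutive bins.
    cuts = [ordered[len(ordered) * i // n_bins] for i in range(1, n_bins)]
    return sorted(set(cuts))  # drop duplicate cuts caused by repeated values

def discretize(value, cuts):
    """Map a raw attribute value to its bin index given the cut points."""
    return sum(value >= c for c in cuts)

# Example: 12 values split into 3 bins gives cut points at 4.0 and 8.0.
cuts = equal_frequency_bins(range(12), 3)
```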

    Rough Set Discretize Classification of Intrusion Detection System

    Many pattern classification tasks, such as those on Intrusion Detection System (IDS) data, confront very high dimensional feature spaces. Rough Set theory is widely used in IDS to address this issue. Rough Set processing involves several stages, including discretization, which is a vital task in data mining, particularly for classification. Two distinct results show that the discretization stage is hugely important in both training and testing on IDS data. In the proposed framework, the discretization, reduct and rule stages are analysed to determine the significant algorithm and core elements in IDS data. Classification uses standard voting, since the approach is rule-based.
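    The standard voting mentioned above can be sketched as follows. This is an illustrative sketch only, not the authors' implementation; the rule representation as a (conditions, decision, support) triple is an assumption.

```python
def standard_voting(obj, rules):
    """Classify an object by letting every rule whose conditions all
    match cast votes for its decision class (standard voting, weighted
    here by rule support)."""
    votes = {}
    for conditions, decision, support in rules:
        # A rule fires only when every attribute-value condition matches.
        if all(obj.get(attr) == val for attr, val in conditions.items()):
            votes[decision] = votes.get(decision, 0) + support
    # Return the class with the most votes, or None if no rule fired.
    return max(votes, key=votes.get) if votes else None
```

    For example, with hypothetical connection-record rules, an object matching two "attack" rules and one "normal" rule is labelled by the side with the larger vote mass.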

    Measuring the Role of Satisfaction in Website Usability Model

    Usability is about making a website as user-friendly as possible. Website usability comprises many elements, such as learnability, navigation, interface and efficiency. Most previous research lists the elements of website usability and arranges them in hierarchical models, which makes these models complex because they consist of many elements. This paper explores existing studies on website usability models to identify their elements. In this study, the website usability construct is analysed as a second-order construct. Many studies have proposed website usability models that include satisfaction; thus, this study also validates the website usability model and analyses the mediating role of satisfaction in the relationship between website usability and intention to use. The study identified effectiveness, efficiency, learnability, navigation, content, interface design and accessibility as elements.

    How to Cultivate Cyber Security Culture? The Evidences from Literature

    Cyber Security Culture (CSC) is a culture that can produce a secure cyberspace and improve the quality of engagement with the cyber world. Despite the many benefits CSC could offer, there is a lack of models and guidelines on how to cultivate this culture. This paper discusses the concept of a CSC model in terms of the elements that form the model, in order to suggest how CSC could be cultivated. The Information Security Culture (ISC) model developed by [1] is used as a framework for discussing the concept of CSC. A literature search was also conducted to find and analyse the most suitable elements for CSC. A new CSC model is proposed as a result of this study. The findings provide a better understanding of CSC and can serve as a baseline for further research on CSC.

    A Review on Soft Computing Technique in Intrusion Detection System

    An Intrusion Detection System (IDS) is significant in network security: it detects and identifies intrusion behaviour or intrusion attempts in a computer system by monitoring and analysing network packets in real time. In recent years, intelligent algorithms applied in IDSs have received increasing attention with the rapid growth of network security. IDS data comprise a huge amount of data containing irrelevant and redundant features, causing slow training and testing, higher resource consumption and poor detection rates. Since the amount of audit data an IDS needs to examine is very large even for a small network, classification by hand is impossible. Hence, the primary objective of this review is to survey the techniques applied prior to the classification process that suit IDS data.

    Rough set discretization: equal frequency binning, entropy/MDL and semi naives algorithms of intrusion detection system

    Discretization of real-valued attributes is a vital task in data mining, particularly in classification, and the discretization step is crucial to good classification. Empirical results have shown that the quality of classification methods depends on the discretization algorithm used in the preprocessing step. Generally, discretization is the process of searching for a partition of attribute domains into intervals and unifying the values within each interval. A discretization technique suited to Intrusion Detection System (IDS) data needs to be determined within the IDS framework, since IDS data consist of huge numbers of records that the system must examine. Many Rough Set discretization techniques can be used; among them are Semi Naïve and Equal Frequency Binning.
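    For contrast with the frequency-based binning above, the greedy step underlying entropy/MDL-style discretizers can be sketched as below: choose the candidate cut point that minimizes the weighted class entropy of the two resulting intervals. This is an illustrative sketch under assumed names and data layout, not the exact algorithm evaluated in the study.

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def best_entropy_cut(values, labels):
    """Return the candidate cut point minimizing the weighted class
    entropy of the two resulting intervals (the greedy step used by
    entropy-based discretizers)."""
    pairs = sorted(zip(values, labels))
    best_cut, best_score = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i - 1][0] == pairs[i][0]:
            continue  # no boundary between equal attribute values
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut
```

    A full entropy/MDL discretizer applies this step recursively and stops splitting when the Minimum Description Length criterion no longer justifies a new cut.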

    Rough Set Significant Reduct and Rules of Intrusion Detection System

    Get PDF
    An Intrusion Detection System deals with a huge amount of data containing irrelevant and redundant features, which causes slow training and testing, higher resource consumption and poor detection rates. These irrelevant or redundant features cannot simply be removed, since doing so can deteriorate classifier performance; rather, by choosing effective and important features, the classification model and its performance can be improved. Rough Set is the most widely used baseline technique in single-classifier approaches to intrusion detection. Rough Set is an efficient instrument for dealing with huge datasets, handling missing values and granulating features. However, the large numbers of generated feature reducts and rules must be chosen cautiously to reduce the processing power needed to handle massive numbers of parameters during classification. Hence, the primary objective of this study is to identify the significant reducts and rules prior to the classification process of an Intrusion Detection System. Comprehensive analyses are presented to eliminate insignificant attributes, reducts and rules for a better classification taxonomy. Reducts with core attributes and minimal cardinality are preferred for constructing a new decision table, and subsequently generate high classification rates. In addition, rules with the highest support, shorter length, high Rule Importance Measure (RIM) and high coverage are favoured, since they yield high-quality performance. The results are compared in terms of classification accuracy between the original decision table and the new decision table.
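    The rule preference described above (high support, short length) can be sketched as a simple pruning step. This is an illustrative sketch only, not the authors' procedure; the dictionary rule representation is an assumption, and RIM and coverage are omitted for brevity.

```python
def prune_rules(rules, min_support, max_length):
    """Keep only rules with sufficient support and few conditions,
    ordered by support (descending), then by length (ascending), so
    strong, short rules are preferred for classification."""
    kept = [r for r in rules
            if r["support"] >= min_support
            and len(r["conditions"]) <= max_length]
    return sorted(kept, key=lambda r: (-r["support"], len(r["conditions"])))
```

    In a fuller implementation, the sort key would also incorporate RIM and rule coverage alongside support and length.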
