
    Threshold Verification Technique for Network Intrusion Detection System

    The Internet plays a vital role in the modern world, and the possibilities and opportunities it offers are limitless. Despite all the hype, Internet services are liable to intrusion attacks that could tamper with the confidentiality and integrity of important information. An attack starts with gathering information about the target, and this information-gathering activity can be carried out as either a fast or a slow attack. The defensive measure a network administrator can take against this liability is to introduce Intrusion Detection Systems (IDSs) into the network. An IDS has the capability to analyze network traffic and recognize incoming and ongoing intrusions. Unfortunately, combining both detection modules in real-time network traffic slows down the detection process. In a real-time network, early detection of a fast attack can prevent further attacks and reduce unauthorized access to the targeted machine. A suitable set of selected features and the correct threshold value give an IDS an extra advantage in detecting anomalies in the network. Therefore, this paper discusses a new technique for selecting a static threshold value from a minimal set of standard features for detecting fast attacks from the victim's perspective. To increase confidence in the threshold value, the result is verified using Statistical Process Control (SPC). The implementation of this approach shows that the selected threshold is suitable for identifying fast attacks in real time. Comment: 8 pages, International Journal of Computer Science and Information Security
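    A static threshold verified with SPC control limits might look like the following minimal sketch. The feature (connections per second seen by a victim host), the sample values, and the three-sigma factor are illustrative assumptions, not the paper's actual dataset or parameters.

```python
# Sketch: deriving a static detection threshold from an SPC upper control
# limit (mean + k * standard deviation) over normal-traffic observations.
import statistics

def spc_threshold(samples, sigma_factor=3.0):
    """Upper control limit: mean + sigma_factor * population std deviation."""
    mean = statistics.mean(samples)
    stdev = statistics.pstdev(samples)
    return mean + sigma_factor * stdev

def is_fast_attack(rate, threshold):
    """Flag a host as a fast-attack target if its rate exceeds the limit."""
    return rate > threshold

# Connections per second observed under normal traffic (illustrative values).
normal_rates = [3, 5, 4, 6, 2, 5, 4, 3, 6, 5]
threshold = spc_threshold(normal_rates)

print(is_fast_attack(40, threshold))  # a sudden burst of connections
print(is_fast_attack(5, threshold))   # ordinary traffic
```

Any observation beyond the upper control limit is treated as out of statistical control and therefore as a candidate fast attack.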

    Indicators and methods for assessing the quality of logistic activity processes

    Purpose: This article aims to identify and evaluate the quality and safety indicators of processes in the logistics system and to solve the problems of product control in the goods' distribution process. Design/Methodology/Approach: To assess the risks and the quality of control methods in goods' distribution processes, studies were carried out on the grain supply process, in which the risk assessment was tested using a fault tree with a qualitative, deductive-logic approach; this allowed us to identify events at the lower levels of the system. To evaluate the results when comparing various methods of monitoring product characteristics in the distribution process, certain statistical tools were used. The evaluation with comparative tests is required in order to determine how to measure products in the goods' distribution logistics system. The study uses the methods of formalization, analysis, measurement, experiment, and comparison. Findings: The considered risk assessment method and the given example allow us to recommend its use for product distribution processes for various purposes. A technique is proposed for comparing various control methods based on statistical tools, which can be recommended for various goods' distribution operations. Practical implications: The results of the study can be applied in practice to improve the quality of goods' distribution processes and reduce risks in various supply chains. Originality/value: The main contribution of this study is to shift the emphasis in the assessment of goods' distribution processes towards a risk-based approach and the use of various statistical tools in logistics activities. Peer-reviewed.
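    The deductive fault-tree evaluation the abstract describes can be sketched as Boolean gates over basic events. The event names for a grain-supply chain are illustrative assumptions, not the study's actual tree.

```python
# Sketch: qualitative fault-tree evaluation with AND/OR gates, working
# deductively from a top event down to basic events at the lower levels.

def or_gate(*events):
    # An intermediate event occurs if any input event occurs.
    return any(events)

def and_gate(*events):
    # An intermediate event occurs only if all input events occur.
    return all(events)

# Basic events observed at the lower levels of the system (illustrative).
moisture_out_of_spec = True
seal_broken = False
sensor_failed = False

# Deductive decomposition: spoilage (top event) requires both a quality
# deviation and a containment failure that went undetected.
containment_failure = and_gate(or_gate(seal_broken, sensor_failed),
                               moisture_out_of_spec)
top_event = or_gate(containment_failure)
print(top_event)  # False: the seal held and the sensor worked
```

Walking the tree this way shows which minimal combinations of basic events are sufficient to trigger the top event.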

    Improving Knowledge-Based Systems with statistical techniques, text mining, and neural networks for non-technical loss detection

    Currently, power distribution companies face several problems related to energy losses. For example, the energy used might not be billed due to illegal manipulation or a breakdown in the customer's measurement equipment. These types of losses are called non-technical losses (NTLs), and they are usually greater than the losses due to the distribution infrastructure (technical losses). Traditionally, a large number of studies have used data mining to detect NTLs, but to the best of our knowledge, there are no studies involving a Knowledge-Based System (KBS) created from the knowledge and expertise of the inspectors. In the present study, a KBS was built that is based on the knowledge and expertise of the inspectors and that uses text mining, neural networks, and statistical techniques for the detection of NTLs. These techniques were used to extract information from samples, and this information was translated into rules, which were joined with the rules generated from the inspectors' knowledge. The system was tested with real samples extracted from Endesa databases. Endesa is one of the most important distribution companies in Spain and plays an important role in international markets in both Europe and South America, with more than 73 million customers.
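    The joining of inspector-derived rules with rules learned from data, as the KBS does, can be sketched as below. The thresholds, field names, and customer records are illustrative assumptions, not Endesa data or the paper's actual rule base.

```python
# Sketch: a knowledge base that unions expert rules with a rule translated
# from statistical analysis, flagging customers for NTL inspection.

def inspector_rule(customer):
    # Expert knowledge: a meter reporting zero consumption on an active
    # contract is suspicious.
    return customer["active"] and customer["kwh"] == 0

def statistical_rule(customer, history_mean):
    # Rule derived from statistical analysis of samples: consumption
    # falling below 30% of the customer's historical mean is suspicious.
    return customer["kwh"] < 0.3 * history_mean

def flag_for_inspection(customer, history_mean):
    # The KBS joins both rule sets: any firing rule flags the customer.
    return inspector_rule(customer) or statistical_rule(customer, history_mean)

customer = {"active": True, "kwh": 20.0}
print(flag_for_inspection(customer, history_mean=100.0))  # True: sharp drop
```

In the paper's pipeline the learned rules come from text mining and neural networks over inspection reports; here a single threshold rule stands in for that machinery.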

    Using Ontologies for the Design of Data Warehouses

    Obtaining an implementation of a data warehouse is a complex task that forces designers to acquire wide knowledge of the domain, thus requiring a high level of expertise and making it a prone-to-fail task. Based on our experience, we have identified a set of situations we have faced in real-world projects in which we believe that the use of ontologies would improve several aspects of data warehouse design. The aim of this article is to describe several shortcomings of current data warehouse design approaches and to discuss the benefit of using ontologies to overcome them. This work is a starting point for discussing the convenience of using ontologies in data warehouse design. Comment: 15 pages, 2 figures

    Computer-Aided System for Wind Turbine Data Analysis

    Context: Current work on wind turbine failure detection focuses on researching suitable signal-processing algorithms and developing efficient diagnosis algorithms. The laboratory research involves large and complex data, and it can be a daunting task. Aims: To develop a computer-aided system for assisting experts in conducting efficient laboratory research on wind turbine data analysis. The system is expected to provide data visualization, data manipulation, massive data processing, and wind turbine failure detection. Method: 50 GB of off-line SCADA data and four proven diagnosis algorithms were used in this project. Apart from the supervisor's instructions, this project also received help from two experts from the Engineering Department. Java and a Microsoft SQL database were used to develop the system. Results: Data visualization provided six different charting solutions together with robust user interaction. Four failure diagnosis solutions and data manipulation were provided in the system. In addition, a dedicated database server and a Matlab API with Java RMI were used to resolve the massive data processing problem. Conclusions: Almost all of the deliverables were completed. A friendly GUI and useful functionalities make the system comfortable to use, and the final product does enable experts to conduct efficient laboratory research. The project also identified some potential extensions of the system.

    Second CLIPS Conference Proceedings, volume 1

    Topics covered at the 2nd CLIPS Conference, held at the Johnson Space Center, September 23-25, 1991, are given. Topics include rule groupings, fault detection using expert systems, decision making using expert systems, knowledge representation, and computer-aided design and debugging of expert systems.