173 research outputs found

    A Survey on Data Integration in Data Warehouse

    Data warehousing embraces the technology of integrating data from multiple distributed data sources and using that data in an annotated and aggregated form to support business decision-making and enterprise management. Although many techniques have been revisited or newly developed in the context of data warehouses, such as view maintenance and OLAP, little attention has been paid to data mining techniques for supporting the most important and costly tasks of data integration in data warehouse design.

    Survey of Current Network Intrusion Detection Techniques

    The significance of network security has grown enormously, and a number of devices have been introduced to improve the security of a network. NIDS is a retrofit approach for providing a sense of security in existing computers and data networks while allowing them to operate in their current open mode. The goal of a network intrusion detection system is to identify, preferably in real time, unauthorized use, misuse and abuse of computer systems by insiders as well as outside perpetrators. This paper presents a taxonomy of intrusion detection systems, which is then used to survey and identify a number of research prototypes. Keywords: Security, Intrusion Detection, Misuse and Anomaly Detection, Pattern Matching

    Implementation of Anomaly Based Network Intrusion Detection by Using Q-learning Technique

    A Network Intrusion Detection System (NIDS) tries to discover malicious activity such as denial-of-service attacks, port scans or even attempts to break into computers by monitoring network traffic. Data mining techniques make it possible to search large amounts of data for characteristic rules and patterns. Applied to network monitoring data recorded on a host or in a network, they can be used to detect intrusions, attacks or anomalies. We propose a machine learning method that cascades Principal Component Analysis (PCA) and Q-learning to classify anomalous and normal activities in a computer network. This paper investigates the use of PCA to reduce high-dimensional data and to improve predictive performance. On the reduced data, representing a density region of normal or anomalous instances, Q-learning strategies are applied to create agents that can adapt to unknown, complex environments. We attempted to create an agent that would learn to explore an environment and collect the malicious instances within it. We obtained interesting results where agents were able to re-adapt their learning quickly to new traffic and network information, compared to other machine learning approaches such as supervised and unsupervised learning. Keywords: Intrusion, Anomaly Detection, Data Mining, KDD Cup'99, PCA, Q-learning
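The PCA-then-Q-learning cascade described in this abstract can be sketched roughly as follows. Everything here is an illustrative assumption, not the paper's actual method: the synthetic records stand in for KDD Cup'99 features, the one-bucket state discretisation and the reward design are invented for the sketch, and the update shown is a one-step, bandit-style Q-learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for KDD Cup'99 records: "normal" traffic clustered
# near the origin, "anomalous" traffic offset, in a 10-D feature space.
normal = rng.normal(0.0, 1.0, size=(200, 10))
anomaly = rng.normal(4.0, 1.0, size=(200, 10))
X = np.vstack([normal, anomaly])
y = np.array([0] * 200 + [1] * 200)   # 0 = normal, 1 = anomaly

# --- Step 1: PCA reduces the high-dimensional data ---
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
top2 = eigvecs[:, np.argsort(eigvals)[::-1][:2]]   # two leading components
Z = Xc @ top2                                      # reduced representation

# --- Step 2: tabular Q-learning over the discretised reduced space ---
# State: sign bucket of the first principal component; actions: pass/flag.
def state_of(z):
    return 0 if z[0] < 0 else 1

Q = np.zeros((2, 2))          # Q[state, action]
alpha, eps = 0.1, 0.1
for _ in range(20):           # a few passes over the "stream"
    for z, label in zip(Z, y):
        s = state_of(z)
        a = rng.integers(2) if rng.random() < eps else int(np.argmax(Q[s]))
        reward = 1.0 if a == label else -1.0   # correct pass/flag is rewarded
        Q[s, a] += alpha * (reward - Q[s, a])  # one-step (bandit-style) update

preds = np.array([int(np.argmax(Q[state_of(z)])) for z in Z])
accuracy = (preds == y).mean()
```

Because the clusters separate cleanly along the first principal component, the agent learns the correct action for each state regardless of the eigenvector's sign orientation.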

    An Efficient Image Segmentation Approach through Enhanced Watershed Algorithm

    Image segmentation is a significant task for image analysis, sitting at the middle layer of image engineering. The purpose of segmentation is to decompose the image into parts that are meaningful with respect to a particular application. The proposed system enhances the morphological watershed method for degraded images. The proposed algorithm merges the morphological watershed result with an enhanced edge detection result obtained by pre-processing the degraded images. As a post-processing step, a color histogram algorithm is applied to each of the segmented regions, enhancing the overall performance of the watershed algorithm. Keywords – Segmentation, watershed, color histogram
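The color-histogram post-processing step mentioned above can be sketched as follows. The tiny image and label map are illustrative stand-ins for the output of a watershed segmentation, not the paper's data; only the per-region histogram idea is shown.

```python
import numpy as np

# Toy 8x8 RGB image with two hypothetical segments produced by a
# watershed step (label 1 = left half, label 2 = right half).
img = np.zeros((8, 8, 3), dtype=np.uint8)
img[:, :4] = [200, 30, 30]    # reddish region
img[:, 4:] = [30, 30, 200]    # bluish region
labels = np.ones((8, 8), dtype=int)
labels[:, 4:] = 2

def region_color_histogram(image, label_map, label, bins=4):
    """Per-channel color histogram of the pixels in one segmented region."""
    mask = label_map == label
    pixels = image[mask]                       # shape (n_pixels, 3)
    return np.stack([
        np.histogram(pixels[:, c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ])

h1 = region_color_histogram(img, labels, 1)
h2 = region_color_histogram(img, labels, 2)
```

Comparing such per-region histograms is one way adjacent over-segmented watershed regions can be merged or characterised in post-processing.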

    Improving Compressive Strength of Cement Concrete Mix by Using M-Sand and Bamboo Fiber

    In the current world, concrete has become a very important part of the construction industry, and the materials used in making concrete have evolved thanks to better quality cement and better grades of coarse aggregate. Sand is an important constituent of concrete. It is mainly procured from natural sources, so its grade is not under our control. In this experimental work, concrete cubes of M-20, M-25 and M-30 grade were cast, and different properties of the concrete, such as compressive strength and workability, were analyzed. In this study, M-sand is considered as a replacement for natural sand at 50, 70 and 90% by weight of sand in the concrete design mix, with 1% bamboo fiber added as an admixture. The study is carried out at the ages of 7 and 28 days. In this work, the general properties of fresh and hardened concrete were tested and the outcomes were analyzed. Concrete is a central material for the construction industry, but in the present era, where construction is expanding rapidly and construction rates are reaching new heights, it also negatively affects our environment. It is therefore vital to use alternative materials in concrete to minimize cost and to enhance the properties of concrete for better stability of the structures.

    Mining Frequent Item Sets in Data Streams Using the Éclat Algorithm

    Frequent pattern mining is the process of mining a set of items or patterns from a large database; the resulting frequent item sets satisfy a minimum support threshold. A frequent pattern is a pattern that occurs frequently in a dataset. Association rule mining is defined as finding association rules that satisfy the predefined minimum support and confidence in a given database; an item set is said to be frequent if it appears in at least a minimum fraction of the transactions of the database. Discovering frequent item sets plays a very important role in mining association rules, sequence rules, web log mining and many other interesting patterns among complex data. A data stream is a real-time, continuous, ordered sequence of items: an uninterrupted flow of a long sequence of data. Some real-world examples of data stream data are sensor network data, telecommunication data, transactional data and scientific surveillance systems. These sources produce trillions of updates every day, so it is very difficult to store the entire data, and mining processes that operate on the stream are required. Data mining is the non-trivial process of identifying valid, original, potentially useful and ultimately understandable patterns in data; it is an extraction of hidden predictive information from a large database. There are many algorithms for finding frequent item sets. The Apriori algorithm is the first classical algorithm used to find frequent item sets. Apart from Apriori, many similar algorithms have been proposed; they are based on pruning and candidate generation, and take considerable memory and time to find the frequent item sets. In this paper, we study how the Éclat algorithm can be used on data streams to find frequent item sets; the Éclat algorithm does not require candidate generation.
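The key idea of Éclat, as described above, is a depth-first search over a vertical tidset layout, where support comes directly from tidset intersection rather than Apriori-style candidate generation and counting passes. A minimal sketch on a toy transaction database (an illustrative stand-in for one stream window):

```python
# Toy transaction database (illustrative stand-in for a stream window).
transactions = [
    {"a", "b", "c"},
    {"a", "c"},
    {"a", "d"},
    {"b", "c"},
    {"a", "b", "c"},
]
min_support = 2

# Éclat works on the *vertical* layout: item -> set of transaction ids.
tidsets = {}
for tid, items in enumerate(transactions):
    for item in items:
        tidsets.setdefault(item, set()).add(tid)

def eclat(prefix, items, out):
    """Depth-first search; the tidset intersection gives support directly."""
    while items:
        item, tids = items.pop()
        if len(tids) >= min_support:
            out[frozenset(prefix | {item})] = len(tids)
            # Extend the prefix: intersect with every remaining item's tidset.
            suffix = [(other, tids & other_tids)
                      for other, other_tids in items]
            eclat(prefix | {item}, suffix, out)
    return out

frequent = eclat(set(), sorted(tidsets.items()), {})
# e.g. frequent[frozenset({"a"})] == 4, frequent[frozenset({"a","b","c"})] == 2
```

Because each recursive call carries its own (shrinking) tidsets, no separate counting scan over the database is needed, which is what makes the vertical approach attractive for stream windows.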

    An Interesting Case of Isolated Pancreatic Transection Following Blunt Abdominal Trauma in Emergency Department

    Introduction: Traumatic injury to the pancreas is not common, but if the diagnosis is delayed or missed in the emergency department (ED), the condition is associated with high morbidity and mortality and raises questions about the quality of emergency care. Here, we describe a rare case of blunt abdominal trauma that resulted in an isolated pancreatic injury. Case presentation: A 25-year-old male came to our emergency room (ER) in a conscious, anxious state from a nearby town with a history of roadside trauma. Further investigations revealed an isolated pancreatic injury due to the trauma, with no other major injuries; it occurred due to a sudden high-speed impact of the steering wheel against the epigastrium while the patient was driving, severely compressing the pancreas between the backbone and the steering wheel. The patient was admitted to the intensive care unit for close observation and monitoring. He was managed conservatively on intravenous fluids, antibiotics, analgesics, and vasopressors. He was discharged after five days in a hemodynamically stable and afebrile condition, on a normal diet. Conclusion: Isolated pancreatic injury following blunt abdominal trauma is rare, and the symptoms are difficult to detect early due to the organ's retroperitoneal anatomy. Early detection and early intervention are important in the ED; if left unrecognized, the injury could result in a poor outcome.

    Content Based Image Retrieval by Using Interactive Relevance Feedback Technique - A Survey

    Due to the rapid increase in storing and capturing multimedia data with digital devices, Content Based Image Retrieval (CBIR) plays a very important role in the field of image processing. Although wide-ranging studies have been done in the field of CBIR, image retrieval from multimedia databases is still a complicated and open problem. This paper provides a review of CBIR based on some well-known techniques such as Interactive Genetic Algorithms, Relevance Feedback (RF), Neural Networks and so on. Relevance Feedback can effectively enhance the ability of CBIR by reducing the semantic gap between low-level and high-level features. Interactivity in CBIR can also be achieved with the help of Genetic Algorithms. The GA is a branch of evolutionary computation which makes the retrieval process more interactive, so that the user can get improved results from the database by comparing candidates to the query image through its evaluation. The results of traditional implicit feedback can also be improved by neuro-fuzzy-logic-based implicit feedback. This paper covers the aspects of Relevance Feedback (RF), Interactive Genetic Algorithms and Neural Networks in Content Based Image Retrieval, various RF techniques and applications of CBIR. DOI: 10.17762/ijritcc2321-8169.15075
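The way relevance feedback narrows the semantic gap can be illustrated with a classic Rocchio-style query update over hypothetical low-level feature vectors. The database, weights and feedback labels below are all illustrative assumptions for the sketch, not the actual method of any surveyed system.

```python
import numpy as np

# Hypothetical low-level feature vectors for a tiny image database
# (e.g. color/texture descriptors); all values are illustrative.
db = np.array([
    [0.9, 0.1, 0.0],   # image 0
    [0.8, 0.2, 0.1],   # image 1
    [0.1, 0.9, 0.8],   # image 2
    [0.0, 0.8, 0.9],   # image 3
])
query = np.array([0.5, 0.5, 0.5])

def rank(q):
    """Order database images by Euclidean distance to the query."""
    d = np.linalg.norm(db - q, axis=1)
    return np.argsort(d)

# Round 1: retrieve, then the user marks images 0 and 1 as relevant.
first = rank(query)
relevant, non_relevant = db[[0, 1]], db[[2, 3]]

# Rocchio update: move the query toward the relevant examples and away
# from the non-relevant ones (alpha/beta/gamma weights are conventional).
alpha, beta, gamma = 1.0, 0.75, 0.25
query2 = (alpha * query
          + beta * relevant.mean(axis=0)
          - gamma * non_relevant.mean(axis=0))
second = rank(query2)   # the relevant images now rank first
```

After one feedback round the updated query sits much closer to the user-marked images in feature space, which is exactly the gap-reduction effect the survey attributes to RF.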

    An Efficient Identity-Based Multi-receiver Signcryption Scheme Using ECC

    Signcryption is a technique for performing signature and encryption in a single logical step. It is a secure and efficient way of providing security between the sender and the receiver, so that the data sent by the sender is protected from various types of attacks such as desynchronization attacks, identity disclosure attacks and spoofing attacks. Although many techniques have been implemented for the generation of signatures and encryption, here a new and efficient signcryption technique is implemented in a multi-receiver environment on the basis of the identity of the receiver. The proposed work is an implementation of a signcryption scheme using elliptic curve cryptography, where the authentication between the sender and the receiver is based on the identity of the receiver.
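Signcryption schemes over elliptic curves build on scalar multiplication and on a shared curve point that both parties can derive; it seeds both the encryption key and the signature component. The full identity-based multi-receiver scheme is beyond a short sketch, but that underlying primitive can be illustrated over a toy curve. All parameters below are illustrative and offer no real security; they are not the paper's scheme.

```python
# Toy elliptic curve y^2 = x^3 + a*x + b over GF(p); tiny parameters
# chosen for illustration only -- never use such a curve in practice.
p, a, b = 97, 2, 3
G = (3, 6)  # a point on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)

def add(P, Q):
    """Elliptic-curve point addition in affine coordinates, mod p."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                  # point at infinity
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication k*P by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

# Sender and receiver key pairs (private scalar, public point).
d_s, d_r = 7, 13
Q_s, Q_r = mul(d_s, G), mul(d_r, G)

# Each side computes the same shared point: d_s*(d_r*G) == d_r*(d_s*G).
shared_sender = mul(d_s, Q_r)
shared_receiver = mul(d_r, Q_s)
```

In a real scheme the shared point would be hashed into a symmetric key and combined with an identity-derived component; here only the elliptic-curve arithmetic is shown.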