    Problems faced in communication set-up of Coordinator with GUI and Dispatcher in NCTUns network simulator

    Distributed emulation can be carried out between real-world applications and a simulator. NCTUns-6.0 is a simulator based on the kernel re-entering and distributed emulation simulation methodology; with this methodology, a more realistic network model can be simulated for real-time applications. The Coordinator is a program that sets up communication between the GUI and the Dispatcher in NCTUns-6.0, a network simulator running on Fedora-13. In this paper we describe how the problems faced in the communication set-up of the Coordinator with the GUI and Dispatcher were overcome, and how to execute simulations in real time. The paper also deals with KDE, SELinux, iptables, environment variables and the X-server for the NCTUns-6.0 simulation engine. Keywords: Coordinator, Dispatcher, nctunsclient, NCTUns-6.0, Fedora-12, SELinux, iptables
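    The SELinux, iptables, environment-variable and X-server issues mentioned above are typically worked around before launching the simulator. Below is a minimal pre-flight sketch, assuming a default /usr/local/nctuns install prefix; the exact paths and variable names depend on the local installation and are not taken from the paper.

```shell
# Hedged sketch of pre-flight steps on Fedora before starting NCTUns-6.0.
# The install prefix /usr/local/nctuns is an assumption.

# 1. SELinux can block the simulation engine; switch it to permissive mode.
sudo setenforce 0

# 2. iptables rules can drop GUI<->Dispatcher traffic; flush them for testing.
sudo iptables -F

# 3. Export the environment variables the engine expects.
export NCTUNSHOME=/usr/local/nctuns
export NCTUNS_TOOLS=$NCTUNSHOME/tools
export NCTUNS_BIN=$NCTUNSHOME/bin
export PATH=$NCTUNS_BIN:$PATH

# 4. Allow local connections to the X-server so the GUI can display.
xhost +local:

# 5. Start the three components in order: Dispatcher, Coordinator, then GUI.
$NCTUNS_BIN/dispatcher &
$NCTUNS_BIN/coordinator &
$NCTUNS_BIN/nctunsclient &
```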

    Free-libre open source software as a public policy choice

    Free/Libre Open Source Software (FLOSS) is characterised by a specific programming and development paradigm. The availability and freedom of use of source code are at the core of this paradigm and are the prerequisites for FLOSS features. Unfortunately, the fundamental role of code is often ignored among those who decide on software purchases for Canadian public agencies. Source code availability and the connected freedoms are often seen as unrelated and accidental aspects, and the only real advantage acknowledged, the absence of royalty fees, becomes paramount. In this paper we discuss some relevant legal issues and explain why public administrations should choose FLOSS for their technological infrastructure. We also present the results of a survey regarding the penetration of and awareness about FLOSS usage within the Government of Canada. The data demonstrate that the Government of Canada has no enforced policy regarding the implementation of a specific technological framework (which has legal, economic, business, and ethical repercussions) in its departments and agencies.

    Design of an advanced intelligent instrument with waveform recognition based on the ITMS platform

    Searching for similar behavior in previous data plays a key role in fusion research, but can be quite challenging to implement from a practical point of view. This paper describes the design of an intelligent measurement instrument that uses similar waveform recognition systems (SWRS) to extract knowledge from the signals it acquires. The system is conceived as an Ethernet measurement instrument that makes it possible to acquire several waveforms simultaneously and to identify similar behaviors by searching previous data using distributed SWRS. The implementation is another example of the advantages that local processing capabilities can provide in data acquisition applications.
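    A search for similar waveforms of the kind described can be sketched as a nearest-neighbour lookup using the correlation of z-normalized signals as the similarity measure. This is a minimal illustration of the general idea, not the paper's actual SWRS algorithm:

```python
import math

def znorm(x):
    """Z-normalize a waveform so similarity ignores scale and offset."""
    m = sum(x) / len(x)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / len(x))
    return [(v - m) / s if s > 0 else (v - m) for v in x]

def most_similar(query, archive):
    """Return the index of the archived waveform most similar to `query`
    (Pearson correlation of equal-length, z-normalized signals)."""
    q = znorm(query)
    def score(w):
        zw = znorm(w)
        return sum(a * b for a, b in zip(q, zw)) / len(q)
    return max(range(len(archive)), key=lambda i: score(archive[i]))

# Tiny demo: the query matches the scaled-and-offset copy of itself.
t = [2 * math.pi * i / 100 for i in range(100)]
query = [math.sin(v) for v in t]
archive = [
    [math.cos(v) for v in t],          # different shape
    [3 * math.sin(v) + 5 for v in t],  # same shape, scaled and offset
    [1.0] * 100,                       # flat signal
]
print(most_similar(query, archive))  # 1: scaling and offset do not matter
```

    Z-normalizing before comparing means two discharges with the same waveform shape but different amplitudes or baselines are still recognized as similar.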

    A traffic classification method using machine learning algorithm

    Applying concepts of attack investigation from the IT industry, this work designs a traffic classification method using data mining techniques together with a machine learning algorithm to classify normal and malicious traffic. This classification will help in learning about the unknown attacks faced by the IT industry. The notion of traffic classification is not a new concept; plenty of work has been done to classify network traffic for today's heterogeneous applications. Existing techniques (payload-based, port-based and statistics-based) have their own pros and cons, which are discussed later in this work, but classification using machine learning techniques is still an open field to explore and has provided very promising results up to now.
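    As a minimal illustration of the machine-learning approach (not the paper's actual method), the sketch below reduces each flow to three assumed statistical features (mean packet size, packets per second, distinct destination ports) and labels unseen flows with a 1-nearest-neighbour rule over hand-made training flows:

```python
import math

def distance(a, b):
    """Euclidean distance between two flow feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(flow, training):
    """Label an unseen flow with the label of its nearest training flow."""
    nearest = min(training, key=lambda item: distance(flow, item[0]))
    return nearest[1]

# Synthetic training data: (features, label). Feature choice is illustrative.
training = [
    ((512.0, 20.0, 2.0), "normal"),      # web-browsing-like flow
    ((1400.0, 80.0, 1.0), "normal"),     # bulk-download-like flow
    ((64.0, 900.0, 200.0), "malicious"), # scan/flood-like flow: tiny packets,
                                         # very high rate, many ports
]

print(classify((70.0, 850.0, 150.0), training))  # malicious
print(classify((600.0, 25.0, 3.0), training))    # normal
```

    A real classifier would use many more features and a trained model (e.g. a decision tree or SVM), but the pipeline shape, flow features in, label out, is the same.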

    Virtualization Components of the Modern Hypervisor

    Virtualization is the foundation on which cloud services build their business. It supports the infrastructure of the largest companies around the globe and is a key component for scaling software for the ever-growing technology industry. If companies decide to use virtualization as part of their infrastructure, it is important for them to have a quick and reliable way to choose a virtualization technology and to tweak its performance to fit their intended usage. Unfortunately, while many papers exist discussing and testing the performance of various virtualization systems, most of these performance tests do not take into account components that can be configured to improve performance in certain scenarios. This study compares how three hypervisors (VMware vSphere, Citrix XenServer, and KVM) perform under different sets of configurations, and identifies which system workloads are ideal for these configurations. The study also provides a means by which to compare different configurations with each other, so that implementers of these technologies can make informed decisions about which components should be enabled for their current or future systems.

    The Advanced Framework for Evaluating Remote Agents (AFERA): A Framework for Digital Forensic Practitioners

    Digital forensics experts need a dependable method for evaluating evidence-gathering tools. Limited research and resources challenge this process, and the lack of multi-endpoint data validation hinders reliability in distributed digital forensics. A framework was designed to evaluate distributed agent-based forensic tools while enabling practitioners to self-evaluate and demonstrate evidence reliability as required by the courts. Grounded in Design Science, the framework features guidelines, data, criteria, and checklists. Expert review enhances its quality and practicality.

    Minimization of DDoS false alarm rate in Network Security; Refining fusion through correlation

    Intrusion detection systems are designed to monitor a network environment and generate alerts whenever abnormal activities are detected. However, the number of these alerts can be very large, making their evaluation a difficult task for a security analyst. Alert management techniques reduce alert volume significantly and can potentially improve the detection performance of an intrusion detection system. This thesis presents a framework to improve the effectiveness and efficiency of an intrusion detection system by significantly reducing false positive alerts and increasing the ability to spot an actual intrusion for distributed denial-of-service attacks. The proposed sensor fusion technique addresses the issues relating to the optimality of decision-making in a multiple-sensor framework. The fusion process combines belief through the Dempster-Shafer rule of combination, associating a belief with each type of alert and combining them using Subjective Logic based on Jøsang's theory. Moreover, the reliability factor of any intrusion detection system is also addressed in order to minimize the chance of a false diagnosis of the final network state. A considerable number of simulations were conducted in order to determine the optimal performance of the proposed prototype.
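    The Dempster-Shafer combination step can be illustrated with a small sketch over the two-state frame {attack, normal}. The mass assignments below are illustrative values, not taken from the thesis:

```python
def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over the frame of discernment."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to contradictory states
    k = 1.0 - conflict               # normalize by the non-conflicting mass
    return {s: v / k for s, v in combined.items()}

ATTACK = frozenset({"attack"})
NORMAL = frozenset({"normal"})
THETA = ATTACK | NORMAL              # total uncertainty

# Two IDS sensors, each fairly confident an attack is under way.
m1 = {ATTACK: 0.6, NORMAL: 0.1, THETA: 0.3}
m2 = {ATTACK: 0.7, NORMAL: 0.1, THETA: 0.2}

fused = combine(m1, m2)
print(round(fused[ATTACK], 3))  # 0.862: agreement strengthens the belief
```

    Two moderately confident sensors fuse into a stronger belief in "attack" than either held alone, which is the effect the framework exploits to cut false positives.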