
    Checking experiments for stream X-machines

    Stream X-machines are a state-based formalism with an associated development process in which a system is built from trusted components. Testing therefore essentially checks that these components have been combined correctly and that the orders in which they can occur are consistent with the specification. Importantly, there are test generation methods that return a checking experiment: a test that is guaranteed to determine correctness as long as the implementation under test (IUT) is functionally equivalent to an unknown element of a given fault domain Ψ. Previous work has shown how three methods for generating checking experiments from a finite state machine (FSM) can be adapted to testing from a stream X-machine. However, there are many other methods for generating checking experiments from an FSM, and these have a variety of benefits that correspond to different testing scenarios. This paper shows how any method for generating a checking experiment from an FSM can be adapted to generate a checking experiment for testing an implementation against a stream X-machine. This holds whether we are testing to check that the IUT is functionally equivalent to a specification or testing to check that every trace (input/output sequence) of the IUT is also a trace of a nondeterministic specification. Interestingly, it holds even if the fault domain Ψ used is not the one traditionally associated with testing from a stream X-machine. The results apply to both deterministic and nondeterministic implementations.
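
    The following Python sketch is only an illustration of the idea of a checking experiment, not the paper's construction; the machine structure, the run semantics, and the pass criterion are simplified assumptions. Here the IUT passes if it reproduces the specification's output trace on every test sequence in the experiment.

        # Minimal sketch, assuming a simplified deterministic machine: transitions are
        # keyed by (state, input) and carry a processing function acting on a memory.
        class StreamXMachine:
            def __init__(self, init_state, init_memory, arcs):
                # arcs: (state, input) -> (processing_fn, next_state), where
                # processing_fn: (memory, input) -> (output, new_memory)
                self.init_state, self.init_memory, self.arcs = init_state, init_memory, arcs

            def trace(self, inputs):
                """Run the machine on an input sequence; return the observed output trace."""
                state, memory, outputs = self.init_state, self.init_memory, []
                for x in inputs:
                    if (state, x) not in self.arcs:      # undefined input: observable failure
                        outputs.append(None)
                        continue
                    fn, state = self.arcs[(state, x)]
                    out, memory = fn(memory, x)
                    outputs.append(out)
                return tuple(outputs)

        def passes_checking_experiment(spec, iut, experiment):
            """The IUT passes if every test sequence reproduces the spec's output trace."""
            return all(spec.trace(seq) == iut.trace(seq) for seq in experiment)

        # Example: a one-state counter whose processing function adds the input to memory.
        add = lambda m, x: (m + x, m + x)
        spec = StreamXMachine("s0", 0, {("s0", 1): (add, "s0"), ("s0", 2): (add, "s0")})
        print(passes_checking_experiment(spec, spec, [(1, 2, 1), (2, 2)]))   # True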

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of inferring misuse by correlating individual, temporally distributed events within a multiple data stream environment is explored, and a range of techniques is reviewed, covering model-based approaches, 'programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base and the inability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and uses this to screen events; this approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to 'learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred; this approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems these mechanisms even update each other to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule- or state-based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems so that learning, generalisation and adaptation are more readily facilitated.
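
    To make the hybrid idea concrete, here is a small Python sketch (the event fields, rules and threshold are invented for illustration and are not taken from the report): a programmed rule screen catches known misuses, while a learned frequency profile of normal event types flags unseen behaviour.

        # Illustrative hybrid detector: signature rules for known misuse, plus a
        # simple frequency profile of normal event types for anomalies.
        from collections import Counter

        class HybridDetector:
            def __init__(self, signature_rules, anomaly_threshold=0.01):
                # signature_rules: list of predicates over an event dict -> bool
                self.signature_rules = signature_rules
                self.threshold = anomaly_threshold
                self.profile = Counter()
                self.total = 0

            def train(self, normal_events):
                """Learn how often each event type appears in normal traffic."""
                for e in normal_events:
                    self.profile[e["type"]] += 1
                    self.total += 1

            def classify(self, event):
                if any(rule(event) for rule in self.signature_rules):
                    return "known-misuse"      # programmed knowledge: low false positives
                seen = self.profile[event["type"]] / self.total if self.total else 0.0
                if seen < self.threshold:
                    return "anomaly"           # learned profile: may catch unseen misuse
                return "normal"

        # Usage: one rule for a known misuse plus a profile of normal event types.
        rules = [lambda e: e["type"] == "login" and e.get("failures", 0) > 5]
        det = HybridDetector(rules)
        det.train([{"type": "login"}, {"type": "call"}, {"type": "call"}])
        print(det.classify({"type": "login", "failures": 7}))   # known-misuse
        print(det.classify({"type": "reroute"}))                # anomaly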

    TROUBLE 3: A fault diagnostic expert system for Space Station Freedom's power system

    Designing Space Station Freedom has given NASA many opportunities to develop expert systems that automate onboard operations of space-based systems. One such development, TROUBLE 3, an expert system designed to automate fault diagnostics of Space Station Freedom's electric power system, is described. TROUBLE 3's design is complicated by the fact that Space Station Freedom's power system is evolving and changing, so TROUBLE 3 has to be flexible enough to handle changes with minimal changes to the program. Three types of expert systems were studied: rule-based, set-covering, and model-based. A set-covering approach was selected for TROUBLE 3 because it offered the flexibility that was missing from the other approaches. With this flexibility, TROUBLE 3 is not limited to Space Station Freedom applications; it can easily be adapted to handle any diagnostic system.
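
    As a rough illustration of the set-covering style of diagnosis (the faults, symptoms and table below are invented and not taken from TROUBLE 3), a diagnosis is a smallest set of candidate faults whose known symptoms cover everything observed; adapting to a changed power system then mostly means editing the fault/symptom table.

        # Minimal set-covering diagnosis sketch: each fault explains a set of symptoms,
        # and a diagnosis is a minimum-size fault set covering the observed symptoms.
        from itertools import combinations

        FAULT_SYMPTOMS = {                      # fault -> symptoms it can explain
            "battery_cell_short": {"low_bus_voltage", "high_cell_temp"},
            "converter_failure":  {"low_bus_voltage", "no_load_current"},
            "sensor_drift":       {"high_cell_temp"},
        }

        def diagnoses(observed):
            """Return all minimum-cardinality fault sets whose symptoms cover `observed`."""
            faults = list(FAULT_SYMPTOMS)
            for size in range(1, len(faults) + 1):
                found = [set(c) for c in combinations(faults, size)
                         if observed <= set().union(*(FAULT_SYMPTOMS[f] for f in c))]
                if found:
                    return found
            return []

        print(diagnoses({"low_bus_voltage", "high_cell_temp"}))
        # -> [{'battery_cell_short'}] : one fault already covers both observed symptoms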

    Testing conformance of a deterministic implementation against a non-deterministic stream X-machine

    Stream X-machines are a formalisation of extended finite state machines that have been used to specify systems. One of the great benefits of using stream X-machines for specification is the associated test generation technique, which produces a test that is guaranteed to determine correctness under certain design-for-test conditions. This test generation algorithm has recently been extended to the case where the specification is non-deterministic. However, the algorithms for testing from a non-deterministic stream X-machine currently have limitations: either they test for equivalence rather than conformance, or they restrict the source of non-determinism allowed in the specification. This paper introduces a new test generation algorithm that overcomes both of these limitations for situations where the implementation is known to be deterministic.
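
    A minimal Python sketch of the conformance relation being tested (an assumption-level illustration, not the paper's algorithm): the deterministic IUT conforms on a finite test suite if each input/output trace it produces is one that the non-deterministic specification allows.

        # Trace-inclusion check: does the spec allow the IUT's output for each test input?
        def spec_allows(spec_step, inputs, outputs, state):
            """spec_step(state, x) -> set of (output, next_state) pairs (non-deterministic)."""
            if not inputs:
                return True
            x, y = inputs[0], outputs[0]
            return any(spec_allows(spec_step, inputs[1:], outputs[1:], s2)
                       for (out, s2) in spec_step(state, x) if out == y)

        def conforms(spec_step, spec_init, iut_run, test_suite):
            """Deterministic IUT conforms on the suite if each of its traces is allowed."""
            return all(spec_allows(spec_step, list(seq), list(iut_run(seq)), spec_init)
                       for seq in test_suite)

        # Example: spec non-deterministically answers a or b to input 1; IUT always answers a.
        spec = lambda state, x: {("a", 0), ("b", 0)} if x == 1 else {("ok", 0)}
        iut  = lambda seq: ["a" if x == 1 else "ok" for x in seq]
        print(conforms(spec, 0, iut, [(1, 2), (1, 1)]))   # True: every IUT trace is allowed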

    Network Localization by Shadow Edges

    Localization is a fundamental task for sensor networks. Traditional network construction approaches obtain localized networks by requiring the nodes to be at least tri-connected (in 2D), i.e., the communication graph needs to be globally rigid. In this paper we exploit, besides the information on the neighbors sensed by each robot/sensor, the information about the lack of communication among nodes. The result is a framework where the nodes are only required to be bi-connected and the communication graph only has to be rigid. This is made possible by a novel type of link, namely shadow edges, which account for the lack of communication among nodes and reduce the uncertainty associated with the positions of the nodes. Comment: preprint submitted to the 2013 European Control Conference, July 17-19, 2013, Zurich, Switzerland.
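
    The pruning effect of a shadow edge can be illustrated with a small Python sketch (the radius, anchors and grid below are invented): hearing an anchor bounds the node's distance by the communication radius, while a shadow edge to an anchor excludes the disc around it, and intersecting both kinds of constraint shrinks the set of positions consistent with the measurements.

        # Shadow-edge constraint pruning over a discretised plane.
        import itertools, math

        R = 1.0                                   # assumed communication radius
        anchors    = {"A": (0.0, 0.0), "B": (1.5, 0.0)}
        neighbours = {"A"}                        # node hears A ...
        shadows    = {"B"}                        # ... but not B (shadow edge)

        def consistent(p):
            d = lambda q: math.dist(p, q)
            return (all(d(anchors[a]) <= R for a in neighbours) and
                    all(d(anchors[a]) >  R for a in shadows))

        # Keep only grid positions satisfying every regular and shadow constraint.
        grid = [(x / 10, y / 10) for x, y in itertools.product(range(-20, 21), repeat=2)]
        feasible = [p for p in grid if consistent(p)]
        print(len(grid), "->", len(feasible), "candidate positions after shadow-edge pruning")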