
    Estimating the feasibility of transition paths in extended finite state machines

    There has been significant interest in automating testing on the basis of an extended finite state machine (EFSM) model of the required behaviour of the implementation under test (IUT). Many test criteria require that certain parts of the EFSM are executed. For example, we may want to execute every transition of the EFSM. In order to find a test suite (set of input sequences) that achieves this, we might first derive a set of paths through the EFSM that satisfy the criterion using, for example, algorithms from graph theory. We then attempt to produce input sequences that trigger these paths. Unfortunately, however, the EFSM might have infeasible paths, and the problem of determining whether a path is feasible is generally undecidable. This paper describes an approach in which a fitness function is used to estimate how easy it is to find an input sequence that triggers a given path through an EFSM. Such a fitness function could be used in a search-based approach in which we search for a path with good fitness that achieves a test objective, such as executing a particular transition, and then search for an input sequence that triggers the path. If this second search fails, we search for another path with good fitness and repeat the process. We give a computationally inexpensive approach (fitness function) that estimates the feasibility of a path. To evaluate this fitness function, we compared the fitness of a path with the ease with which an input sequence that triggers it can be produced using search, and we used random sampling to estimate the latter. The empirical evidence suggests that a reasonably good correlation (0.72 and 0.62) exists between the fitness of a path, produced using the proposed fitness function, and an estimate of the ease with which we can randomly generate an input sequence to trigger the path.
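    As a rough illustration of the idea only (not the paper's actual fitness function), the sketch below ranks candidate paths through a toy EFSM by summing a hand-assigned difficulty penalty for each transition guard; the machine, guards, penalties, and candidate paths are all invented for demonstration.

        # Hypothetical sketch: rank candidate EFSM paths by how hard their
        # guards look to satisfy; a lower score suggests an easier path.
        # The machine, guards, and per-guard penalties are invented.

        # transition id -> (source state, target state, guard on input x, penalty)
        TRANSITIONS = {
            "t1": ("s0", "s1", lambda x: x > 0,      1.0),   # half of all inputs pass
            "t2": ("s1", "s2", lambda x: x == 42,   10.0),   # a single value passes
            "t3": ("s1", "s2", lambda x: x % 2 == 0, 1.0),   # half of all inputs pass
        }

        def path_fitness(path):
            """Sum the guard penalties along the path (lower is fitter)."""
            return sum(TRANSITIONS[t][3] for t in path)

        candidates = [["t1", "t2"], ["t1", "t3"]]
        best = min(candidates, key=path_fitness)
        print(best, path_fitness(best))   # ['t1', 't3'] 2.0

    A search-based tool in the spirit of the paper would then try to generate an input sequence for the best-ranked path first, falling back to the next candidate if that search fails.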

    Combining Clustering techniques and Formal Concept Analysis to characterize Interestingness Measures

    Formal Concept Analysis (FCA) is a data analysis method which enables the discovery of hidden knowledge in data. One kind of hidden knowledge extracted from data is association rules. Various quality measures have been reported in the literature for extracting only the relevant association rules. Given a dataset, choosing a good quality measure remains a challenging task for the user. Given an evaluation matrix of quality measures against semantic properties, this paper describes how FCA can highlight quality measures with similar behavior in order to assist the user in this choice. The aim of this article is the discovery of clusters of interestingness measures (IM), and the validation of the clusters found by the hierarchical (AHC) and partitioning (k-means) clustering methods. Then, based on a theoretical study of sixty-one interestingness measures against nineteen properties, proposed in a recent study, FCA describes several groups of measures. Comment: 13 pages, 2 figures.
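    As a toy illustration of the grouping step only (the measures, properties, and incidence relation below are invented, not the sixty-one-measure study), one can cluster measures that satisfy exactly the same properties, i.e. that share the extent of a common formal concept:

        # Illustrative sketch: group interestingness measures by shared
        # semantic properties in a toy formal context. The context below
        # is made up for demonstration purposes.
        context = {                       # measure -> set of properties it satisfies
            "lift":       {"symmetric", "null_invariant"},
            "confidence": {"asymmetric"},
            "jaccard":    {"symmetric", "null_invariant"},
            "conviction": {"asymmetric"},
        }

        # Measures with identical property sets fall into the same cluster.
        clusters = {}
        for measure, props in context.items():
            clusters.setdefault(frozenset(props), []).append(measure)

        for props, measures in clusters.items():
            print(sorted(measures), "<->", sorted(props))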

    A Concurrent Perspective on Smart Contracts

    In this paper, we explore remarkable similarities between the multi-transactional behaviors of smart contracts in cryptocurrencies such as Ethereum and classical problems of shared-memory concurrency. We examine two real-world examples from the Ethereum blockchain and analyze how they are vulnerable to bugs closely reminiscent of those that often occur in traditional concurrent programs. We then elaborate on the relation between observable contract behaviors and well-studied concurrency topics, such as atomicity, interference, synchronization, and resource ownership. The described contracts-as-concurrent-objects analogy provides a deeper understanding of potential threats to smart contracts, indicates better engineering practices, and enables applications of existing state-of-the-art formal verification techniques. Comment: 15 pages.
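    The flavor of such a bug can be sketched outside the blockchain entirely: the snippet below simulates, in plain Python, a non-atomic check-then-act on shared contract state, where an external call interleaves between the check and the update. The Wallet class and call sequence are invented for illustration and stand in for no particular Ethereum contract.

        # Hedged illustration of check-then-act interference, simulated in
        # plain Python (not Solidity/EVM). The example is invented.
        class Wallet:
            def __init__(self, balance):
                self.balance = balance

            def withdraw(self, amount, callback=None):
                if self.balance >= amount:       # check
                    if callback:                 # external call BEFORE the update:
                        callback()               # another "transaction" interleaves
                    self.balance -= amount       # act -- the check is now stale

        w = Wallet(100)
        # The callback re-enters withdraw while the first check has passed.
        w.withdraw(100, callback=lambda: w.withdraw(100))
        print(w.balance)   # -100: the non-atomic check/act let both withdrawals through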

    Investigating people: a qualitative analysis of the search behaviours of open-source intelligence analysts

    The Internet and the World Wide Web have become integral parts of the lives of many modern individuals, enabling almost instantaneous communication, sharing and broadcasting of thoughts, feelings and opinions. Much of this information is publicly facing, and as such it can be utilised in a multitude of online investigations, ranging from employee vetting and credit checking to counter-terrorism and fraud prevention/detection. However, the search needs and behaviours of these investigators are not well documented in the literature. In order to address this gap, an in-depth qualitative study was carried out in cooperation with a leading investigation company. The research contribution is an initial identification of the search behaviours of Open-Source Intelligence investigators and the procedures and practices they undertake, along with an overview of the difficulties and challenges that they encounter in their domain. This lays the foundation for future research into the varied domain of Open-Source Intelligence gathering.

    MetTeL: A Generic Tableau Prover.


    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, requiring a new protocol to be built from scratch for every new need. This led to an unwieldy, ossified Internet architecture resistant to any attempt at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work on clean-slate Internet design, especially the software-defined networking (SDN) paradigm, offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial of the formidable amount of work that has been done in formal methods, and a survey of its applications to networking. Comment: 30 pages, submitted to IEEE Communications Surveys and Tutorials.

    ANCHOR: logically-centralized security for Software-Defined Networks

    While the centralization of SDN brought advantages such as a faster pace of innovation, it also disrupted some of the natural defenses of traditional architectures against different threats. The literature on SDN has mostly been concerned with the functional side, apart from some specific works on non-functional properties such as security or dependability. Though addressing the latter in an ad-hoc, piecemeal way may work, it will most likely lead to efficiency and effectiveness problems. We claim that the enforcement of non-functional properties as a pillar of SDN robustness calls for a systemic approach. As a general concept, we propose ANCHOR, a subsystem architecture that promotes the logical centralization of non-functional properties. To show the effectiveness of the concept, we focus on security in this paper: we identify the current security gaps in SDNs and populate the architecture middleware with the appropriate security mechanisms, in a global and consistent manner. Essential security mechanisms provided by ANCHOR include reliable entropy and resilient pseudo-random generators, and protocols for secure registration and association of SDN devices. We claim and justify in the paper that centralizing such mechanisms is key to their effectiveness, by allowing us to: define and enforce global policies for those properties; reduce the complexity of controllers and forwarding devices; ensure higher levels of robustness for critical services; foster interoperability of the non-functional property enforcement mechanisms; and promote the security and resilience of the architecture itself. We discuss design and implementation aspects, and we prove and evaluate our algorithms and mechanisms, including the formalization of the main protocols and the verification of their core security properties using the Tamarin prover. Comment: 42 pages, 4 figures, 3 tables, 5 algorithms, 139 references.
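    As an illustration of the resilient-generator principle only (this is not ANCHOR's actual construction), the sketch below mixes several entropy sources through a hash chain, so the output remains unpredictable as long as at least one source is sound:

        # Minimal sketch of a resilience-by-mixing generator; the sources
        # and chaining scheme are illustrative assumptions, not ANCHOR's design.
        import hashlib, os, time

        def gather_sources():
            return [
                os.urandom(32),                          # OS entropy pool
                str(time.perf_counter_ns()).encode(),    # high-resolution timer jitter
            ]

        def resilient_bytes(n, state=b""):
            # Hash all sources together with the running state; a single
            # sound source keeps the digest unpredictable if the others fail.
            h = hashlib.sha256(state)
            for src in gather_sources():
                h.update(src)
            digest = h.digest()
            return digest[:n], digest    # output bytes and the new chain state

        out, state = resilient_bytes(16)
        print(out.hex())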

    Integrated testing and verification system for research flight software design document

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced life-cycle costs as the foremost goals. Capabilities have been included in the design for static detection of data-flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
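    One classic static check in this spirit, sketched below on invented input, is to flag potential deadlock as a cycle in the lock-order graph derived from the processes' acquisition sequences (the processes and locks here are hypothetical, not MUST's actual analysis):

        # Illustrative sketch: an edge A -> B means "some process acquires
        # lock A, then B while still holding A". A cycle signals a possible
        # deadlock. The two processes below are invented.
        lock_order = {
            "P1": [("L1", "L2")],   # process 1: take L1, then L2
            "P2": [("L2", "L1")],   # process 2: take L2, then L1 (reversed!)
        }

        graph = {}
        for edges in lock_order.values():
            for a, b in edges:
                graph.setdefault(a, set()).add(b)

        def has_cycle(graph):
            """Depth-first search for a back edge in the lock-order graph."""
            WHITE, GREY, BLACK = 0, 1, 2
            color = {n: WHITE for n in graph}
            def visit(n):
                color[n] = GREY
                for m in graph.get(n, ()):
                    if color.get(m, WHITE) == GREY:    # back edge: cycle found
                        return True
                    if color.get(m, WHITE) == WHITE and visit(m):
                        return True
                color[n] = BLACK
                return False
            return any(visit(n) for n in graph if color[n] == WHITE)

        print("potential deadlock:", has_cycle(graph))   # True: L1 -> L2 -> L1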