
    A unified model for protocol test suite design

    This thesis is concerned with developing new algorithms for solving some basic problems of conformance testing. In particular, the following problems are considered: (i) generation of test cases from Language Of Temporal Ordering Specification (LOTOS) and Specification and Description Language (SDL) specifications, (ii) selection of test cases that meet certain data flow coverage criteria, and (iii) representation of test cases for the Local Single-layer (LS) and Remote Single-layer (RS) architectures. The algorithms presented in this thesis solve these fundamental problems of conformance testing efficiently. They rely heavily on two concepts: the Extended Finite State Machine (EFSM) chart and the Input/Output (I/O) diagram. We introduce a unified model, based on the EFSM chart and the I/O diagram, for existing protocol specification languages, and on top of this model we propose a conceptually simple, easy-to-implement and computationally efficient methodology for studying conformance testing. The protocol specification is mapped into an EFSM chart, and the structure of the input/output data is modeled by hierarchical diagrams called I/O diagrams. Test cases are generated from the EFSM chart. Furthermore, a data flow graph is constructed from the chart and used to identify the protocol functions for testing the data flow aspects of an Implementation Under Test (IUT). The zero-one integer programming technique is used to select test cases that meet the data flow coverage requirement. The selected test cases are modeled as a dependency graph and then evaluated by taking predicate slices from the test case dependency graph. Predicate slices are used to identify infeasible test cases that must be eliminated. Redundant assignments and predicates in the feasible test cases are removed by reducing the test cases; this reduction uses the test case dependency graph as well as the data flow graph. The reduced test case dependency graph is adapted for the LS and RS architectures. The tester's behaviour in each test case is obtained by a series of transformations called representation and selection. Test case representation refers to inverting the direction of events and generating base and dynamic constraints on the events; these constraints are generated in the form of an I/O diagram. Test case selection refers to assigning a test purpose according to the hierarchy of test cases in a test suite and then completing the tester's behaviour by assigning verdict and parameter value information.
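    The zero-one integer programming step can be illustrated with a minimal sketch. The Python snippet below (hypothetical test-case and def-use-pair names; not the thesis's actual algorithm) formulates test case selection as a 0-1 program (minimise the number of selected test cases subject to every def-use pair from the data flow graph being covered) and solves a toy instance by exhaustive enumeration of the binary decision variables.

```python
from itertools import product

# Hypothetical toy instance: each test case (a path through the EFSM chart)
# covers a set of def-use pairs taken from the data flow graph.
coverage = {
    "tc1": {"du1", "du2"},
    "tc2": {"du2", "du3"},
    "tc3": {"du1", "du3", "du4"},
    "tc4": {"du4"},
}
all_pairs = set().union(*coverage.values())

def select_min_cover(coverage, required):
    """Exact 0-1 formulation solved by enumeration: x_i in {0, 1} records
    whether test case i is selected; minimise the number of selected cases
    subject to every required def-use pair being covered at least once."""
    names = sorted(coverage)
    best = None
    for bits in product((0, 1), repeat=len(names)):
        chosen = [n for n, b in zip(names, bits) if b]
        covered = set().union(*(coverage[n] for n in chosen)) if chosen else set()
        if required <= covered and (best is None or len(chosen) < len(best)):
            best = chosen
    return best

print(select_min_cover(coverage, all_pairs))  # one minimal cover, e.g. ['tc2', 'tc3']
```

    A real instance would hand the same 0-1 formulation to an integer programming solver rather than enumerating assignments, but the constraint structure is the same.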

    Enabling Proactive Adaptation through Just-in-time Testing of Conversational Services

    Service-based applications (SBAs) will increasingly be composed of third-party services available over the Internet. Reacting to failures of those third-party services by dynamically adapting the SBAs will become a key enabler for ensuring reliability. Determining when to adapt an SBA is especially challenging in the presence of conversational (i.e., stateful) services. A conversational service might fail in the middle of an invocation sequence, in which case adapting the SBA might be costly, e.g., due to the state transfer required to switch to an alternative service. In this paper, we propose just-in-time testing of conversational services as a novel approach to detect potential problems and to proactively trigger adaptations, thereby preventing costly compensation activities. The approach is based on a framework for online testing and a formal test-generation method that guarantees functional correctness for conversational services. The applicability of the approach is discussed with respect to its underlying assumptions and its performance, and its benefits are demonstrated using a realistic example.
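    As a rough illustration of the idea, the sketch below (hypothetical cart-service operations, with a stub standing in for the third-party service) models the expected conversation of a stateful service as a small state machine and replays a generated invocation sequence against a test instance, reporting any deviation so that the SBA could adapt before it depends on the failing service.

```python
# Hypothetical sketch: the expected conversation of a stateful service is
# modelled as a small state machine; a test sequence derived from it is
# replayed against a test instance before the SBA relies on the service.
EXPECTED = {                       # (state, operation) -> next state
    ("idle", "createCart"): "open",
    ("open", "addItem"): "open",
    ("open", "checkout"): "closed",
}

class StubCartService:
    """Stand-in for a third-party conversational service under test."""
    def __init__(self):
        self.state = "idle"

    def invoke(self, operation):
        nxt = EXPECTED.get((self.state, operation))
        if nxt is None:
            raise RuntimeError(f"{operation} rejected in state {self.state}")
        self.state = nxt
        return self.state

def run_online_test(service, sequence):
    """Replay a generated invocation sequence; any deviation from the
    expected state machine is reported so the SBA can adapt proactively."""
    state = "idle"
    for op in sequence:
        expected_next = EXPECTED.get((state, op))
        observed = service.invoke(op)
        if observed != expected_next:
            return False, f"after {op}: expected {expected_next}, got {observed}"
        state = observed
    return True, "conversation conforms"

print(run_online_test(StubCartService(), ["createCart", "addItem", "checkout"]))
```

    In a real deployment the stub would be replaced by calls to a dedicated test instance of the actual service, and the test sequences would come from the formal test-generation method described in the paper.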

    Estimating the feasibility of transition paths in extended finite state machines

    There has been significant interest in automating testing on the basis of an extended finite state machine (EFSM) model of the required behaviour of the implementation under test (IUT). Many test criteria require that certain parts of the EFSM are executed; for example, we may want to execute every transition of the EFSM. In order to find a test suite (a set of input sequences) that achieves this, we might first derive a set of paths through the EFSM that satisfy the criterion, using, for example, algorithms from graph theory, and then attempt to produce input sequences that trigger these paths. Unfortunately, the EFSM might contain infeasible paths, and the problem of determining whether a path is feasible is generally undecidable. This paper describes an approach in which a fitness function is used to estimate how easy it is to find an input sequence that triggers a given path through an EFSM. Such a fitness function could be used in a search-based approach in which we search for a path with good fitness that achieves a test objective, such as executing a particular transition, and then search for an input sequence that triggers the path; if this second search fails, we search for another path with good fitness and repeat the process. We give a computationally inexpensive fitness function that estimates the feasibility of a path. To evaluate it, we compared the fitness of each path with the ease with which an input sequence triggering the path could be produced by search, using random sampling to estimate this ease. The empirical evidence suggests a reasonably good correlation (0.72 and 0.62) between the fitness of a path, computed with the proposed fitness function, and the estimated ease of randomly generating an input sequence that triggers the path.
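    The paper's own fitness function is not reproduced here; the sketch below only illustrates the general idea of cheaply scoring a transition path by how restrictive its guard predicates appear (with hypothetical penalty weights), so that paths estimated to be easier to trigger can be preferred during the search.

```python
# Illustrative only: scores a path through an EFSM without executing it,
# on the assumption that exact equality guards are harder to satisfy than
# inequalities. The penalty weights below are hypothetical.
GUARD_PENALTY = {"==": 4.0, ">=": 2.0, "<=": 2.0, ">": 2.0, "<": 2.0, "!=": 1.0}

def path_fitness(path):
    """Lower score = path is estimated to be easier to trigger with some
    input sequence. Each transition is a dict with a list of guard operators."""
    score = 0.0
    for transition in path:
        for op in transition.get("guards", []):
            score += GUARD_PENALTY.get(op, 1.0)
    return score

# Toy paths: one guarded only by inequalities, one requiring exact equalities.
easy = [{"guards": [">"]}, {"guards": ["<="]}]
hard = [{"guards": ["==", "=="]}, {"guards": ["=="]}]
print(path_fitness(easy), path_fitness(hard))  # 4.0 12.0
```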

    The history of WiMAX: a complete survey of the evolution in certification and standardization for IEEE 802.16 and WiMAX

    Most researchers are familiar with the technical features of WiMAX technology, but the evolution that WiMAX went through in terms of standardization and certification is largely unknown. Knowledge of this historical process would, however, help in understanding how WiMAX became the widespread technology it is today, and would give insight into the steps anyone aiming to introduce a new wireless technology on a worldwide scale should undertake. Therefore, this article presents a survey of all relevant activities that took place within three important organizations: the IEEE (Institute of Electrical and Electronics Engineers) 802.16 Working Group for technology development and standardization, the WiMAX Forum for product certification, and the ITU (International Telecommunication Union) for international recognition. An elaborate and comprehensive overview of all those activities is given, which reveals the importance of the willingness to innovate and to continuously incorporate new ideas in the IEEE standardization process, and the importance of the WiMAX Forum certification label granting process for ensuring interoperability. We also emphasize the steps taken in cooperation with the ITU to improve the international esteem of the technology. Finally, a WiMAX trend analysis is made: we show how industry interest has fluctuated over time and quantify the evolution in WiMAX product certification and deployments. Most interest went to the 2.5 GHz and 3.5 GHz frequencies, most deployments are in geographic regions with many developing countries, and the highest population coverage is achieved in Asia Pacific. This detailed description of all standardization and certification activities, from the very start up to now, will help the reader understand how past and future steps are taken in the development process of new WiMAX features.