
    Uncovering Bugs in Distributed Storage Systems during Testing (not in Production!)

    Testing distributed systems is challenging due to multiple sources of nondeterminism. Conventional testing techniques, such as unit, integration and stress testing, are ineffective in preventing serious but subtle bugs from reaching production. Formal techniques, such as TLA+, can only verify high-level specifications of systems at the level of logic-based models, and fall short of checking the actual executable code. In this paper, we present a new methodology for testing distributed systems. Our approach applies advanced systematic testing techniques to thoroughly check that the executable code adheres to its high-level specifications, which significantly improves coverage of important system behaviors. Our methodology has been applied to three distributed storage systems in the Microsoft Azure cloud computing platform. In the process, numerous bugs were identified, reproduced, confirmed and fixed. These bugs required a subtle combination of concurrency and failures, making them extremely difficult to find with conventional testing techniques. An important advantage of our approach is that a bug is uncovered in a small setting and witnessed by a full system trace, which dramatically increases the productivity of debugging.
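
    The paper itself applies its methodology to Microsoft Azure storage systems; the snippet below is only a minimal, hypothetical sketch of the underlying idea: systematically exploring the schedules of concurrent executable code and checking each one against a high-level safety specification. The toy Replica class, the two-client scenario and the hard-coded schedules are illustrative assumptions, not the paper's tool.

```java
import java.util.Arrays;

// Hypothetical sketch: systematically explore every interleaving of two
// concurrent read-modify-write clients against a toy "replica" and check a
// high-level safety specification (two increments must yield the value 2).
// The Replica class and the enumerated schedules are illustrative only.
public class SystematicInterleavingSketch {

    // A deliberately racy toy component standing in for executable system code.
    static class Replica {
        int value;
        int read()        { return value; }
        void write(int v) { value = v; }
    }

    public static void main(String[] args) {
        // Each client performs: tmp = read(); write(tmp + 1); split into two
        // atomic steps so the scheduler can interleave them. All interleavings
        // of the two clients' steps are enumerated explicitly.
        int[][] schedules = {
            {0, 0, 1, 1}, {0, 1, 0, 1}, {0, 1, 1, 0},
            {1, 0, 0, 1}, {1, 0, 1, 0}, {1, 1, 0, 0},
        };
        for (int[] schedule : schedules) {
            Replica replica = new Replica();
            int[] tmp = new int[2];   // per-client local copy of the value
            int[] pc  = new int[2];   // per-client program counter (0 or 1)
            for (int client : schedule) {
                if (pc[client] == 0) { tmp[client] = replica.read(); }
                else                 { replica.write(tmp[client] + 1); }
                pc[client]++;
            }
            // High-level specification: two increments must produce the value 2.
            boolean ok = replica.read() == 2;
            System.out.println(Arrays.toString(schedule) + " -> value="
                    + replica.read() + (ok ? "" : "  SPEC VIOLATED"));
        }
    }
}
```

    Every violating schedule is reproduced by a small, fully recorded trace, which mirrors the debugging advantage the abstract highlights.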

    Verifying Web Applications: From Business Level Specifications to Automated Model-Based Testing

    One of the reasons preventing a wider uptake of model-based testing in industry is the difficulty developers encounter when trying to think in terms of properties rather than linear specifications. A disparity has traditionally been perceived between the language spoken by customers who specify the system and the language required to construct models of that system. The dynamic nature of the specifications for commercial systems further aggravates this problem, in that models would need to be rechecked after every specification change. In this paper, we propose an approach for converting specifications written in the commonly-used quasi-natural language Gherkin into models for use with a model-based testing tool. We have instantiated this approach using QuickCheck and demonstrate its applicability via a case study on the eHealth system, the national health portal for Maltese residents. Comment: In Proceedings MBT 2014, arXiv:1403.704
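
    The paper instantiates its approach with QuickCheck; the sketch below is a hypothetical, plain-Java rendering of the same idea: a Gherkin step such as "Given I am logged in, When I log out, Then I am not logged in" becomes a transition of an abstract model, and random command sequences are executed against the system under test while its state is compared with the model's. The Session class is an assumption for illustration, not the eHealth portal API.

```java
import java.util.Random;

// Hypothetical model-based testing sketch in plain Java (the paper uses
// QuickCheck). Gherkin "When" steps map to commands, "Then" steps map to
// checks of the system against a simple abstract model of the session state.
public class GherkinModelSketch {

    // Stand-in for the system under test; not the real eHealth portal API.
    static class Session {
        private boolean loggedIn;
        void login()     { loggedIn = true; }
        void logout()    { loggedIn = false; }
        boolean active() { return loggedIn; }
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int run = 0; run < 100; run++) {
            Session sut = new Session();
            boolean modelLoggedIn = false;      // abstract model state
            for (int step = 0; step < 20; step++) {
                if (rnd.nextBoolean()) {
                    sut.login();                // "When I log in"
                    modelLoggedIn = true;
                } else {
                    sut.logout();               // "When I log out"
                    modelLoggedIn = false;
                }
                // "Then ..." postcondition: the system must agree with the model.
                if (sut.active() != modelLoggedIn) {
                    throw new AssertionError("model/system mismatch at step " + step);
                }
            }
        }
        System.out.println("100 random scenarios passed");
    }
}
```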

    The JML and JUnit Way of Unit Testing and its Implementation

    Writing unit test code is labor-intensive, hence it is often not done as an integral part of programming. However, unit testing is a practical approach to increasing the correctness and quality of software; for example, Extreme Programming relies on frequent unit testing. In this paper we present a new approach that makes writing unit tests easier. It uses a formal specification language's runtime assertion checker to decide whether methods are working correctly; thus code to decide whether tests pass or fail is automatically produced from specifications. Our tool combines this testing code with hand-written test data to execute tests. Therefore, instead of writing testing code, the programmer writes formal specifications (e.g., pre- and postconditions). This makes the programmer's task easier, because specifications are more concise and abstract than the equivalent test code, and hence more readable and maintainable. Furthermore, by using specifications in testing, specification errors are quickly discovered, so the specifications are more likely to provide useful documentation and inputs to other tools. In this paper we describe an implementation using the Java Modeling Language (JML) and the JUnit testing framework, but the approach could be easily implemented with other combinations of formal specification languages and unit testing tools.
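
    As a minimal, hypothetical illustration of the kind of specification the approach relies on, the class below carries JML pre- and postconditions and an invariant written as annotation comments; when the JML runtime assertion checker is enabled, violations of these clauses are reported at run time, so they serve as the test oracle. The BankAccount class and its clauses are invented for illustration and do not come from the paper.

```java
// Hypothetical JML-annotated class (not taken from the paper). Compiled with
// the JML runtime assertion checker, violations of these clauses are reported
// automatically, so no hand-written oracle code is needed.
public class BankAccount {

    //@ public invariant balance >= 0;
    private /*@ spec_public @*/ int balance;

    //@ requires amount > 0;
    //@ ensures balance == \old(balance) + amount;
    public void deposit(int amount) {
        balance += amount;
    }

    //@ requires amount > 0 && amount <= balance;
    //@ ensures balance == \old(balance) - amount;
    public void withdraw(int amount) {
        balance -= amount;
    }

    //@ ensures \result == balance;
    public /*@ pure @*/ int getBalance() {
        return balance;
    }
}
```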

    Module documentation based testing using grey-box approach

    Testing plays an important role in assuring the quality of software. It is a process of detecting errors and can be highly effective if performed rigorously. The use of formal specifications provides significant opportunities to develop effective testing techniques. A grey-box testing approach is usually based on knowledge obtained from the specification and the source code, while the design specification is seldom considered. In this paper, we propose an approach for testing a module with internal memory from its formal specification using a grey-box approach. We use formal specifications that are documented using Parnas's Module Documentation (MD) method. The MD provides both the external and internal views of a module, information that is useful in a grey-box testing approach.
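
    A minimal, hypothetical sketch of what a grey-box check on a module with internal memory can look like: the black-box part exercises the externally specified behaviour, while the grey-box part also inspects the internal state variable described in the module documentation. The BoundedCounter module and its bound are illustrative assumptions, not the paper's case study.

```java
// Hypothetical grey-box sketch: the external view checks the module's
// specified input/output behaviour, while the internal view, taken from the
// module documentation, lets the test also check the hidden state variable.
public class GreyBoxSketch {

    // Module with internal memory: counts up to a fixed bound, then saturates.
    static class BoundedCounter {
        static final int BOUND = 3;
        private int count;                      // internal state variable

        // External interface described in the module documentation.
        void increment()  { if (count < BOUND) count++; }
        void reset()      { count = 0; }
        boolean atBound() { return count == BOUND; }

        // Grey-box access used only by tests (documented internal view).
        int internalCount() { return count; }
    }

    public static void main(String[] args) {
        BoundedCounter m = new BoundedCounter();

        // Black-box check from the external specification: after BOUND
        // increments the module reports that it is saturated.
        for (int i = 0; i < BoundedCounter.BOUND; i++) m.increment();
        assert m.atBound() : "external behaviour violates the specification";

        // Grey-box check from the internal view: further increments must not
        // push the internal state variable beyond its documented range.
        m.increment();
        assert m.internalCount() == BoundedCounter.BOUND
                : "internal state left its documented range";

        System.out.println("grey-box checks passed");
    }
}
```

    Run with assertions enabled (java -ea GreyBoxSketch) so the checks take effect.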

    Testing a system specified using Statecharts and Z

    A hybrid specification language, SZ, in which the dynamic behaviour of a system is described using Statecharts and the data and data transformations are described using Z, has been developed for the specification of embedded systems. This paper describes an approach to testing from a deterministic sequential specification written in SZ. By considering the Z specifications of the operations, the extended finite state machine (EFSM) defined by the Statechart can be rewritten to produce an EFSM that has a number of properties that simplify test generation. Test generation algorithms are introduced and applied to an example. While this paper considers SZ specifications, the approaches described might be applied whenever the specification is an EFSM whose states and transitions are specified using a language similar to Z.
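
    As a hypothetical sketch of test generation from an EFSM whose transitions carry guards and data updates (the role Z plays in an SZ specification), the snippet below defines a toy drinks-machine EFSM and greedily builds an input sequence that covers every transition. Both the machine and the greedy coverage strategy are illustrative assumptions, not the algorithms of the paper.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Hypothetical EFSM with guarded transitions over one data variable ("credit")
// and a greedy generator that produces a transition-covering test sequence.
public class EfsmCoverageSketch {

    enum State { IDLE, PAID }

    static class Transition {
        final String input;
        final State from, to;
        Transition(String input, State from, State to) {
            this.input = input; this.from = from; this.to = to;
        }
        // Guard and data update, standing in for the Z part of the specification.
        boolean guard(int credit) {
            switch (input) {
                case "coin":   return true;
                case "vend":   return credit >= 1;
                case "refund": return credit >= 1;
                default:       return false;
            }
        }
        int update(int credit) {
            return input.equals("coin") ? credit + 1 : 0;  // vend/refund consume the credit
        }
    }

    public static void main(String[] args) {
        List<Transition> efsm = List.of(
            new Transition("coin",   State.IDLE, State.PAID),
            new Transition("vend",   State.PAID, State.IDLE),
            new Transition("coin",   State.PAID, State.PAID),
            new Transition("refund", State.PAID, State.IDLE));

        // Greedy generation: repeatedly fire an uncovered enabled transition,
        // falling back to any enabled one, until every transition is covered.
        State state = State.IDLE;
        int credit = 0;
        Set<Transition> covered = new LinkedHashSet<>();
        List<String> testSequence = new ArrayList<>();
        while (covered.size() < efsm.size()) {
            Transition next = null;
            for (Transition t : efsm)
                if (t.from == state && t.guard(credit) && !covered.contains(t)) { next = t; break; }
            if (next == null)
                for (Transition t : efsm)
                    if (t.from == state && t.guard(credit)) { next = t; break; }
            credit = next.update(credit);
            state = next.to;
            covered.add(next);
            testSequence.add(next.input);
        }
        System.out.println("transition-covering test sequence: " + testSequence);
    }
}
```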

    Regression modeling for digital test of ΣΔ modulators

    The cost of Analogue and Mixed-Signal circuit testing is an important bottleneck in the industry, due to time-consuming verification of specifications that requires state-of-the-art Automatic Test Equipment. In this paper, we apply the concept of Alternate Test to achieve digital testing of converters. By training an ensemble of regression models that maps simple digital defect-oriented signatures onto Signal-to-Noise and Distortion Ratio (SNDR), an average error of 1.7% is achieved. Beyond the inference of functional metrics, we show that the approach can provide interesting diagnosis information. Funding: Ministerio de Educación y Ciencia TEC2007-68072/MIC; Junta de Andalucía TIC 5386, CT 30
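
    A minimal, hypothetical sketch of the alternate-test idea: characterised devices provide pairs of a cheap digital signature and the measured SNDR, a regression model is fitted to that data, and new devices are then "tested" by predicting their SNDR from the signature alone. The numbers, the single scalar signature and the one-feature least-squares fit are illustrative; the paper trains an ensemble of models on defect-oriented signatures.

```java
// Hypothetical alternate-test sketch: fit sndr ≈ a * signature + b on
// characterisation data, then predict SNDR for a new device from its
// digital signature instead of measuring it with analogue test equipment.
public class AlternateTestSketch {

    public static void main(String[] args) {
        // Training set: (signature, measured SNDR in dB) pairs; values invented.
        double[] signature = { 0.91, 0.88, 0.95, 0.80, 0.99 };
        double[] sndr      = { 71.0, 69.5, 73.2, 65.8, 75.1 };

        // Ordinary least squares for a single scalar feature.
        int n = signature.length;
        double meanX = 0, meanY = 0;
        for (int i = 0; i < n; i++) { meanX += signature[i]; meanY += sndr[i]; }
        meanX /= n; meanY /= n;
        double cov = 0, var = 0;
        for (int i = 0; i < n; i++) {
            cov += (signature[i] - meanX) * (sndr[i] - meanY);
            var += (signature[i] - meanX) * (signature[i] - meanX);
        }
        double a = cov / var;
        double b = meanY - a * meanX;

        // "Digital test": predict SNDR of a new device from its signature only.
        double newSignature = 0.90;
        double predictedSndr = a * newSignature + b;
        System.out.printf("predicted SNDR for signature %.2f: %.1f dB%n",
                newSignature, predictedSndr);
    }
}
```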

    A Simple and Practical Approach to Unit Testing: The JML and JUnit Way

    Writing unit test code is labor-intensive, hence it is often not done as an integral part of programming. However, unit testing is a practical approach to increasing the correctness and quality of software; for example, the Extreme Programming approach relies on frequent unit testing. In this paper we present a new approach that makes writing unit tests easier. It uses a formal specification language's runtime assertion checker to decide whether methods are working correctly, thus automating the writing of unit test oracles. These oracles can be easily combined with hand-written test data. Instead of writing testing code, the programmer writes formal specifications (e.g., pre- and postconditions). This makes the programmer's task easier, because specifications are more concise and abstract than the equivalent test code, and hence more readable and maintainable. Furthermore, by using specifications in testing, specification errors are quickly discovered, so the specifications are more likely to provide useful documentation and inputs to other tools. We have implemented this idea using the Java Modeling Language (JML) and the JUnit testing framework, but the approach could be easily implemented with other combinations of formal specification languages and unit test tools.
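
    Following the division of labour described in the abstract, a hypothetical companion JUnit test for the JML-annotated BankAccount sketched earlier would only supply hand-written test data; it needs no explicit assertions, because a violated pre- or postcondition raised by the runtime assertion checker surfaces as a runtime error that JUnit reports as a test failure. Class and method names are assumptions for illustration.

```java
import org.junit.Test;

// Hypothetical companion test for the JML-annotated BankAccount sketched
// earlier. The test supplies only hand-written test data; the JML runtime
// assertion checker acts as the oracle, so any violated clause makes the
// test fail without explicit assertion code here.
public class BankAccountTest {

    @Test
    public void depositThenWithdraw() {
        BankAccount account = new BankAccount();
        account.deposit(100);      // checked against deposit's ensures clause
        account.withdraw(40);      // checked against withdraw's requires/ensures
        account.getBalance();      // invariant balance >= 0 is checked throughout
    }

    @Test
    public void repeatedSmallDeposits() {
        BankAccount account = new BankAccount();
        for (int amount : new int[] { 1, 2, 3, 5, 8 }) {
            account.deposit(amount);
        }
    }
}
```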
