7 research outputs found

    Software testing using high-performance computers

    No full text
    Reliable software testing is a time-consuming operation. In addition to the time the tester spends identifying, locating, and correcting bugs, significant time is spent executing the program under test and its instrumented or fault-induced variants. When mutation-based testing is used to achieve high reliability, the number of such variants can be large. A software testing tool that efficiently exploits the architecture of a parallel machine gives the software tester more computing power and hence an opportunity to improve the reliability of the product being developed. In this thesis, we consider the problem of utilizing high-performance computers to improve the quality of software. We describe three approaches to the parallelization of mutant execution on three architectures: MIMD, vector, and MIMD with vector processors. We describe the architecture of the PMothra system, designed to provide the tester with a transparent interface to parallel machines. A prototype, constructed by interfacing the Mothra system to an Ncube through a scheduler, was used to conduct the experiments reported in this dissertation. We present an analysis of the algorithms developed and the experimental results obtained on these three architectures. Our results lead us to conclude that the MIMD machine, as typified by the Ncube, is superior to some other architectures for mutation-based software testing.
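    The core idea above is to fan independent mutant executions out over the nodes of a parallel machine. A minimal sketch of that pattern on a modern multiprocessor (a hypothetical analogue, not the PMothra scheduler; `run_mutant` and its kill criterion are invented for illustration):

    ```python
    # Hypothetical sketch: distribute mutant executions across worker
    # processes, analogous to scheduling mutants on MIMD nodes.
    from multiprocessing import Pool

    def run_mutant(mutant_id):
        # Stand-in for executing one mutant of the program under test
        # against the test suite and reporting whether it was killed.
        # Here a mutant is "killed" if its id is even, purely for illustration.
        return mutant_id, mutant_id % 2 == 0

    def mutation_score(num_mutants, workers=4):
        # Fan the mutants out over a pool of workers and collect results.
        with Pool(workers) as pool:
            results = pool.map(run_mutant, range(num_mutants))
        killed = sum(1 for _, was_killed in results if was_killed)
        return killed / num_mutants

    if __name__ == "__main__":
        print(mutation_score(100))  # fraction of mutants killed
    ```

    Because each mutant run is independent, the work divides cleanly across processors, which is what makes mutation testing a natural fit for MIMD machines.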

    Automatic Method for Distinguishing Hardware and Software Faults Based on Software Execution Data and Hardware Performance Counters

    No full text
    Debugging an embedded system, where hardware and software are tightly coupled and resources are restricted, is far from trivial. When hardware defects appear as if they were software defects, determining the true source becomes challenging. In this study, we propose an automated method for distinguishing whether a defect originates in the hardware or the software at the stage of hardware/software integration testing. Our method overcomes the limitations of the embedded environment, minimizes runtime overhead, and identifies defects by collecting and analyzing software execution data and hardware performance counters. We analyze the effectiveness of the proposed method through an empirical study. The experimental results reveal that our method can effectively distinguish defects.
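    One way to picture the abstract's idea is that a hardware fault shows up as abnormal performance counters on an otherwise unchanged code path, while a software fault changes the execution itself. A hypothetical sketch of such a rule (the function, counter names, and threshold are all invented here, not the paper's algorithm):

    ```python
    # Hypothetical classification rule: compare a failing run against a
    # known-good baseline on both its execution trace and its hardware
    # performance counters.

    def classify_fault(trace, baseline_trace, counters, baseline_counters,
                       threshold=0.5):
        # Relative deviation of each performance counter from baseline.
        deviations = {
            name: abs(counters[name] - baseline_counters[name])
                  / max(baseline_counters[name], 1)
            for name in baseline_counters
        }
        counters_anomalous = any(d > threshold for d in deviations.values())
        trace_changed = trace != baseline_trace

        if counters_anomalous and not trace_changed:
            return "hardware"   # same code path, abnormal hardware behaviour
        if trace_changed:
            return "software"   # the execution itself diverged
        return "no-fault"

    base = {"cache_misses": 1000, "branch_mispredicts": 200}
    bad = {"cache_misses": 9000, "branch_mispredicts": 210}
    print(classify_fault(["f", "g"], ["f", "g"], bad, base))  # "hardware"
    ```

    Real counters would come from the platform's performance monitoring unit; the sketch only illustrates how the two data sources are combined.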

    An Approach to Developing a Performance Test Based on the Tradeoffs from SW Architectures

    No full text

    A Test Case Prioritization through Correlation of Requirement and Risk

    No full text

    Value-Driven V-Model: From Requirements Analysis to Acceptance Testing

    No full text