
    A Review of Software Reliability Testing Techniques

    In the era of intelligent systems, the safety and reliability of software have received growing attention. Software reliability testing is a key method for ensuring the reliability, safety and quality of software. Intelligent software technology has offered new opportunities but also posed challenges for software reliability technology. The focus of this paper is to explore software reliability testing technology under the impact of intelligent software technology. In this study, the basic theories of traditional and intelligent software reliability testing were investigated via related previous work, and a general software reliability testing framework was established. Then, the technologies of software reliability testing were analyzed, including reliability modeling, test case generation, reliability evaluation, testing criteria and testing methods. Finally, the challenges and opportunities of software reliability testing technology are discussed.
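    The abstract names reliability modeling among the analyzed techniques. As a concrete illustration, here is a minimal sketch of one classic approach, fitting the Goel-Okumoto NHPP growth model to cumulative failure counts and deriving a reliability estimate; the failure data and parameter choices below are hypothetical, not taken from the paper.

```python
# Hypothetical illustration: fitting the Goel-Okumoto NHPP reliability growth
# model m(t) = a * (1 - exp(-b * t)) to cumulative failure counts, then
# estimating reliability over the next test interval. Data are invented.
import numpy as np
from scipy.optimize import curve_fit

def mean_failures(t, a, b):
    """Expected cumulative number of failures observed by test time t."""
    return a * (1.0 - np.exp(-b * t))

# Invented test log: cumulative failures at the end of each week of testing.
weeks = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
failures = np.array([12, 21, 27, 32, 35, 37, 38, 39], dtype=float)

(a, b), _ = curve_fit(mean_failures, weeks, failures, p0=(40.0, 0.5))

# R(x | t) = exp(-(m(t + x) - m(t))): probability of no failure in the next
# x weeks of operation, given that testing stopped at time t.
t, x = weeks[-1], 1.0
reliability = np.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))
print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.2f}")
print(f"R({x} | {t}) = {reliability:.3f}")
```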

    Using AToM3 as a Meta-CASE Tool

    This is an electronic version of the paper presented at the 4th International Conference on Enterprise Information Systems, held in Ciudad Real in 2002.

    Harmonizing CMMI-DEV 1.2 and XP Method to Improve The Software Development Processes in Small Software Development Firms

    Most software development organizations are small firms, and they have realized the need to manage and improve their software development and management activities. Traditional Software Process Improvement (SPI) models and standards are not realistic for these firms because of high cost, limited resources and strict project deadlines. Therefore, these firms need a lightweight software development method and an appropriate SPI model to manage and improve their software development and management processes. This study aims to construct a suitable software development process improvement framework for Small Software Development Firms (SSDFs) based on the eXtreme Programming (XP) method and the Capability Maturity Model Integration for Development Version 1.2 (CMMI-Dev1.2) model. Four stages are involved in developing the framework: (1) aligning XP practices to the specific goals of CMMI-Dev1.2 Key Process Areas (KPAs); (2) developing the proposed software development process improvement framework by extending the XP method, adapting the Extension-Based Approach (EBA), CMMI-Dev1.2, and generic elements of the SPI framework; (3) verifying the compatibility of the proposed framework with the KPAs of CMMI-Dev1.2 using the focus group method coupled with the Delphi technique; and (4) validating the modified framework by using a CMMI-Dev1.2 questionnaire as the main instrument for validating its suitability for SSDFs, and conducting two case studies to validate its applicability and effectiveness for these firms. The result of aligning XP practices to the KPAs of CMMI-Dev1.2 shows that twelve KPAs are largely supported by XP practices, eight KPAs are partially supported, and two KPAs are not supported. The main contributions of this study are: a software development process improvement framework for SSDFs, a better understanding of how to construct such a framework, and quality improvement of the software development processes. Possible avenues for extending this research include covering the missing specific practices of several KPAs, examining other agile practices, using CMMI-Dev1.3 to improve the framework, and conducting more case studies.
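    The alignment result above (twelve KPAs largely supported, eight partially, two not supported) is essentially a mapping from KPAs to coverage levels. A hedged sketch of how such an alignment might be recorded and summarized follows; the specific practice-to-KPA pairs shown are illustrative placeholders, not the study's actual mapping.

```python
# Hedged sketch of recording an XP-practice-to-KPA alignment; the pairs below
# are illustrative placeholders, NOT the study's actual mapping.
from collections import Counter

# Each CMMI-Dev1.2 KPA maps to a coverage level ("large", "partial", "none")
# and the XP practices judged to support it.
alignment = {
    "Project Planning": ("large", ["Planning Game", "Small Releases"]),
    "Requirements Management": ("large", ["On-site Customer", "User Stories"]),
    "Verification": ("partial", ["Pair Programming", "Testing"]),
    "Organizational Training": ("none", []),
    # ... the remaining KPAs would follow the same shape
}

summary = Counter(level for level, _ in alignment.values())
print(summary)  # the full study reports 12 large, 8 partial, 2 none
```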

    Challenges in the Evaluation of Observational Data Trustworthiness From a Data Producers Viewpoint (FAIR+)

    Recent discussions in many scientific disciplines stress the necessity of “FAIR” data. FAIR data, however, does not necessarily include information on data trustworthiness, where trustworthiness comprises reliability, validity and provenience/provenance. This opens up the risk of misinterpreting scientific data, even though all criteria of “FAIR” are fulfilled. Applications such as secondary data processing, data blending, and joint interpretation or visualization efforts are especially affected. This paper intends to start a discussion in the scientific community about how to evaluate, describe, and implement trustworthiness in a standardized data evaluation approach and in its metadata description following the FAIR principles. As an example, it discusses different assessment tools for soil moisture measurements, data processing and visualization, and elaborates on which additional (metadata) information is required to increase the trustworthiness of data for secondary usage. Taking into account the perspectives of data collectors, providers and users, the authors identify three aspects of data trustworthiness that promote efficient data sharing: 1) trustworthiness of the measurement, 2) trustworthiness of the data processing, and 3) trustworthiness of the data integration and visualization. The paper should be seen as the basis for a community discussion on data trustworthiness for a scientifically correct secondary use of data. We do not intend to replace existing procedures, nor do we claim that the described tools and approaches are complete. Our intention is to discuss several important aspects of assessing data trustworthiness, based on the data life cycle of soil moisture data as an example.
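    To make the three aspects concrete, here is a minimal sketch, assuming a JSON-style metadata record, of how trustworthiness information could be attached to a soil moisture dataset alongside its FAIR metadata. All field names are invented for illustration, not a standard proposed by the paper.

```python
# Illustrative metadata record attaching the three trustworthiness aspects to
# a soil moisture dataset; field names are invented, not a proposed standard.
import json

record = {
    "variable": "soil_moisture",
    "units": "m3/m3",
    "provenance": {"sensor": "TDR probe", "depth_cm": 10, "site": "example-site"},
    "trustworthiness": {
        "measurement": {                 # aspect 1: the measurement itself
            "calibration_date": "2023-04-01",
            "known_issues": ["temperature sensitivity"],
        },
        "processing": {                  # aspect 2: the data processing chain
            "qc_applied": ["spike test", "plausibility range"],
            "gap_filling": None,         # none applied; secondary users should know
        },
        "integration": {                 # aspect 3: integration and visualization
            "spatial_support": "point measurement, not a field average",
            "aggregation": "hourly mean",
        },
    },
}
print(json.dumps(record, indent=2))
```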

    Game Portability Using a Service-Oriented Approach

    Game assets are portable between games. The games themselves, however, depend on the game engine they were developed on. Middleware has attempted to address this by, for instance, separating the AI from the core game engine. Our work takes this further by separating the game from the game engine and making it portable between game engines. The game elements that we make portable are the game logic, the object model, and the game state, which represent the game's brain and which we collectively refer to as the game factor, or G-factor. We achieve this using an architecture based on a service-oriented approach. We present an overview of this architecture and its use in developing games. The evaluation demonstrates that the architecture does not unduly affect performance, adds little development overhead, is scalable, and supports modifiability.
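    A hedged sketch of the separation described: the G-factor (game logic, object model, game state) sits behind a service interface that any engine can call into. The interface and the toy Pong logic below are invented for illustration; the paper's actual service contracts may differ.

```python
# Invented sketch: the G-factor (logic, object model, state) behind a service
# interface, so engines act as thin clients that render and forward events.
from abc import ABC, abstractmethod

class GFactorService(ABC):
    """Engine-independent 'brain' of the game, exposed as a service."""

    @abstractmethod
    def get_state(self) -> dict:
        """Return the current object model and game state."""

    @abstractmethod
    def apply_event(self, event: dict) -> None:
        """Run the game logic for one engine-supplied event."""

class PongGFactor(GFactorService):
    def __init__(self) -> None:
        self.state = {"ball": [0.0, 0.0], "score": [0, 0]}

    def get_state(self) -> dict:
        return self.state

    def apply_event(self, event: dict) -> None:
        if event["type"] == "tick":
            self.state["ball"][0] += 1.5 * event["dt"]  # toy game logic

# Any engine talks to the service; the game never depends on the engine.
game = PongGFactor()
game.apply_event({"type": "tick", "dt": 0.016})
print(game.get_state())
```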

    Observational Robustness and Invariances in Reinforcement Learning via Lexicographic Objectives

    Policy robustness in Reinforcement Learning (RL) may not be desirable at any price; the alterations that robustness requirements impose on otherwise optimal policies should be explainable and quantifiable. Policy gradient algorithms with strong convergence guarantees are usually modified to obtain robust policies in ways that do not preserve those guarantees, which defeats the purpose of formal robustness requirements. In this work we study a notion of robustness in partially observable MDPs where state observations are perturbed by a noise-induced stochastic kernel. We characterise the set of policies that are maximally robust by analysing how the policies are altered by this kernel. We then establish a connection between such robust policies and certain properties of the noise kernel, as well as structural properties of the underlying MDPs, constructing sufficient conditions for policy robustness. We use these notions to propose a robustness-inducing scheme, applicable to any policy gradient algorithm, that formally trades off the reward achieved by a policy against its robustness level through lexicographic optimisation, which preserves the convergence properties of the original algorithm. We test the proposed approach through numerical experiments on safety-critical RL environments, and show how it helps achieve high robustness when state errors are introduced in the policy roll-out.
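    The core idea, reward first and robustness only within a tolerance of optimal reward, can be sketched as a gradient-combination rule. The numpy sketch below is an invented illustration of lexicographic prioritisation, not the paper's algorithm; the gradient estimates, tolerance, and projection rule are all placeholder choices.

```python
# Invented numpy sketch of a lexicographic policy-gradient step: reward has
# strict priority, and robustness is pursued only once reward is within a
# tolerance of the best value seen. Not the paper's algorithm.
import numpy as np

def lexicographic_step(theta, grad_reward, grad_robust,
                       reward, best_reward, tol=0.05, lr=1e-2):
    """One ascent step with reward taking lexicographic priority."""
    if reward < (1.0 - tol) * best_reward:
        # Reward constraint violated: restore reward before anything else.
        direction = grad_reward
    else:
        # Near-optimal reward: add the robustness gradient, but project out
        # any component that would decrease reward.
        overlap = grad_robust @ grad_reward
        if overlap < 0:
            grad_robust = grad_robust - (overlap / (grad_reward @ grad_reward)) * grad_reward
        direction = grad_reward + grad_robust
    return theta + lr * direction

# Toy usage: random vectors stand in for estimated policy gradients.
rng = np.random.default_rng(0)
theta = lexicographic_step(np.zeros(4), rng.normal(size=4), rng.normal(size=4),
                           reward=0.98, best_reward=1.0)
print(theta)
```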

    Safe code transformations for speculative execution in real-time systems

    Although compiler optimization techniques are standard and successful in non-real-time systems, if naively applied they can destroy safety guarantees and deadlines in hard real-time systems. For this reason, real-time systems developers have tended to avoid automatic compiler optimization of their code. However, real-time applications in several areas have been growing substantially in size and complexity in recent years. This size and complexity makes it impossible for real-time programmers to write optimal code, and consequently indicates a need for compiler optimization. Recently, researchers have developed or modified analyses and transformations to improve performance without degrading worst-case execution times. Moreover, these optimization techniques can sometimes transform programs which may not meet constraints/deadlines, or which result in timeouts, into deadline-satisfying programs. One such technique, speculative execution, also used for example in parallel computing and databases, can enhance performance by executing parts of the code whose execution may or may not be needed. In some cases, rollback is necessary if the computation turns out to be invalid. However, speculative execution must be applied carefully to real-time systems so that the worst-case execution path is not extended. Deterministic worst-case execution for satisfying hard real-time constraints, and speculative execution with rollback for improving average-case throughput, appear to lie on opposite ends of a spectrum of performance requirements and strategies. Nonetheless, this thesis shows that there are situations in which speculative execution can improve the performance of a hard real-time system, either by enhancing average performance while not affecting the worst case, or by actually decreasing the worst-case execution time. The thesis proposes a set of compiler transformation rules to identify opportunities for speculative execution and to transform the code. Proofs of semantic correctness and timeliness preservation are provided to verify the safety of applying the transformation rules to real-time systems. Moreover, an extensive experiment using simulation of randomly generated real-time programs has been conducted to evaluate the applicability and profitability of speculative execution. The simulation results indicate that speculative execution improves average execution time and program timeliness. Finally, a prototype implementation is described in which these transformations can be evaluated for realistic applications.
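    As a hedged before/after illustration of the transformation shape the thesis describes (written in Python for readability; the thesis itself concerns compiled real-time code), the speculative version starts possibly-needed work early and discards or recomputes it if the guard fails. The example assumes the hoisted call overlaps with slack such as waiting on I/O, so the worst-case path is not extended; `sensor` and `expensive_filter` are invented stand-ins.

```python
# Invented before/after pair showing the shape of the transformation: the
# speculative version starts possibly-needed work early and falls back
# (recomputes) if the prediction was wrong. On real hardware the hoisted call
# is assumed to overlap with the wait in sensor.read(), so the worst-case
# path is not extended.

def original(sensor, expensive_filter):
    value = sensor.read()                     # time spent waiting is wasted
    if value > sensor.threshold:
        return expensive_filter(value)        # lies on the worst-case path
    return value

def speculative(sensor, expensive_filter):
    predicted = sensor.last_value             # speculate on the likely input
    speculated = expensive_filter(predicted)  # hoisted into the slack time
    value = sensor.read()
    if value > sensor.threshold:
        if value == predicted:
            return speculated                 # speculation paid off
        return expensive_filter(value)        # rollback: recompute for real
    return value                              # speculative result discarded
```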

    An implementation of feasible path constraints generation for reproducible testing.

    Non-determinism makes the testing of a concurrent program unrepeatable. Specification-based reproducible testing is a promising technique that gives the tester more control over the environment of concurrent testing. For a given test case, the crucial parts of the test scenario that contribute to controlling the execution path are the input events and the path constraints in terms of synchronization events. The problem considered in this thesis is to generate a significant set of path constraints automatically from the design specification, given as a design abstract, under the assumption that monitors are the key mechanism for handling synchronization events. Notably, formal methods are applied in the implementation tool to construct the path constraints.
    Thesis (M.Sc.), University of Windsor (Canada), 2004. Adviser: Jessica Chen.
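    To make the central notion concrete, here is a small sketch, with invented event names, of path constraints as required orderings over monitor synchronization events, together with a check that an observed trace respects them; the thesis's actual constraint language is richer than simple pairs.

```python
# Invented sketch: path constraints as required orderings over monitor
# synchronization events, plus a check that an observed trace respects them.

# Each pair (a, b) requires event a to occur before event b in the run.
path_constraints = [
    ("t1.enter_monitor(buffer)", "t2.enter_monitor(buffer)"),
    ("t2.wait(not_empty)", "t1.signal(not_empty)"),
]

def satisfies(trace, constraints):
    """True if the trace orders every constrained pair correctly."""
    position = {event: i for i, event in enumerate(trace)}
    return all(position[a] < position[b]
               for a, b in constraints
               if a in position and b in position)

trace = ["t1.enter_monitor(buffer)", "t2.enter_monitor(buffer)",
         "t2.wait(not_empty)", "t1.signal(not_empty)"]
print(satisfies(trace, path_constraints))  # True: this schedule is admissible
```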

    Applying patterns in embedded systems design for managing quality attributes and their trade-offs
