
    UML-based specification, validation, and log-file based verification of the Orion Pad Abort Software

    This paper describes the first end-to-end application of a novel lightweight formal specification, validation, and verification technique. The technique is novel in two respects. First, it uses an intuitive, familiar, and diagrammatic notation for formal specification, a notation that is Turing-equivalent and supports the capture of real-life requirements. Second, the technique includes a computer-aided approach for validating the correctness of requirements early in the development process, allowing sufficient time for the correction of ambiguous and underspecified requirements. In the verification phase, the technique relies on off-line verification using log files. This approach scales well and is applicable to almost every mission-critical system, including real-time systems. The paper describes the application of this technique to the specification, validation, and verification of the Pad Abort subsystem of NASA's Orion mission.
    Approved for public release; distribution is unlimited.
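
    The abstract does not detail the off-line, log-file based verification, but the general idea can be illustrated. Below is a minimal sketch, not the authors' tool: it checks a single hypothetical response requirement (every ABORT_CMD log entry must be followed by a MOTOR_FIRE entry within a deadline) against a simple "timestamp,event" log format; the event names, deadline, and log layout are all assumptions made for illustration.

```python
# Minimal off-line log-file verification sketch (illustrative, not the
# paper's tool). Property checked: every ABORT_CMD must be followed by a
# MOTOR_FIRE within DEADLINE seconds. Log format: "timestamp,event" per line.

DEADLINE = 0.5  # seconds; hypothetical requirement for illustration

def verify_log(path):
    """Return timestamps of ABORT_CMD entries whose deadline was missed."""
    pending = []       # abort commands still awaiting a response
    violations = []
    with open(path) as log:
        for line in log:
            stamp, event = line.strip().split(",")
            t = float(stamp)
            # any pending command whose deadline lapsed before now failed
            violations += [t0 for t0 in pending if t - t0 > DEADLINE]
            pending = [t0 for t0 in pending if t - t0 <= DEADLINE]
            if event == "ABORT_CMD":
                pending.append(t)
            elif event == "MOTOR_FIRE":
                pending.clear()  # response observed; obligations discharged
    return violations + pending  # include commands never answered at all

if __name__ == "__main__":
    for t in verify_log("pad_abort.log"):
        print(f"ABORT_CMD at t={t}s: no MOTOR_FIRE within {DEADLINE}s")
```

    Because the check runs over recorded logs rather than instrumented live code, it can be re-run as requirements are corrected, which is consistent with the scalability and real-time applicability the abstract claims for the technique.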

    An Adaptive Design Methodology for Reduction of Product Development Risk

    Embedded systems' interaction with their environment inherently complicates the understanding of requirements and their correct implementation, and product uncertainty is highest during the early stages of development. Design verification is therefore an essential step in the development of any system, especially an embedded system. This paper introduces a novel adaptive design methodology that incorporates step-wise prototyping and verification. With each adaptive step, the product-realization level rises while the level of product uncertainty decreases, thereby reducing overall costs. The backbone of this framework is the development of a Domain Specific Operational (DOP) Model and the associated Verification Instrumentation for Test and Evaluation, which is derived from the DOP model. Together they generate functionally valid test sequences for prototype evaluation. The application of this method is sketched with the help of a case study, a Multimode Detection Subsystem. Design methodologies can be compared by defining and computing a generic performance criterion such as Average design-cycle Risk; computing this criterion for the case study shows that the adaptive method reduces product development risk for a small increase in total design-cycle time.
    Comment: 21 pages, 9 figures
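
    The abstract names Average design-cycle Risk as the comparison criterion but does not give its formula. The sketch below is a hypothetical reading, assuming each design step carries a residual-uncertainty probability and a cost of discovering a failure at that step, with the criterion taken as the mean of their products; the numbers are invented to show why step-wise verification lowers the average.

```python
# Hypothetical "Average design-cycle Risk" illustration. The paper's exact
# definition is not in the abstract; here risk per step is assumed to be
# residual_uncertainty * cost_of_late_failure, averaged over the cycle.

def average_design_cycle_risk(stages):
    """stages: list of (residual_uncertainty, failure_cost) tuples."""
    return sum(u * c for u, c in stages) / len(stages)

# A conventional flow defers verification, so uncertainty stays high while
# the cost of a late failure grows with each stage.
conventional = [(0.9, 10), (0.8, 20), (0.7, 40), (0.6, 80)]
# Step-wise prototyping and verification drive uncertainty down each step.
adaptive = [(0.9, 10), (0.5, 20), (0.3, 40), (0.1, 80)]

print(average_design_cycle_risk(conventional))  # 25.25
print(average_design_cycle_risk(adaptive))      # 9.75
```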

    What May Visualization Processes Optimize?

    In this paper, we present an abstract model of visualization and inference processes and describe an information-theoretic measure for optimizing such processes. To obtain this abstraction, we first examined six classes of workflows in data analysis and visualization, and identified four levels of typical visualization components, namely disseminative, observational, analytical, and model-developmental visualization. We noticed a common phenomenon at different levels of visualization: the transformation of data spaces (referred to as alphabets) usually corresponds to a reduction of maximal entropy along a workflow. Based on this observation, we establish an information-theoretic measure of cost-benefit ratio that may be used as a cost function for optimizing a data visualization process. To demonstrate the validity of this measure, we examined a number of successful visualization processes in the literature and showed that the information-theoretic measure can mathematically explain their advantages over possible alternatives.
    Comment: 10 pages
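
    The abstract ties the benefit of a workflow step to the drop in entropy as one alphabet is transformed into another. A toy sketch of such a cost-benefit ratio follows, assuming the form (entropy reduction minus a distortion penalty) divided by cost; the exact formulation is in the paper, and the distributions, distortion value, and cost below are invented for illustration.

```python
# Toy entropy-based cost-benefit ratio for one workflow step. Assumed form:
#   ratio = (H(input alphabet) - H(output alphabet) - distortion) / cost
# This mirrors the abstract's observation that transformations of data
# spaces (alphabets) reduce entropy along a workflow.
import math

def entropy(p):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def cost_benefit(p_in, p_out, distortion, cost):
    """Entropy drop (alphabet compression) minus distortion, per unit cost."""
    return (entropy(p_in) - entropy(p_out) - distortion) / cost

# Example: a step summarizes a uniform 8-letter alphabet into 2 letters.
p_in = [1 / 8] * 8   # 3 bits of maximal entropy
p_out = [0.5, 0.5]   # 1 bit after summarization
print(cost_benefit(p_in, p_out, distortion=0.25, cost=2.0))  # 0.875
```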

    VERIFICATION AND VALIDATION OF A SOFTWARE: A REVIEW OF THE LITERATURE

    With the growth of the Internet, building software has become essential, yet succeeding in a software project remains difficult, and there is a clear need to deliver software of high quality. This can be accomplished by applying Verification and Validation (V&V) procedures throughout the development process. The main aim of V&V is to check whether the created software meets the needs and specifications of clients. V&V is considered a collection of testing and analysis activities spanning the software's full life cycle. Rapid developments in software V&V have been important in producing approaches and tools for identifying possible concurrency bugs and thereby verifying the correctness of software, reflecting the efficiency of modern software V&V. The main aim of this study is a retrospective review of various researches in software V&V and a comparison between them.
    In today's competitive software market, developers must deliver quality products on time, verify that the software functions properly, and validate the product against each of the client's requirements. The significance of V&V in software development lies in maintaining software quality. V&V approaches are used in all stages of the System Development Life Cycle. Furthermore, this study presents the objectives of V&V and describes V&V tools that can be used in the software development process, along with ways of improving software quality.

    Exploring formal verification methodology for FPGA-based digital systems.

    Abstract not provided.