
    Design for diagnostics and prognostics: a physical-functional approach


    Design for validation: An approach to systems validation

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life cycle) are provided, and it is shown how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

    A SOFTWARE TESTING ASSESSMENT TO MANAGE PROJECT TESTABILITY

    The demand for testing services is, to a large extent, a "derived demand" influenced directly by the manner in which prior development activities are undertaken. The early stages of a structured software development life cycle (SDLC) project can often run behind schedule, shrinking the time available for performing adequate testing, especially when software release deadlines have to be met. This situation fosters the need to influence pre-testing activities and manage the testing effort efficiently. Our research examines how to measure the testability of an SDLC project before testing begins. It builds on the "design for testability" perspective by introducing a "manage for testability" perspective. Software testability focuses on whether the activities of the SDLC process are progressing in ways that enable the testing team to find software product defects if they exist. To address this challenge, we develop a software testing assessment. This assessment is designed to provide testing managers with information needed to: (1) influence pre-testing activities in ways that ultimately increase testing efficiency and effectiveness, and (2) plan testing resources to optimize efficient and effective testing. We developed specific software testing assessment measures through interviews with key informants. We present data collected for the measures for large-scale structured software development projects to illustrate the assessment's usefulness and application.

    SOFTWARE TESTABILITY MEASURE FOR SAE ARCHITECTURE ANALYSIS AND DESIGN LANGUAGE (AADL)

    Testability is an important quality attribute of software, especially for critical systems such as avionics, medical, and automotive systems. Improving the testability of the software architecture, the first artifact of the software system, early in development helps reduce issues and costs later in the process. AADL, an architecture analysis and design language suitable for critical, embedded, real-time systems, can be used for design documentation, analysis, and code generation. Because the capability of AADL can be extended, it is possible to add new analyses to its core language. Tools such as the Open Source AADL Tool Environment (OSATE) provide plugins for processing AADL models. Although adding new plugins to OSATE extends AADL, there currently exists no AADL extension for testability measurement. The purpose of this thesis is to propose a method to measure the testability of AADL models and to develop a testability plugin in OSATE. Much research has been conducted on the testability of hardware, software, and embedded systems, resulting in several approaches for measuring this quality attribute. Among them, the approach that measures testability as the product of controllability and observability over an information transfer graph (ITG) is the most applicable to AADL models. This thesis proposes a method applying this approach to AADL models; a complete testability measure plugin for OSATE was developed based on it, and detailed examples are given in the thesis to demonstrate its applicability.
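
    The abstract does not spell out the formulas behind the plugin, but the controllability/observability idea can be sketched in a few lines. The following Python fragment is a minimal illustration under assumed definitions (controllability of a component = fraction of primary inputs that can reach it through the information transfer graph, observability = fraction of primary outputs it can reach, testability = their product); the component names are invented and nothing here is taken from the OSATE plugin itself.

```python
# Minimal sketch of an ITG-style testability measure (illustrative only).
# Assumptions (not from the cited thesis): controllability = fraction of primary
# inputs that can reach a node, observability = fraction of primary outputs the
# node can reach, testability = controllability * observability.

from collections import defaultdict

def reachable(graph, start):
    """Return the set of nodes reachable from `start` (including itself)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    return seen

def testability(edges, inputs, outputs):
    """Per-node testability scores over a simple information transfer graph."""
    fwd = defaultdict(list)   # data flows along these edges
    rev = defaultdict(list)   # reversed edges, used for observability
    nodes = set(inputs) | set(outputs)
    for src, dst in edges:
        fwd[src].append(dst)
        rev[dst].append(src)
        nodes.update((src, dst))

    scores = {}
    for n in nodes:
        ctrl = sum(n in reachable(fwd, i) for i in inputs) / len(inputs)
        obs = sum(n in reachable(rev, o) for o in outputs) / len(outputs)
        scores[n] = ctrl * obs
    return scores

# Toy AADL-like dataflow: two sensors feed a filter, which feeds an actuator.
edges = [("sensor1", "filter"), ("sensor2", "filter"), ("filter", "actuator")]
print(testability(edges, inputs=["sensor1", "sensor2"], outputs=["actuator"]))
```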

    Use of COTS functional analysis software as an IVHM design tool for detection and isolation of UAV fuel system faults

    This paper presents a new approach to the development of health management solutions which can be applied to both new and legacy platforms during the conceptual design phase. The approach involves qualitative functional modelling of a system in order to perform an Integrated Vehicle Health Management (IVHM) design: the placement of sensors and the diagnostic rules to be used in interrogating their output. Qualitative functional analysis was chosen as a route for early assessment of failures in complex systems. Functional models of system components are required for capturing the available system knowledge used during various stages of system and IVHM design. MADe™ (Maintenance Aware Design environment), a COTS software tool developed by PHM Technology, was used for the health management design. A model has been built incorporating the failure diagrams of five failure modes for five different components of a UAV fuel system; thus an inherent health management solution for the system and an optimised sensor set solution have been defined. The automatically generated sensor set solution also contains a diagnostic rule set, which was validated on the fuel rig for different operation modes, taking into account the predicted fault detection/isolation and ambiguity group coefficients. It was concluded that, when using functional modelling, the IVHM design and the actual system design cannot be done in isolation. The functional approach requires permanent input from the system designers and reliability engineers in order to construct a functional model that qualitatively represents the real system. In other words, the physical insight should not be isolated from the failure phenomena, and the diagnostic analysis tools should be able to adequately capture the experience base. This approach has been verified on a laboratory benchtop test rig which can simulate a range of possible fuel system faults. The rig is fully instrumented in order to allow benchmarking of the various sensing solutions for fault detection/isolation that were identified using functional analysis.
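
    MADe's internal algorithms are not described in the abstract, so the sketch below only illustrates the general idea of a diagnostic rule set with ambiguity groups: each fault is associated with a signature of sensor alarms, and the faults whose signatures match the observed alarms form the ambiguity group. The fuel-system fault names and signatures are invented for illustration and are not taken from the MADe model or the fuel-rig experiments.

```python
# Illustrative sketch of diagnostic-rule evaluation with ambiguity groups.
# Dependency table (invented): fault -> set of sensors expected to raise an alarm.
SIGNATURES = {
    "pump_degradation":    {"flow_low", "pressure_low"},
    "filter_clogging":     {"flow_low", "diff_pressure_high"},
    "leak_in_supply_line": {"pressure_low", "level_drop"},
}

def isolate(observed_alarms):
    """Return the ambiguity group: all faults whose signature matches the alarms."""
    observed = set(observed_alarms)
    return [fault for fault, signature in SIGNATURES.items() if signature == observed]

# A flow-low + pressure-low pattern points uniquely at pump degradation here;
# a larger model would typically leave several candidate faults in the group.
print(isolate({"flow_low", "pressure_low"}))   # ['pump_degradation']
print(isolate({"pressure_low"}))               # [] -> no exact signature match
```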

    Survey of source code metrics for evaluating testability of object oriented systems

    Software testing is costly in terms of time and funds. Testability is a software characteristic that aims at producing systems that are easy to test. Several metrics have been proposed to identify testability weaknesses, but it is sometimes difficult to be convinced that those metrics are really related to testability. This article is a critical survey of the source-code-based metrics proposed in the literature for object-oriented software testability. It underlines the necessity of providing testability metrics that are proven to be intuitive and adequate for testing cost prediction.
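
    For readers unfamiliar with the kind of measure the survey reviews, here is a toy example of two classic object-oriented source-code metrics often proposed as testability indicators: a unit-weight Weighted Methods per Class and Depth of Inheritance Tree. These particular measures are illustrative assumptions, not metrics endorsed by the survey.

```python
# Two classic OO source-code metrics frequently cited as testability indicators.
import inspect

def wmc(cls):
    """Number of methods defined directly on the class (unit-weight WMC)."""
    return sum(1 for name, member in vars(cls).items() if inspect.isfunction(member))

def dit(cls):
    """Depth of inheritance tree, counting `object` as depth 0."""
    return len(cls.__mro__) - 1

class Valve:
    def open(self): ...
    def close(self): ...

class ServoValve(Valve):
    def calibrate(self): ...

print(wmc(ServoValve), dit(ServoValve))   # 1 2
```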

    Design and Evaluation of Radiation-Hardened Standard Cell Flip-Flops

    Use of a standard non-rad-hard digital cell library in a rad-hard design can be a cost-effective solution for space applications. In this paper we demonstrate how a standard non-rad-hard flip-flop, one of the most vulnerable digital cells, can be converted into a rad-hard flip-flop without modifying its internal structure. We present five variants of a Triple Modular Redundancy (TMR) flip-flop: a baseline TMR flip-flop, a latch-based TMR flip-flop, a True Single-Phase Clock (TSPC) TMR flip-flop, a scannable TMR flip-flop, and a self-correcting TMR flip-flop. For all variants, multi-bit upsets have been addressed by applying special placement constraints, while Single Event Transient (SET) mitigation was achieved through the use of customized SET filters and the selection of optimal inverter sizes for the clock and reset trees. The proposed flip-flop variants differ in performance, making it possible to choose the optimal solution for every sensitive node in the circuit according to the predefined design constraints. Several flip-flop designs have been validated on IHP’s 130 nm BiCMOS process by irradiation of custom-designed shift registers. It has been shown that the proposed TMR flip-flops are robust to soft errors, with a threshold Linear Energy Transfer (LET) from 32.4 MeV·cm²/mg to 62.5 MeV·cm²/mg, depending on the variant.
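
    The flip-flops in the paper are standard-cell circuits, but the masking principle they rely on is plain 2-of-3 majority voting. The short sketch below only demonstrates that principle, not the circuit-level design or any of the five variants.

```python
# Toy demonstration of the TMR principle: a single upset in one of three
# redundant copies is masked at the majority-voter output.

def majority(a, b, c):
    """2-of-3 majority vote on single bits."""
    return (a & b) | (b & c) | (a & c)

def tmr_register(value, upset_copy=None):
    """Store `value` in three copies, optionally flip one copy, then vote."""
    copies = [value, value, value]
    if upset_copy is not None:
        copies[upset_copy] ^= 1          # single-event upset flips one bit
    return majority(*copies)

assert tmr_register(1) == 1                # no upset
assert tmr_register(1, upset_copy=0) == 1  # single upset is masked
assert tmr_register(0, upset_copy=2) == 0
print("single upsets masked by the voter")
```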

    A research review of quality assessment for software

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.

    LC-VCO design optimization methodology based on the gm/ID ratio for nanometer CMOS technologies

    In this paper, an LC voltage-controlled oscillator (LC-VCO) design optimization methodology based on the gm/ID technique and on the exploration of all inversion regions of the MOS transistor (MOST) is presented. An in-depth study of the compromises between phase noise and current consumption permits optimization of the design for given specifications. Semiempirical models of MOSTs and inductors, obtained by simulation, together with analytical phase noise models, make it possible to obtain a design-space map where the design tradeoffs are easily identified. Four LC-VCO designs in different inversion regions in a 90-nm CMOS process are obtained with the proposed methodology and verified with electrical simulations. Finally, the implementation and measurements are presented for a 2.4-GHz VCO operating in moderate inversion. The designed VCO draws 440 µA from a 1.2-V power supply and presents a phase noise of -106.2 dBc/Hz at 400 kHz offset from the carrier.
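
    The paper's full flow relies on semiempirical device and inductor models; the fragment below is only a rough sketch of the central gm/ID sizing step, using an assumed tank resistance and start-up safety factor, to show how moving toward weak inversion (higher gm/ID) reduces the bias current needed for a given start-up transconductance.

```python
# Sketch of gm/ID-based bias sizing for a cross-coupled LC oscillator.
# Assumptions (not from the paper): start-up requires gm >= alpha / Rp per
# transistor, where Rp is the tank's equivalent parallel resistance and
# alpha > 1 is a safety margin; the numeric values below are invented.

def bias_current(r_tank_ohm, gm_over_id, alpha=3.0):
    """Per-branch bias current needed to satisfy the start-up condition."""
    gm_required = alpha / r_tank_ohm      # siemens
    return gm_required / gm_over_id       # amperes, since gm/ID has units of 1/V

R_TANK = 300.0                             # ohms, assumed tank resistance
for gm_id in (5.0, 15.0, 25.0):            # strong -> moderate -> weak inversion
    i_d = bias_current(R_TANK, gm_id)
    print(f"gm/ID = {gm_id:4.1f} 1/V  ->  I_D = {i_d * 1e6:6.1f} uA per branch")
```

    The printed trend (lower current at higher gm/ID) is the tradeoff the design-space map captures; the cost of operating deeper in weak inversion, such as larger devices and parasitics affecting phase noise, is what the full methodology weighs against this saving.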