368,822 research outputs found

    Application of Safety Verification Methodology Framework During Software Development Phases

    In order to detect and prevent faults, researchers have developed safety standards, safety analysis techniques, and fault-tolerance techniques; however, there is still no methodology framework for verifying safety-critical software systems. This research combines software-safety methods into a comprehensive whole for the purpose of verifying such systems, concentrating on the development of a methodology framework that combines static verification, dynamic verification, and fault-tolerance concepts.

    Quality assurance in agile safety-critical systems development

    © 2016 IEEE. In this position paper we examine how safety could be assured when increasingly complex systems are developed using agile software development methods. We first discuss the source and nature of complexity in software systems, and how a probe-sense-learn approach recommended by the Cynefin Framework is appropriate for designing complex systems, while a sense-analyse-learn approach is appropriate for developing a complicated system whose design has already been determined. We then examine how quality assurance is incorporated into agile software development, before pointing out that the characteristics of a self-managed team that produce so many benefits for the development of complex systems, whose solution evolves with problem understanding, are also what make it vulnerable to confirmation bias. This suggests that, for safety-critical system development, software developed by agile teams will need verification and validation by independent parties. We review current quality management practices for medical device software development before discussing how our earlier findings could be adopted into safety-critical software quality management.

    Architecture framework for software safety

    Currently, an increasing number of systems are controlled by software and rely on its correct operation. In this context, a safety-critical system is defined as a system in which malfunctioning software could result in death, injury or damage to the environment. To mitigate these serious risks, the architecture of safety-critical systems needs to be carefully designed and analyzed. A common practice for modeling software architecture is the adoption of software architecture viewpoints to model the architecture for particular stakeholders and concerns. Existing architecture viewpoints tend to be general purpose and do not explicitly focus on safety concerns in particular. To provide complementary and dedicated support for designing safety-critical systems, we propose an architecture framework for software safety. The architecture framework is based on a metamodel that has been developed after a thorough domain analysis. The framework includes three coherent viewpoints, each of which addresses an important concern. The application of the viewpoints is illustrated for an industrial case of a safety-critical avionics control computer system. © Springer International Publishing Switzerland 2014

    Positioning Verification in the Context of Software/System Certification

    Formal verification applied to software has been seen as an important focus in research for determining the acceptability of that software for use. However, in examining the requirements for determining the safety of a software-intensive system for use in critical situations, it is quite clear that verification plays a role, but not necessarily a central role. It is entirely possible that a piece of software satisfies its specification, but is unsafe to use. (The first and foremost reason for this is that the program satisfies an unsafe specification.) In this paper we will address the nature of certification in the context of critical systems, decomposing it, by means of a new philosophical framework, into four aspects: evidence, confidence, determination and certification. Our point of view is that establishing the safety (in a very general sense) of a system is a confidence-building exercise much in the same vein as the scientific method; our framework serves as a setting in which we can properly understand and develop such an exercise. We will then place formal verification and assurance cases in this setting, discussing their roles and limitations. Keywords: Software certification, System certification, Formal specification, Verification, Critical systems, Safety, Assurance cases, Safety case

    A framework for software reuse in safety-critical system of systems

    This thesis concerns effective and safe software reuse in safety-critical systems of systems. Software reuse offers many untapped benefits, such as achieving rapid system development, saving resources and time, and keeping up technologically in an increasingly advancing global environment. System software needs to be designed for both reuse and safety, and the available information must be shared effectively. We introduce a process-neutral framework for software reuse in safety-critical systems of systems. The framework consists of four elements: organizational factors, component attributes, component specification, and safety analysis. We developed a model (C5RA) to capture the relevant component information and assist in specification matching. We conducted a survey of software safety metrics, created metrics, and developed a ranking. We applied the framework to the reuse of a generic avionics software component. Our key findings are that congruence between all elements is required; software should possess certain attributes, with supporting metrics, that enable a safe design; software component information can be specified using C5RA; and a process was identified for a system-of-systems hazard analysis for software reuse. The framework outlined provides a solution that enables effective software reuse in safety-critical systems of systems. (Australian Army author; approved for public release, distribution is unlimited.) http://archive.org/details/aframeworkforsof109454142
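    As a rough, generic illustration of the kind of specification matching the abstract mentions (this is not the C5RA model itself; the fields, names, and matching rules below are hypothetical), a reusable component's attributes can be recorded and checked against a required specification roughly as follows:

```python
# Generic sketch of specification matching over reusable-component records.
# Not C5RA: the fields and matching rules here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ComponentSpec:
    name: str
    provided_interfaces: set = field(default_factory=set)
    integrity_level: str = "A"        # e.g. DO-178C levels A..E, A most stringent
    worst_case_exec_ms: float = 0.0
    known_hazards: set = field(default_factory=set)

def matches(candidate: ComponentSpec, required: ComponentSpec) -> bool:
    """A candidate is a reuse match if it covers the required interfaces,
    meets or exceeds the required integrity level, and fits the timing budget."""
    levels = "EDCBA"  # ordered least to most stringent
    return (required.provided_interfaces <= candidate.provided_interfaces
            and levels.index(candidate.integrity_level) >= levels.index(required.integrity_level)
            and candidate.worst_case_exec_ms <= required.worst_case_exec_ms)

legacy = ComponentSpec("nav_filter", {"position", "velocity"}, "B", 4.5, {"stale_data"})
need   = ComponentSpec("required",   {"position"},             "C", 5.0)
print(matches(legacy, need))  # True: interfaces covered, level B >= C, 4.5 ms <= 5.0 ms budget
```

    In a fuller treatment the safety-analysis element would also compare the candidate's known hazards against the hazard analysis of the target system of systems before reuse is approved.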

    Architecture-driven fault-based testing for software safety

    M.S. thesis by Havva Gülay Gürbüz, Department of Computer Engineering and Graduate School of Engineering and Science, Bilkent University, Ankara, 2014; includes bibliographical references (leaves 159-166). A safety-critical system is defined as a system in which malfunctioning software could result in death, injury or damage to the environment. To mitigate these serious risks, the architecture of safety-critical systems needs to be carefully designed and analyzed. A common practice for modeling software architecture is the adoption of architectural perspectives and software architecture viewpoint approaches. Existing approaches tend to be general purpose and do not explicitly focus on the safety concern in particular. To provide complementary and dedicated support for designing safety-critical systems, we propose a safety perspective and an architecture framework for software safety. Once a safety-critical system has been designed, it is important to analyze the design for fitness before implementation, installation and operation. Hereby, it is important to ensure that potential faults can be identified and that cost-effective solutions are provided to avoid or recover from failures. In this context, one of the most important issues is to investigate the effectiveness of the safety tactics applied to safety-critical systems. Because safety-critical systems are complex, testing them is challenging and it is very hard to define proper test suites for them. Several fault-based software testing approaches exist that aim to analyze the quality of test suites. Unfortunately, these approaches tend to be general purpose, do not directly consider the safety concern, and do not take the applied safety tactics into account. We propose a fault-based testing approach for analyzing test suites using safety-tactic and fault knowledge.
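    To make the fault-based idea concrete, here is a minimal sketch (not the thesis's tooling; the tactic, the seeded faults, and the test cases are all hypothetical): faults are seeded into a simple safety tactic, and the test suite's quality is judged by how many of those faults it detects.

```python
# Generic fault-based (mutation-style) analysis of a test suite against a
# simple safety tactic: a range-check monitor. All names are illustrative.

def monitor(reading, low=0.0, high=100.0):
    """Safety tactic: accept only sensor readings inside the safe range."""
    return low <= reading <= high

# Hand-seeded faults (mutants) modelling plausible failures of the tactic.
FAULTS = {
    "boundary_dropped_low":  lambda r, low=0.0, high=100.0: low < r <= high,
    "boundary_dropped_high": lambda r, low=0.0, high=100.0: low <= r < high,
    "negated_check":         lambda r, low=0.0, high=100.0: not (low <= r <= high),
}

# The test suite under analysis: (input, expected verdict) pairs.
TEST_SUITE = [
    (50.0, True),    # nominal value accepted
    (-1.0, False),   # below range rejected
    (150.0, False),  # above range rejected
    (0.0, True),     # lower boundary accepted
]

def run_suite(impl):
    return all(impl(x) == expected for x, expected in TEST_SUITE)

assert run_suite(monitor), "suite must pass on the original tactic"

killed = [name for name, mutant in FAULTS.items() if not run_suite(mutant)]
survived = [name for name in FAULTS if name not in killed]

print(f"fault-detection score: {len(killed)}/{len(FAULTS)}")
print("surviving faults (test-suite gaps):", survived or "none")
```

    Running this sketch, the "boundary_dropped_high" fault survives because the suite never exercises the upper boundary (100.0), which is exactly the kind of gap a safety-tactic-aware fault model is meant to expose.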

    Formal verification of automotive embedded UML designs

    Get PDF
    Software applications are increasingly dominating safety-critical domains, that is, domains where the failure of an application could impact human lives. Software safety was overlooked for quite some time, but more focus and attention is now directed to this area due to the exponential growth of embedded software applications. Software systems have continuously faced challenges in managing the complexity associated with functional growth, keeping systems flexible so that they can be easily modified, scaling solutions across several product lines, ensuring quality and reliability, and detecting defects early in the design phases. AUTOSAR was established to develop open standards that address these challenges. ISO-26262, the automotive functional safety standard, aims to ensure the functional safety of automotive systems by providing requirements and processes that govern the software lifecycle. Each functional system needs to be classified in terms of safety goals, risks and an Automotive Safety Integrity Level (ASIL: A, B, C or D), with ASIL D denoting the most stringent level. As the risk of the system increases, the ASIL increases and the standard mandates more stringent methods to ensure safety. ISO-26262 mandates that systems classified as ASIL C or D use walkthroughs, semi-formal verification, inspection, control flow analysis, data flow analysis, static code analysis and semantic code analysis to verify software unit design and implementation. Ensuring software specification compliance via formal methods has remained an academic endeavor for quite some time; several factors, chief among them the complexity of using formal methods, discourage their adoption in industry. Specification compliance in automotive software remains largely dependent on traceability matrices, human-based reviews, and testing activities conducted either on actual production software or at simulation level. ISO-26262 recommends, although not strongly, using formal notations for automotive systems that exhibit high risk in case of failure, yet the industry still relies heavily on semi-formal notations such as UML, which keeps specification compliance dependent on manual processes and testing effort. In this research, we propose a framework in which UML finite state machines are compiled into formal notations, specification requirements are mapped into formal model theorems, and SAT/SMT solvers are used to validate the compliance of the implementation with its specification. The framework allows semi-formal verification of AUTOSAR UML designs via an automated formal backbone, enabling automotive software to comply with the ISO-26262 ASIL C and D guidelines on formal verification of unit design and implementation. Semi-formal UML finite state machines are automatically compiled into formal notations based on the Symbolic Analysis Laboratory notation, requirements captured in the UML design are compiled automatically into theorems, and model checkers are run against the compiled model and theorems to detect counterexamples that violate the requirements in the UML model. Semi-formal verification of the design allows us to uncover issues that would previously only be detected in the testing and production stages. The methodology is applied to several automotive systems to show how the framework automates the verification of UML-based designs, the de facto standard for automotive system design, through an implicit formal methodology while hiding the drawbacks that have discouraged the industry from adopting formal methods. Additionally, the framework automates the ISO-26262 system design verification guideline, which would otherwise be addressed via error-prone manual approaches.
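    The essence of the pipeline described in this abstract can be sketched in a few lines (this is a toy stand-in, not the paper's framework or the SAL notation): a state machine is encoded as a transition relation, the requirement is stated as an invariant "theorem", and an SMT solver searches a bounded unrolling for a counterexample. The sketch below assumes the z3-solver Python package, and all state and input names (armed, arm, trigger, actuate) are illustrative.

```python
# Bounded model checking of a toy state machine with an SMT solver (z3).
# Requirement: the actuator only fires in the armed mode, G(actuate -> armed).
from z3 import Bool, Solver, And, Or, Not, sat

K = 5  # bound: number of unrolled steps

armed   = [Bool(f"armed_{i}")   for i in range(K + 1)]  # state variable
arm     = [Bool(f"arm_{i}")     for i in range(K)]      # input: arm command
trigger = [Bool(f"trigger_{i}") for i in range(K + 1)]  # input: fire command
actuate = [Bool(f"actuate_{i}") for i in range(K + 1)]  # output

s = Solver()
s.add(Not(armed[0]))                                    # initially disarmed

for i in range(K):                                      # transition relation
    s.add(armed[i + 1] == Or(armed[i], arm[i]))         # arming latches

for i in range(K + 1):                                  # output function
    # Deliberately faulty design: fires on trigger regardless of mode.
    # A correct design would be: actuate[i] == And(armed[i], trigger[i])
    s.add(actuate[i] == trigger[i])

# Negate the requirement and ask whether any bounded execution violates it.
s.add(Or(*[And(actuate[i], Not(armed[i])) for i in range(K + 1)]))

if s.check() == sat:
    m = s.model()
    print("Counterexample (requirement violated):")
    for i in range(K + 1):
        print(f"  step {i}: armed={m.eval(armed[i])}, actuate={m.eval(actuate[i])}")
else:
    print(f"No violating execution within {K} steps.")
```

    In the framework described above the analogous steps are automated: the transition relation comes from the compiled UML state machine, the theorem from requirements captured in the design, and the counterexample trace is what points the engineer back to the violating behaviour.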

    Quid Pro Quod: Enhancing Patient Safety Via Minimizing Human-Computer Interactions Errors

    This paper describes an initial research project aimed at enhancing patient safety. The overall goal is to minimize the human-computer interaction errors that may occur through the use of Medical Information Systems (MIS) in health care units. The main idea is to extend approaches to the design of usability and safety in generic medical devices, or safety-critical systems design, to the problem domain of patient safety in the design of MIS. An understanding of errors and patient safety issues is presented, along with how these issues contribute to interaction errors in MIS. A plan of the research programme and related questions is presented. It is expected that the outcome of a case study will be used to test an evaluation framework, currently in development, that will incorporate a rapid method for improving these aspects of the software development process.