TBD
As a long-time system safety engineer working on major programs that implement system safety programs in accordance with MIL-STD-882, I understand that the topic of this post is rather controversial, since it questions one of the main tenets of the profession: that a formal risk assessment based upon a pre-established risk assessment matrix is a necessary part of the process.
A New Approach to Hazard Analysis for Rotorcraft
STPA is a new hazard analysis technique that can identify more hazard causes than traditional techniques. It is based on the assumption that accidents result from unsafe control rather than component failures. To demonstrate and evaluate STPA for its application to rotorcraft, it was used to analyze the UH-60MU Warning, Caution, and Advisory (WCA) system associated with the electrical and fly-by-wire flight control system (FCS). STPA results were compared with an independently conducted hazard analysis of the UH-60MU using traditional safety processes described in SAE ARP 4761 and MIL-STD-882E. STPA found the same hazard causes as the traditional techniques and also identified issues not found using traditional methods, including design flaws, human behavior, and component integration and interactions. The analysis includes organizational and physical components of systems and can be used to design safety into the system from the beginning of development while remaining compliant with MIL-STD-882. Copyright 2016 by American Helicopter Society International. All rights reserved.
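To make the "unsafe control" framing concrete, the following is a minimal sketch of one standard STPA step: generating candidate Unsafe Control Actions (UCAs) by pairing each control action with the four standard UCA guide words. The control actions below are hypothetical examples, not taken from the UH-60MU analysis described above.

```python
# The four standard UCA guide words used in STPA:
UCA_TYPES = [
    "not provided when needed",
    "provided when unsafe",
    "provided too early or too late",
    "stopped too soon or applied too long",
]

def enumerate_ucas(control_actions):
    """Cross every control action with each guide word; the analyst
    then judges which candidates lead to a hazard in context."""
    return [(action, uca) for action in control_actions for uca in UCA_TYPES]

# Hypothetical flight-control-system actions, for illustration only:
for action, uca in enumerate_ucas(["engage fly-by-wire mode", "issue WCA alert"]):
    print(f"Candidate UCA: '{action}' {uca}")
```

The enumeration is deliberately exhaustive; the analytical work in STPA lies in assessing each candidate against the system's hazards and control structure.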
TBD
The Virtual Chapter of the International System Safety Society (ISSS) has had an interesting round of discussions during the past few months, some of which might be of interest to all members of the Society. These topics have been rattling around in my head since I was president of the Society in 1990, but have never seemed to get traction with others. I was thrown back into thinking about these issues at the last International System Safety Conference (ISSC) when I discovered that several years ago, the G48 Committee published a new "commercial version" of MIL-STD-882 and that this new version is now owned — and marketed — by SAE. Not only is the standard now the property of SAE, but so is the G48 Committee! SAE has begun advertising and promoting the idea that it, rather than the ISSS, is the owner and source for all things related to system safety engineering — selling the new standard, as well as papers written by folks whom I consider to be key ISSS members, and providing training and workshops on system safety engineering.
Understanding, Assessing, and Mitigating Safety Risks in Artificial Intelligence Systems
Prepared for: Naval Air Warfare Development Center (NAVAIR). Traditional software safety techniques rely on validating software against a deductively defined specification of how the software should behave in particular situations. In the case of AI systems, specifications are often implicit or inductively defined. Data-driven methods are subject to sampling error, since practical datasets cannot provide exhaustive coverage of all possible events in a real physical environment. Traditional software verification and validation approaches may not apply directly to these novel systems, complicating system safety analysis (such as that implemented in MIL-STD-882). However, AI offers advanced capabilities, and it is desirable to ensure the safety of systems that rely on these capabilities. When AI technology is deployed in a weapon system, robot, or planning system, unwanted events are possible. Several techniques can support the evaluation process for understanding the nature and likelihood of unwanted events in AI systems and making risk decisions on naval employment. This research considers the state of the art, evaluating which techniques are most likely to be employable, usable, and correct. Techniques include software analysis, simulation environments, and mathematical determinations. Naval Air Warfare Development Center; Naval Postgraduate School, Naval Research Program (PE 0605853N/2098). Approved for public release. Distribution is unlimited.
New safety model for the commercial human spaceflight industry
The aviation and space domains have safety guidelines and recommended practices for Design Organisations (DOs) and Operators alike. For aerospace DOs, there are certification criteria to meet, and to demonstrate compliance there are Advisory Circulars or Acceptable Means of Compliance to follow. Additionally, there are guidelines such as Aerospace Recommended Practices (ARPs), Military Standards (the MIL-STD-882 series) and System Safety Handbooks to follow in order to identify and manage failure conditions. For Operators, there are FAA guidelines and a useful ARP that details many tools and techniques for understanding Operator safety risks. However, there is currently no methodology for linking the DO and Operator safety efforts. In the space domain, NASA has provided safety standards and guidelines to follow, and within Europe there are European Cooperation for Space Standardization (ECSS) standards to follow. Within the emerging commercial human spaceflight industry, the FAA's Office of Commercial Space Transportation has provided hazard analysis guidelines. However, all of these space-domain safety documents are based on the existing aerospace methodology and, once again, there is no link between the DO and Operator safety efforts.
This paper addresses this problematic issue and presents a coherent methodology for joining up the System Safety effort of the DOs with the Operator's safety risk management such that a 'Total System' approach is adopted. Part of the rationale is that the correct mitigation (control) can then be applied at the correct place in the accident sequence. This contiguous approach also ensures that the Operator is fully aware of the safety risks (at the accident level) and therefore has an appreciation of the Total System risk.
The authors of this paper contend that it is better practice to have a fully integrated safety model as opposed to disparate requirements or guidelines. Our methodology is firstly to review 'best practice' approaches from the aviation and space industries, and then to integrate these approaches into a contiguous safety model for the commercial human spaceflight industry.
Lessons Learned in a Complex Software Safety Program
Development of a system software safety program was required as part of an effort to secure government safety certification of a complex and intrinsically hazardous software-controlled system under development by several contributing companies. The author was part of a team of software safety support engineers reporting to one of the contributing companies. This paper summarizes some of the highlights of the lessons learned during development of this program.
Letters to the Editor
Software Safety vs Software Reliability
While looking back through Vol. 56, No. 1 (Summer 2020) of Journal of System Safety, I finally took the time to read Nathaniel Ozarin’s article “Lessons Learned in a Complex Software Safety Program.” The article is quite interesting and thought provoking, comparing what actually occurs while implementing a system safety program to the idealized descriptions found in documents such as MIL-STD-882, JSSSEH and AOP-52. While I found the article interesting and informative, I noted that the author consistently characterizes the “software safety problem” as a “reliability” problem, focused on finding and preventing “failures” and ensuring high “reliability.”
Some Thoughts on the Probabilistic Criteria for Ensuring Safe Airplane-System Designs
We have been employed in the risk sciences for a total of 86 years, including 62 years in reliability engineering and safety engineering positions at The Boeing Company. For many of those years, Yellman was the designated "Risk-Analysis Focal" (person) for Boeing's 707, 727, 737 and 757 airplane models. For several decades, the United States government has published the same criteria, created by the U.S. Federal Aviation Administration (FAA), intended to ensure that the systems on large (transport-category) aircraft have been designed to be safe [Refs. 1 and 2]. But we believe that the criteria have failed to prevent certain aircraft accidents, and we think that the reasons for that should be better understood. We hope that this discussion will contribute to a better understanding by examining the part potentially played in those accidents by the FAA's criteria that are defined probabilistically.
TBD
During the past couple of years, I have been involved with things such as introducing system safety concepts into engineering courses. This and other activities have caused me to question what it is that makes the profession of system safety "special" — or at least different — from other approaches to achieving safety. My first reaction is that it is something you recognize when you see it. It usually takes only a quick review of a safety plan or effort to determine if it is a "system safety" effort. This isn't always helpful when talking to those who haven't "seen the light." I wonder if there isn't something fundamentally different between "traditional" safety (whatever that might be) and "system safety."
Assessing the Software Control Autonomy of System Functions in Safety-Critical Systems
Software Control Category (SCC) denotes the degree of control autonomy, command and control authority, and redundant fault tolerance software has over hazardous system functions of safety-critical systems. The use of SCC for determining the software contribution to system risks is a unique feature of the MIL-STD-882E System Safety Standard. A lower SCC designation means that the software system has greater control autonomy over hazardous system functions, with SCC 1 meaning complete autonomous control. Software with greater control autonomy over hazardous system functions requires greater effort to assure reliability and safety. Correct assessment of the SCC level of hazardous system functions is crucial for optimizing the safety property of a system developed under budget, schedule, and resource constraints. Beyond the categorical definitions provided by the MIL-STD-882E Standard, there is little information on conducting an SCC assessment. To close this knowledge gap, we present an SCC assessment method. Our paper describes in detail the process and rules for assessing SCC. For illustration, we apply our method to assess the SCC of several safety-significant functions of an automobile's brake-assist system.
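The category structure described above can be sketched as a simple screening helper. This follows the five Software Control Categories named in MIL-STD-882E (1 = Autonomous, 2 = Semi-Autonomous, 3 = Redundant Fault Tolerant, 4 = Influential, 5 = No Safety Impact), but the decision questions are illustrative paraphrases of the standard's categorical definitions, not the assessment method the paper proposes.

```python
def assess_scc(controls_hazardous_function: bool,
               operator_can_intervene: bool,
               has_independent_mitigation: bool,
               informs_safety_decision: bool) -> int:
    """Return a candidate SCC level (1 = greatest control autonomy)."""
    if controls_hazardous_function:
        if not operator_can_intervene and not has_independent_mitigation:
            return 1  # Autonomous: no time or means for external intervention
        if operator_can_intervene:
            return 2  # Semi-Autonomous: operator has time to intervene
        return 3      # Redundant Fault Tolerant: independent mitigation exists
    if informs_safety_decision:
        return 4      # Influential: informs, but does not control, the decision
    return 5          # No Safety Impact

# Hypothetical brake-assist function: software commands braking but an
# independent mechanical fallback exists, so it screens as SCC 3.
print(assess_scc(True, False, True, False))  # → 3
```

In 882E the assessed SCC is then combined with the mishap severity category to yield a Software Safety Criticality Index, which in turn drives the level of assurance rigor required; the helper above only covers the first step.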