    A voyage to Mars: A challenge to collaboration between man and machines

    A speech addressing the design of man-machine systems for the exploration of space beyond Earth orbit from the human factors perspective is presented. Concerns about the design of automated and intelligent systems for the NASA Space Exploration Initiative (SEI) missions are largely based on experience with integrating humans and comparable systems in aviation. The history, present status, and future prospects of human factors in machine design are discussed in relation to a manned voyage to Mars. Three different cases for design philosophy are presented. The use of simulation is discussed. Recommendations for required research are given.

    Learning from Automation Surprises and "Going Sour" Accidents: Progress on Human-Centered Automation

    Advances in technology and new levels of automation on commercial jet transports have had many effects. There have been positive effects from both an economic and a safety point of view. The technology changes on the flight deck have also had reverberating effects on many other aspects of the aviation system and on different aspects of human performance. Operational experience, research investigations, incidents, and occasionally accidents have shown that new and sometimes surprising problems have arisen as well. What are these problems with cockpit automation, and what should we learn from them? Do they represent over-automation or human error? Or is there perhaps a third possibility: that they represent coordination breakdowns between operators and the automation? Are the problems just a series of small independent glitches revealed by specific accidents or near misses? Do these glitches represent a few small areas where there are cracks to be patched in what is otherwise a record of outstanding designs and systems? Or do these problems provide us with evidence about deeper factors that we need to address if we are to maintain and improve aviation safety in a changing world? How do the reverberations of technology change on the flight deck provide insight into generic issues about developing human-centered technologies and systems (Winograd and Woods, 1997)? Based on a series of investigations of pilot interaction with cockpit automation (Sarter and Woods, 1992; 1994; 1995; 1997a; 1997b), supplemented by surveys, operational experience, and incident data from other studies (e.g., Degani et al., 1995; Eldredge et al., 1991; Tenney et al., 1995; Wiener, 1989), we too have found that the problems that surround crew interaction with automation are more than a series of individual glitches. These difficulties are symptoms that indicate deeper patterns and phenomena concerning human-machine cooperation and paths towards disaster.
    In addition, we find the same kinds of patterns behind results from studies of physician interaction with computer-based systems in critical care medicine (e.g., Moll van Charante et al., 1993; Obradovich and Woods, 1996; Cook and Woods, 1996). Many of the results and implications of this kind of research are synthesized and discussed in two comprehensive volumes, Billings (1996) and Woods et al. (1994). This paper summarizes the pattern that has emerged from our research, related research, incident reports, and accident investigations. It uses this new understanding of why problems arise to point to new investment strategies that can help us deal with the perceived "human error" problem, make automation more of a team player, and maintain and improve safety.

    Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness

    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increase, human operators are relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews can come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanagement of the aircraft's energy state or to the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

    Assessment of the State-of-the-Art of System-Wide Safety and Assurance Technologies

    Since its initiation, the System-wide Safety Assurance Technologies (SSAT) Project has focused on developing multidisciplinary tools and techniques that are verified and validated to prevent loss of property and life in NextGen and to enable proactive risk management through predictive methods. To this end, four technical challenges have been identified to help realize the goals of SSAT, namely (i) assurance of flight-critical systems, (ii) discovery of precursors to safety incidents, (iii) assuring safe human-systems integration, and (iv) prognostic algorithm design for safety assurance. The objective of this report is to provide an extensive survey of SSAT-related research accomplishments by researchers within and outside NASA, in order to establish the state of the art for the technologies enabling each of the four technical challenges. We hope that this report will serve as a good resource for anyone interested in understanding the SSAT technical challenges, and that it will also be useful in future project planning and resource allocation for related research.

    Autonomous, Context-Sensitive, Task Management Systems and Decision Support Tools I: Human-Autonomy Teaming Fundamentals and State of the Art

    Recent advances in artificial intelligence, machine learning, data mining and extraction, and especially sensor technology have resulted in the availability of a vast amount of digital data and information and in the development of advanced automated reasoners. This creates the opportunity to develop a robust, dynamic task manager and decision support tool that is context sensitive and integrates information from a wide array of on-board and off-aircraft sources: a tool that monitors systems and the overall flight situation, anticipates information needs, prioritizes tasks appropriately, keeps pilots well informed, and is nimble and able to adapt to changing circumstances. This is the first of two companion reports exploring issues associated with autonomous, context-sensitive task management and decision support tools. In this first report, we explore fundamental issues associated with the development of an integrated, dynamic flight information and automation management system. We discuss human factors issues pertaining to information automation and review the current state of the art of pilot information management and decision support tools. We also explore how effective human-human team behavior and expectations could be extended to teams involving humans and automation or autonomous systems.

    Human Factors Certification of Advanced Aviation Technologies

    Proceedings of the Human Factors Certification of Advanced Aviation Technologies Conference, held at the Chateau de Bonas, near Toulouse, France, 19-23 July 1993.

    Cognitive engineering in aerospace application: Pilot interaction with cockpit automation

    Because of recent incidents involving glass-cockpit aircraft, there is growing concern with cockpit automation and its potential effects on pilot performance. However, little is known about the nature and causes of problems that arise in pilot-automation interaction. The results of two studies that provide converging, complementary data on pilots' difficulties with understanding and operating one of the core systems of cockpit automation, the Flight Management System (FMS), are reported. A survey asking pilots to describe specific incidents with the FMS, together with observations of pilots undergoing transition training to a glass-cockpit aircraft, served as vehicles to gather a corpus on the nature and variety of FMS-related problems. The results of both studies indicate that pilots become proficient in standard FMS operations through ground training and subsequent line experience. But even with considerable line experience, they still have difficulties tracking FMS status and behavior in certain flight contexts, and they show gaps in their understanding of the functional structure of the system. The results suggest that design-related factors such as opaque interfaces contribute to these difficulties, which can affect pilots' situation awareness. The results of this research are relevant both to the design of cockpit automation and to the development of training curricula specifically tailored to the needs of glass cockpits.

    Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems

    Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced new classes of failure modes and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex, interrelated factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost, 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains and have resulted in increased efforts to develop training, procedures, regulations, and guidance material (CAST, 2008; IAEA, 2001; FAA, 2013; ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being introduced. We describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and some approaches that could be used to address these challenges. We draw from experience with the aviation, spaceflight, and nuclear power domains.