Why Are People's Decisions Sometimes Worse with Computer Support?
In many applications of computerised decision support, a recognised source of undesired outcomes is operators' apparent over-reliance on automation. For instance, an operator may fail to react to a potentially dangerous situation because a computer fails to generate an alarm. However, the very use of terms like "over-reliance" betrays possible misunderstandings of these phenomena and their causes, which may lead to ineffective corrective action (e.g. training or procedures that do not counteract all the causes of the apparently "over-reliant" behaviour). We review relevant literature in the area of "automation bias" and describe the diverse mechanisms that may be involved in human errors when using computer support. We discuss these mechanisms, with reference to errors of omission when using "alerting systems", with the help of examples of novel counterintuitive findings we obtained from a case study in a health care application, as well as other examples from the literature.
Current Concepts and Trends in Human-Automation Interaction
This publication is freely accessible with the permission of the rights owner, under an Alliance licence and a national licence (funded by the DFG, German Research Foundation). The purpose of this panel was to provide a general overview and discussion of some of the most current and controversial concepts and trends in human-automation interaction. The panel was composed of eight researchers and practitioners. The panelists are well-known experts in the area and offered differing views on a variety of human-automation topics. The concepts and trends discussed in this panel include: general taxonomies regarding stages and levels of automation and function allocation, individualized adaptive automation, automation-induced complacency, economic rationality and the use of automation, the potential utility of false alarms, the influence of different types of false alarms on trust and reliance, and a system-wide theory of trust in multiple automated aids.
A proposed psychological model of driving automation
This paper considers psychological variables pertinent to driving automation. It is anticipated that driving with automated systems is likely to have a major impact on drivers, and a multiplicity of factors needs to be taken into account. A systems analysis of the driver, vehicle and automation served as the basis for eliciting psychological factors. The main variables considered were: feedback, locus of control, mental workload, driver stress, situational awareness and mental representations. Anticipating the effects of vehicle automation on the driver could lead to improved design strategies. Based on research evidence in the literature, the psychological factors were assembled into a model for further investigation.
Human-automation collaboration in manufacturing: identifying key implementation factors
Human-automation collaboration refers to the concept of human operators and intelligent automation working together interactively within the same workspace without conventional physical separation. This concept has commanded significant attention in manufacturing because of its potential applications, such as the installation of large sub-assemblies. However, the key human factors relevant to human-automation collaboration have not yet been fully investigated. To maximise effective implementation and reduce development costs for future projects, these factors need to be examined. In this paper, a collection of human factors likely to influence human-automation collaboration is identified from the current literature. To test the validity of these and explore further factors associated with implementation success, different types of production processes, in terms of stage of maturity, are being explored via industrial case studies from the project's stakeholders. Data were collected through a series of semi-structured interviews with shop floor operators, engineers, system designers and management personnel.
Human Factors Standards and the Hard Human Factor Problems: Observations on Medical Usability Standards
With the increasing variety and sophistication of computer-based medical devices, and more diverse users and use environments, usability is essential, especially to ensure safety. Usability standards and guidelines play an important role. We reviewed several, focusing on the IEC 62366 and 60601 sets. It is plausible that these standards have reduced risks for patients, but we raise concerns regarding: (1) complex design trade-offs that are not addressed, (2) a focus on user interface design (e.g., making alarms audible) to the detriment of other human factors (e.g., ensuring users actually act upon alarms they hear), and (3) some definitions and scope restrictions that may create "blind spots". We highlight potential related risks, e.g. that clear directives on "easier to understand" risks, though useful, may preclude mitigating other, more "difficult" ones; but we ask to what extent these negative effects can be avoided by standards writers, given objective constraints. Our critique is motivated by current research and incident reports, and considers standards from other domains and countries. It is meant to highlight problems relevant to designers, standards committees, and human factors researchers, and to trigger discussion about the potential and limits of standards.
Risk Management in the Arctic Offshore: Wicked Problems Require New Paradigms
Recent project-management literature and high-profile disasters — the financial crisis, the BP Deepwater Horizon oil spill, and the Fukushima nuclear accident — illustrate the flaws of traditional risk models for complex projects. This research examines how various groups with interests in the Arctic offshore define risks. The findings link the wicked problem framework and the emerging paradigm of Project Management of the Second Order (PM-2). Wicked problems are problems that are unstructured, complex, irregular, interactive, adaptive, and novel. The authors synthesize literature on the topic to offer strategies for navigating wicked problems, provide new variables to deconstruct traditional risk models, and integrate objective and subjective schools of risk analysis.
ATM automation: guidance on human technology integration
© Civil Aviation Authority 2016. Human interaction with technology and automation is a key area of interest to industry and safety regulators alike. In February 2014, a joint CAA/industry workshop considered perspectives on present and future implementation of advanced automated systems. The conclusion was that whilst no additional regulation was necessary, guidance material for industry and regulators was required. Development of this guidance document was completed in 2015 by a working group consisting of the CAA, UK industry, academia and industry associations (see Appendix B). This enabled a collaborative approach to be taken, and for regulatory, industry, and workforce perspectives to be collectively considered and addressed. The processes used in developing this guidance included: review of the themes identified from the February 2014 CAA/industry workshop; review of academic papers, textbooks on automation, and incidents and accidents involving automation; identification of key safety issues associated with automated systems; analysis of current and emerging ATM regulatory requirements and guidance material; and presentation of emerging findings for critical review at UK and European aviation safety conferences. In December 2015, a workshop of senior management from project partner organisations reviewed the findings and proposals. EASA were briefed on the project before its commencement, and Eurocontrol contributed through membership of the Working Group.
Advanced Techniques for Assets Maintenance Management
16th IFAC Symposium on Information Control Problems in Manufacturing (INCOM 2018), Bergamo, Italy, 11–13 June 2018. Edited by Marco Macchi, László Monostori, and Roberto Pinto. The aim of this paper is to highlight the importance of new and advanced techniques supporting decision making in different business processes for maintenance and assets management, as well as the basic need to adopt a management framework with a clear process map and the corresponding supporting IT systems. Framework processes and systems will be the key fundamental enablers for success and for continuous improvement. The suggested framework will help to define and improve business policies and work procedures for asset operation and maintenance along their life cycle. The following sections present some achievements on this focus, finally proposing possible future lines for a research agenda within this field of assets management.