    Future OE Mission Command and Future OE Decision Cycles

    Enormous commercial, academic, and governmental resources are being expended to build machines that can autonomously assist humans in a variety of complex tasks (e.g., driving cars, flying aircraft, engaging targets, managing distributed operations). This post asserts that the technologies being developed and deployed by these efforts will eventually force future mission command capabilities to include the ability to detect, analyze, and react to man-machine interface deception/surprise events at all echelons of command. The need for these new and improved decision support capabilities will be driven by the challenge of creating accurate Intelligence, Surveillance, and Reconnaissance (ISR) estimates while encountering increasingly capable deception/surprise technologies. These deception technologies are appearing at every echelon of mission command and are being driven, in part, by the ongoing commercial integration of the international network of Information Technology (IT) systems with the international network of Operational Technology (OT) systems. A lesson learned from the use of the Stuxnet malware to cause Iranian centrifuges to self-destruct is that malware can be used to achieve tactical surprise of human operators. The centrifuge-control man-machine interface was exploited to deceive the human operators about the true state of the autonomous control system as the machines were being commanded to destroy themselves: the interface reported that everything was proceeding as commanded when in fact the machines were shaking themselves apart. The Iranian operators were unaware for a lengthy period that they were being deceived by their monitoring software, and they were surprised when they discovered the extent of the damage to the centrifuges. It is apparent from many recent events and results that similar outcomes are now possible at each echelon of command (individual deception outcomes at the “tip of the spear,” as well as tactical, operational, and strategic surprise outcomes). This note provides a summary of some results in achieving distributed state estimation and control of complex, networked systems. It asserts that a wide variety of distributed control systems, including national infrastructure systems and possibly military command and control systems, are subject to deliberate and inadvertent cyber and physical anomalies (failure modes), and it states the author's opinions regarding the implications of the ongoing integration of IT and OT for future Mission Command decisions and future Operational Environment (OE) state estimation.
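
    A minimal sketch of the kind of cross-check such a decision support capability might perform: compare the state reported by the man-machine interface against an independent, out-of-band sensor estimate and flag sustained divergence as a possible deception event. All names and thresholds below are illustrative assumptions, not taken from the post.

        # Hypothetical MMI deception cross-check; names and thresholds
        # are illustrative assumptions.
        from dataclasses import dataclass

        @dataclass
        class Reading:
            rpm_reported: float   # value the HMI displays to the operator
            rpm_estimated: float  # rpm inferred from an independent vibration sensor

        def flag_deception(readings, tolerance=0.05, min_consecutive=10):
            """Flag a possible interface deception event when the reported
            value diverges from the independent estimate for a sustained
            run of samples."""
            streak = 0
            for r in readings:
                divergence = (abs(r.rpm_reported - r.rpm_estimated)
                              / max(abs(r.rpm_estimated), 1e-9))
                streak = streak + 1 if divergence > tolerance else 0
                if streak >= min_consecutive:
                    return True
            return False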

    Personal Assistance using Artificial Intelligence for Computers

    Machines make life easier, so people are always keen to develop new machines and software that make life easier still. Since the invention of computers, their capability to perform various tasks has grown exponentially. Humans have developed the power of computer systems in terms of their diverse working domains, increasing speed, and decreasing size over time. The objective of the proposed work is to control the computer in an easier way, namely through voice commands. The system is based on one of the major applications of artificial intelligence: speech recognition. This software, “Personal assistance for computer using artificial intelligence,” can be used as a personal assistant by a user working on a personal computer. The software has cognitive abilities similar to those of the human brain, so that it can understand human language and think, infer, reason, and learn. It uses an Android application to take input from the user; the command given by the user is sent over Bluetooth to a MATLAB interface on the computer, where the command is processed and the action for the specific command is executed. In this simple way, work on the PC can be done through voice commands.
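
    The pipeline described (Android voice capture, Bluetooth transport, MATLAB command execution) reduces to a recognition step followed by a dispatch step. The sketch below shows only the dispatch step, in Python rather than MATLAB, with hypothetical command names and Windows actions.

        # Hypothetical command dispatcher for a voice-controlled PC assistant.
        import subprocess

        COMMANDS = {
            "open notepad": lambda: subprocess.Popen(["notepad.exe"]),
            "lock screen":  lambda: subprocess.run(
                ["rundll32.exe", "user32.dll,LockWorkStation"]),
        }

        def execute(recognized_text):
            """Run the action mapped to the recognized voice command, if any."""
            action = COMMANDS.get(recognized_text.strip().lower())
            if action is None:
                return False  # unrecognized command; prompt the user to repeat
            action()
            return True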

    Stealthy Deception Attacks Against SCADA Systems

    SCADA protocols for Industrial Control Systems (ICS) are vulnerable to network attacks such as session hijacking. Hence, research focuses on network anomaly detection based on meta-data (message sizes, timing, command sequence), or on the state values of the physical process. In this work we present a class of semantic network-based attacks against SCADA systems that are undetectable by the above-mentioned anomaly detection. After hijacking the communication channels between the Human Machine Interface (HMI) and Programmable Logic Controllers (PLCs), our attacks cause the HMI to present a fake view of the industrial process, deceiving the human operator into taking manual actions. Our most advanced attack also manipulates the messages generated by the operator's actions, reversing their semantic meaning while causing the HMI to present a view that is consistent with the attempted human actions. The attacks are totally stealthy because the message sizes and timing, the command sequences, and the data values of the ICS's state all remain legitimate. We implemented and tested several attack scenarios in the test lab of our local electric company, against a real HMI and real PLCs, separated by a commercial-grade firewall. We developed a real-time security assessment tool that can simultaneously manipulate the communication to multiple PLCs and cause the HMI to display a coherent system-wide fake view. Our tool is configured with message-manipulating rules written in an ICS Attack Markup Language (IAML) we designed, which may be of independent interest. Our semantic attacks all successfully fooled the operator and brought the system to states of blackout and possible equipment damage.
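
    The abstract does not reproduce the IAML rule syntax; the sketch below only illustrates the core property such a message-manipulating rule must preserve: a rewritten PLC-to-HMI payload keeps its original length (and is forwarded with unchanged timing), so meta-data-based anomaly detection sees nothing unusual. The offset and encoding here are assumptions.

        # Conceptual message-rewriting step: swap one 16-bit register value
        # in a hijacked PLC->HMI response while preserving message length.
        import struct

        def rewrite_register(payload: bytes, offset: int, fake_value: int) -> bytes:
            fake = struct.pack(">H", fake_value)  # big-endian 16-bit word
            return payload[:offset] + fake + payload[offset + 2:]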

    Community-of-Interest (COI) Model-Based Languages Enabling Composable Net-Centric Services

    Net-centric services shall be designed to collaborate with other services used within the supported Community of Interest (COI). This requires that such services not only be integratable on the technical level and interoperable on the implementation level, but also that they be composable in the sense that they are semantically and pragmatically consistent and able to exchange information in a consistent and unambiguous way. In order to support command and control with composable net-centric services, human-machine interoperation must be supported as well as machine-machine interoperation. This paper shows that computational linguistics techniques can support the human-machine interface by structuring human-oriented representations into machine-oriented regular expressions that implement the unambiguous data exchange between machines. Distinguishing between these two domains is essential, as some requirements are mutually exclusive. In order to get the best of both worlds, an aligned approach based on a COI model is needed. This COI model starts with the partners and their respective services and business processes, identifies the resulting infrastructure components, and derives the information exchange requirements. Model-based Data Engineering leads to the configuration of data exchange specifications between the services in the form of an artificial language comprising regular expressions for the machine-machine communication. Computational linguistics methods are applied to accept and generate human-oriented representations, which potentially extend the information exchange specifications to capture new information not represented in the system requirements. The paper presents the framework that was partially applied for homeland security applications and in support of the joint rapid scenario generation activities of US Joint Forces Command.
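
    As a hedged illustration of the aligned approach (the message format below is invented, not from the paper), a machine-oriented regular expression can both validate the machine-machine exchange and anchor a human-oriented rendering:

        # A machine-oriented regular expression paired with a human-oriented
        # rendering; the track-report format is a made-up example.
        import re

        TRACK_MSG = re.compile(
            r"^TRK;(?P<id>\d+);(?P<lat>-?\d+\.\d+);(?P<lon>-?\d+\.\d+)$")

        def to_human(message):
            m = TRACK_MSG.match(message)
            if m is None:
                raise ValueError("message violates the exchange specification")
            return f"Track {m['id']} reported at {m['lat']}, {m['lon']}."

        # to_human("TRK;42;36.85;-76.29") -> "Track 42 reported at 36.85, -76.29."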

    A Study of Human-Machine Interface (HMI) Learnability for Unmanned Aircraft Systems Command and Control

    The operation of sophisticated unmanned aircraft systems (UAS) involves complex interactions between human and machine. Unlike other areas of aviation where technological advancement has flourished to accommodate the modernization of the National Airspace System (NAS), the scientific paradigm of UAS and UAS user interface design has received little research attention, and minimal effort has been made to aggregate accurate data to assess the effectiveness of current UAS human-machine interface (HMI) representations for command and control. UAS HMI usability is a primary human factors concern as the Federal Aviation Administration (FAA) moves forward with the full-scale integration of UAS in the NAS by 2025. This study examined system learnability of an industry-standard UAS HMI, as minimal usability data exists to support the state of the art for new and innovative command and control user interface designs. This study collected data pertaining to the three classes of objective usability measures prescribed by ISO 9241-11: (1) effectiveness, (2) efficiency, and (3) satisfaction. Data collected for the dependent variables incorporated video and audio recordings, a time-stamped simulator data log, and the SUS survey instrument on forty-five participants with levels of conventional flight experience ranging from none to varying certificate levels (i.e., private pilot and commercial pilot). The results of the study suggested that individuals with a high level of conventional flight experience (i.e., a commercial pilot certificate) performed most effectively when compared to participants with low or no pilot experience. The one-way analysis of variance (ANOVA) computations for completion rates revealed statistical significance for trial three between subjects [F (2, 42) = 3.98, p = 0.02]. Post hoc t-tests using a Bonferroni correction revealed statistical significance in completion rates [t (28) = -2.92, p < 0.01] between the low pilot experience group (M = 40%, SD = .50) and the high experience group (M = 86%, SD = .39). An evaluation of error rates in parallel with the completion rates for trial three also indicated that the high pilot experience group committed fewer errors (M = 2.44, SD = 3.9) during their third iteration when compared to the low pilot experience group (M = 9.53, SD = 12.63) for the same trial iteration. Overall, the high pilot experience group (M = 86%, SD = .39) performed better than both the no pilot experience group (M = 66%, SD = .48) and the low pilot experience group (M = 40%, SD = .50) with regard to task success and the number of errors committed. Data collected using the SUS measured an overall composite SUS score (M = 67.3, SD = 21.0) for the representative HMI. The subscale scores for usability and learnability were 69.0 and 60.8, respectively. This study addressed a critical need for future research in the domain of UAS user interface designs and operator requirements as the industry experiences revolutionary growth at a very rapid rate. The deficiency in legislation to guide the scientific paradigm of UAS has generated significant discord within the industry, leaving many facets associated with the teleoperation of these systems in dire need of research attention.
    Recommendations for future work included a need to: (1) establish comprehensive guidelines and standards for airworthiness certification for the design and development of UAS and UAS HMI for command and control, (2) establish comprehensive guidelines to classify the complexity associated with UAS systems design, (3) investigate mechanisms to develop comprehensive guidelines and regulations to guide UAS operator training, (4) develop methods to optimize UAS interface design through automation integration and adaptive display technologies, and (5) adopt methods and metrics to evaluate human-machine interfaces related to UAS applications for system usability and system learnability.
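
    For readers reconstructing the analysis, the reported tests follow the standard pattern sketched below (SciPy-based, with placeholder data rather than the study's):

        # One-way ANOVA across the three experience groups, followed by a
        # Bonferroni-corrected post hoc t-test; the arrays are placeholders.
        from scipy import stats

        none_grp = [1, 0, 1, 0, 1]   # task completion per participant
        low_grp  = [0, 0, 1, 0, 0]
        high_grp = [1, 1, 1, 0, 1]

        f_stat, p_anova = stats.f_oneway(none_grp, low_grp, high_grp)

        # With three groups there are three pairwise comparisons, so the
        # Bonferroni correction multiplies each raw p-value by 3.
        t_stat, p_raw = stats.ttest_ind(low_grp, high_grp)
        p_corrected = min(p_raw * 3, 1.0)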

    An Experimental Platform for Investigating Decision and Collaboration Technologies in Time-Sensitive Mission Control Operations

    This report describes the conceptual design and detailed architecture of an experimental platform developed to support investigations of novel decision and collaboration technologies for complex, time-critical mission control operations, such as military command and control and emergency response. In particular, the experimental platform is designed to enable exploration of novel interface and interaction mechanisms to support both human-human collaboration and human-machine collaboration for mission control operations involving teams of human operators engaged in supervisory control of intelligent systems, such as unmanned aerial vehicles (UAVs). Further, the experimental platform is designed to enable both co-located and distributed collaboration among operations team members, as well as between team members and relevant mission stakeholders. To enable initial investigations of new information visualization, data fusion, and data sharing methods, the experimental platform provides a synthetic task environment for a representative collaborative time-critical mission control task scenario. This task scenario involves a UAV operations team engaged in intelligence, surveillance, and reconnaissance (ISR) activities. In the experimental task scenario, the UAV team consists of one mission commander and three operators controlling multiple, homogeneous, semi-autonomous UAVs. In order to complete its assigned missions, the UAV team must coordinate with a ground convoy, an external strike team, and a local joint surveillance and target attack radar system (JSTARS). This report details this task scenario, including the possible simulation events that can occur and the logic governing the simulation dynamics. In order to support human-in-the-loop experimentation within the synthetic task environment, the experimental platform also includes a physical laboratory designed to emulate a miniature command center. The Command Center Laboratory comprises a number of large-screen displays, multi-screen operator stations, and mobile, tablet-style devices. This report details the physical configuration and hardware components of this Command Center Laboratory. Details are also provided of the software architecture used to implement the synthetic task environment and the experimental interface technologies that facilitate user experiments in this laboratory. The report also summarizes the process of conducting an experiment on the experimental platform, including details of scenario design, hardware and software instrumentation, and participant training. Finally, the report suggests several improvements that could be made to the experimental platform based on insights gained from initial user experiments conducted in this environment.
    Prepared for Boeing, Phantom Works
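
    As a rough sketch of how such a synthetic task environment can drive scripted scenario events (the event names and timing below are invented, not taken from the report):

        # Time-ordered event queue for a scripted mission scenario.
        import heapq

        events = []
        heapq.heappush(events, (120.0, "uav2_sensor_failure"))
        heapq.heappush(events, (300.0, "convoy_requests_route_scan"))

        def run(sim_end, handlers):
            """Pop events in simulation-time order and dispatch each one
            to its scenario handler."""
            while events and events[0][0] <= sim_end:
                t, name = heapq.heappop(events)
                handlers[name](t)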

    Enhancing manual flight precision and reducing pilot workload using a new manual control augmentation system for energy angle

    With rising demands on flight precision and more complex flight trajectories, pilots' workload during manual flight is increasing. This is especially the case for thrust and spoiler control during approach and landing. The presented nxControl system enables pilots to manually control the longitudinal load factor nx instead of engine parameters and spoiler deflections. This load factor is equivalent to the total energy angle and is directly influenced by engine thrust and aerodynamic drag. The nxController complements existing control augmentation systems such as the fly-by-wire control laws of today's commercial airliners. It aims at higher precision with lower workload during manual flight. The controller input can be set and monitored through an adapted human-machine interface consisting of a thrust-lever-like inceptor and additional display elements to enhance energy awareness. This paper presents the nxControl system, with a focus on the command control system, and an evaluation study with 24 airline pilots in a research flight simulator. The task was a demanding, steep approach with required navigation performance RNP 0.1 in a mountainous area. The results show higher precision and lower workload with the nxControl system despite a minimal amount of training.
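
    The equivalence the abstract relies on follows from standard point-mass flight mechanics (this is background, not the paper's own equations): along the flight path, m*Vdot = T - D - m*g*sin(gamma), so the longitudinal load factor nx = (T - D)/(m*g) equals sin(gamma) + Vdot/g, which is the total energy angle for small flight-path angles.

        # Longitudinal load factor and total energy angle for a point mass.
        import math

        G = 9.81  # m/s^2

        def nx_from_forces(thrust, drag, mass):
            return (thrust - drag) / (mass * G)

        def total_energy_angle(gamma_rad, v_dot):
            return math.sin(gamma_rad) + v_dot / G  # equals nx for the point mass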

    Vision-based gesture recognition system for human-computer interaction

    Hand gesture recognition, being a natural way of human-computer interaction, is an area of active research in computer vision and machine learning. It is an area with many different possible applications, giving users a simpler and more natural way to communicate with robot/system interfaces without the need for extra devices. The primary goal of gesture recognition research is thus to create systems which can identify specific human gestures and use them to convey information or control devices. This work studies and implements a solution, generic enough to interpret user commands composed of a set of dynamic and static gestures, and uses that solution to build an application able to work in real-time human-computer interaction systems. The proposed solution is composed of two modules controlled by an FSM (Finite State Machine): a real-time hand tracking and feature extraction system, supported by an SVM (Support Vector Machine) model for static hand posture classification, and a set of HMMs (Hidden Markov Models) for dynamic single-stroke hand gesture recognition. The experimental results showed that the system works very reliably, being able to recognize the set of defined commands in real time. The SVM model for hand posture classification, trained with the selected hand features, achieved an accuracy of 99.2%. The proposed solution has the advantage of being computationally simple to train and use, and at the same time generic enough to allow its application in any robot/system command interface.
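
    A skeleton of the two-stage recognizer described above: a finite state machine routes per-frame hand features to the static-posture SVM while idle, and accumulates a trajectory for HMM scoring once a stroke begins. Training and feature extraction are omitted, and the class and method names are assumptions.

        # Skeleton FSM combining a static-posture SVM with per-gesture HMMs.
        from enum import Enum

        class State(Enum):
            IDLE = 0      # classify static postures frame by frame
            TRACKING = 1  # accumulate a dynamic single-stroke trajectory

        class GestureFSM:
            def __init__(self, svm, hmms):
                self.svm = svm     # trained classifier, e.g. sklearn.svm.SVC
                self.hmms = hmms   # dict: gesture name -> trained HMM
                self.state = State.IDLE
                self.trajectory = []

            def step(self, features, stroke_active):
                if self.state is State.IDLE:
                    if not stroke_active:
                        return self.svm.predict([features])[0]  # posture label
                    self.state, self.trajectory = State.TRACKING, [features]
                    return None
                self.trajectory.append(features)
                if stroke_active:
                    return None
                self.state = State.IDLE  # stroke ended: score the trajectory
                return max(self.hmms,
                           key=lambda g: self.hmms[g].score(self.trajectory))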

    Hierarchical Control of the ATLAS Experiment

    Control systems at High Energy Physics (HEP) experiments are becoming increasingly complex, mainly due to the size, complexity, and data volume associated with the front-end instrumentation. This is particularly visible in the ATLAS experiment at the LHC accelerator at CERN. ATLAS will be the largest particle detector ever built, the result of an international collaboration of more than 150 institutes. The experiment is composed of 9 different specialized sub-detectors that perform different tasks and have different requirements for operation. The system in charge of the safe and coherent operation of the whole experiment is called the Detector Control System (DCS). This thesis presents the integration of the ATLAS DCS into a global control tree following the natural segmentation of the experiment into sub-detectors and smaller sub-systems. The integration of the many different systems composing the DCS includes issues such as back-end organization, process model identification, fault detection, synchronization with external systems, automation of processes, and supervisory control. Distributed control modeling is applied to the widely distributed devices that coexist in ATLAS. Thus, control is achieved by means of many distributed, autonomous, and co-operative entities that are hierarchically organized and follow a finite-state machine logic. The key to the integration of these systems lies in the so-called Finite State Machine tool (FSM), which is based on two main enabling technologies: a SCADA product and the State Manager Interface (SMI++) toolkit. The SMI++ toolkit has already been used successfully in two previous HEP experiments, providing functionality such as an object-oriented language, a finite-state machine logic, an interface to develop expert systems, and a platform-independent communication protocol. This functionality is used at all levels of the experiment operation process, ranging from overall supervision down to device integration, enabling the overall sequencing and automation of the experiment. Although the experience gained in the past is an important input for the design of the detector's control hierarchy, further requirements arose due to the complexity and size of ATLAS. In total, around 200,000 channels will be supervised by the DCS, and the final control tree will be hundreds of times bigger than any of its antecedents. Thus, in order to apply a hierarchical control model to the ATLAS DCS, a common approach has been proposed to ensure homogeneity between the large-scale distributed software ensembles of sub-detectors. A standard architecture and a human interface have been defined, with emphasis on the early detection, monitoring, and diagnosis of faults based on a dynamic fault-data mechanism. This mechanism relies on two parallel communication paths that manage the faults while providing a clear description of the detector conditions. The DCS information is split and handled by different types of SMI++ objects; while one path of objects manages the operational mode of the system, the other handles any faults that arise. The proposed strategy has been validated through many different tests, with positive results in both functionality and performance. This strategy has been successfully implemented and constitutes the ATLAS standard for building the global control tree.
    During the operation of the experiment, the DCS, responsible for the detector operation, must be synchronized with the data acquisition system, which is in charge of the physics data-taking process. The interaction between both systems has so far been limited, but becomes increasingly important as the detector nears completion. A prototype implementation, ready to be used during the sub-detector integration, has achieved data reconciliation by mapping the different segments of the data acquisition system into the DCS control tree. The adopted solution allows the data acquisition control applications to command different DCS sections independently and prevents incorrect physics data taking caused by a failure in a detector part. Finally, the human-machine interface presents and controls the DCS data in the ATLAS control room. The main challenges faced during the design and development phases were how to support the operator in controlling this large system, how to maintain integration across many displays, and how to provide effective navigation. These issues have been solved by combining the functionalities provided by both the SCADA product and the FSM tool. The control hierarchy provides an intuitive structure for the organization of the many different displays that are needed for the visualization of the experiment conditions. Each node in the tree represents a workspace that contains the functional information associated with its abstraction level within the hierarchy. By means of effective navigation, any workspace of the control tree is accessible to the operator or detector expert within a common human interface layout. The interface is modular and flexible enough to be adapted to new operational scenarios, fulfil the needs of the different kinds of users, and facilitate maintenance during the detector's long lifetime of up to 20 years. The interface has been in use for several months, and the sub-detectors' control hierarchies, together with their associated displays, are currently being integrated into the common human-machine interface.
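
    A minimal sketch of the hierarchical idea described above, in which each node summarizes the worst condition found in its subtree so that device-level faults propagate up the control tree (the three-valued status and the names are simplifications of the SMI++ design, not its actual interface):

        # Hierarchical status propagation in a control tree.
        class Node:
            SEVERITY = {"OK": 0, "WARNING": 1, "ERROR": 2}

            def __init__(self, name, children=()):
                self.name = name
                self.children = list(children)
                self.own_status = "OK"  # set by the device layer on leaf nodes

            def status(self):
                """A node reports the worst status found in its subtree."""
                worst = self.own_status
                for child in self.children:
                    s = child.status()
                    if self.SEVERITY[s] > self.SEVERITY[worst]:
                        worst = s
                return worst

        # pixel = Node("Pixel"); atlas = Node("ATLAS", [pixel])
        # pixel.own_status = "ERROR"  ->  atlas.status() == "ERROR"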