18,653 research outputs found

    Human-Machine Interface for Remote Training of Robot Tasks

    Regardless of their industrial or research application, the streamlining of robot operations is limited by the proximity of experienced users to the actual hardware. Be it massive open online robotics courses, crowd-sourcing of robot task training, or remote research on massive robot farms for machine learning, the need for an apt remote human-machine interface is prevalent. This paper proposes a novel solution to the programming/training of remote robots, employing an intuitive and accurate user interface that offers the benefits of working with real robots without imposing delays and inefficiency. The system includes: a vision-based 3D hand detection and gesture recognition subsystem, a simulated digital twin of a robot as visual feedback, and the "remote" robot learning/executing trajectories using dynamic motion primitives. Our results indicate that the system is a promising solution to the problem of remote training of robot tasks. Comment: Accepted in IEEE International Conference on Imaging Systems and Techniques - IST201
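    The trajectory learning mentioned above uses dynamic motion primitives (DMPs). As a rough illustration of the general idea, not the paper's implementation, a one-dimensional discrete DMP can be integrated as below; the gain values and Gaussian basis-function layout are common textbook defaults, not taken from the paper:

```python
import numpy as np

def dmp_rollout(y0, g, weights, tau=1.0, dt=0.01, T=1.0,
                alpha=25.0, beta=6.25, alpha_s=4.0):
    """Integrate a 1-D discrete dynamic movement primitive.

    weights: basis-function weights (learned from a demonstration).
    Returns the generated trajectory as an array of positions.
    """
    n_basis = len(weights)
    centers = np.exp(-alpha_s * np.linspace(0, 1, n_basis))  # basis centres in phase space
    widths = 1.0 / np.diff(centers, append=centers[-1] * 0.5) ** 2
    y, v, s = y0, 0.0, 1.0
    traj = []
    for _ in range(int(T / dt)):
        psi = np.exp(-widths * (s - centers) ** 2)               # Gaussian activations
        f = s * (g - y0) * (psi @ weights) / (psi.sum() + 1e-10)  # forcing term
        v += dt / tau * (alpha * (beta * (g - y) - v) + f)        # transformation system
        y += dt / tau * v
        s += dt / tau * (-alpha_s * s)                            # canonical system decay
        traj.append(y)
    return np.array(traj)
```

    With zero weights the forcing term vanishes and the system reduces to a critically damped spring that converges to the goal `g`; nonzero weights shape the path towards a demonstrated trajectory.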

    Gloved Human-Machine Interface

    Certain exemplary embodiments can provide a system, machine, device, manufacture, circuit, composition of matter, and/or user interface adapted for and/or resulting from, and/or a method and/or machine-readable medium comprising machine-implementable instructions for, activities that can comprise and/or relate to: tracking movement of a gloved hand of a human; interpreting a gloved finger movement of the human; and/or, in response to interpreting the gloved finger movement, providing feedback to the human.

    Biosleeve Human-Machine Interface

    Systems and methods for sensing human muscle action and gestures in order to control machines or robotic devices are disclosed. One exemplary system employs a tight-fitting sleeve worn on a user's arm and including a plurality of electromyography (EMG) sensors and at least one inertial measurement unit (IMU). Power, signal-processing, and communications electronics may be built into the sleeve, and control data may be transmitted wirelessly to the controlled machine or robotic device.
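    To illustrate the kind of processing such a sleeve implies (the patent does not specify a classifier), here is a toy sketch that extracts two standard EMG features per channel, RMS amplitude and zero-crossing rate, and matches gestures by nearest centroid; all class names, window sizes, and channel counts are assumptions:

```python
import numpy as np

def emg_features(window):
    """Extract per-channel features from one EMG window (samples x channels)."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # signal energy
    zc = np.mean(np.diff(np.sign(window), axis=0) != 0, axis=0)  # zero-crossing rate
    return np.concatenate([rms, zc])

class NearestCentroidGestures:
    """Toy gesture classifier: one centroid in feature space per gesture."""
    def fit(self, windows, labels):
        feats = np.array([emg_features(w) for w in windows])
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, window):
        d = np.linalg.norm(self.centroids_ - emg_features(window), axis=1)
        return self.classes_[int(np.argmin(d))]
```

    A real system would add filtering, IMU fusion, and a stronger classifier; the sketch only shows the windowed feature-extraction pipeline the abstract hints at.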

    Pedestrian decision-making responses to external human-machine interface designs for autonomous vehicles

    As part of a large UK-funded autonomous vehicle project (UK Autodrive), we examined pedestrian attitudes and road-crossing intentions using a real autonomous vehicle (AV) in an indoor arena. Two conceptual external human-machine interfaces (HMIs) were presented to display the vehicle's manoeuvring intentions. Participants experienced a simulated road-crossing task to assess their interactions with the AV. Although neither HMI concept was entirely free of criticism, there were objective performance differences for a projection-based HMI concept, as well as critical subjective opinions in pedestrian responses to specific manoeuvring contexts. These provided insight into pedestrians' safety concerns towards a vehicle with which bi-directional communication with a driver is no longer possible, along with suggestions for future vehicle HMI concepts.

    Development of a Habitat Monitoring System for Simulated Mars Missions

    The developers of the Habitat Monitoring System (HMS) for the Mobile Extreme Environment Research Station (MEERS) Mission Control System (MCS) faced challenges in the design and implementation of its human-machine interface. Designing a human-machine interface for a diverse group of end users presents both technical and design challenges. By applying human-factors concepts and following the engineering process, the development team produced a human-machine interface that met the product requirements and satisfied the product owners. This presentation discusses the process the development team used to create the interface for the Habitat Monitoring System.

    iDriver - Human Machine Interface for Autonomous Cars

    Modern cars are equipped with a variety of sensors, advanced driver-assistance systems, and user interfaces. To benefit from these systems and to optimally support the driver in the monitoring and decision-making process, efficient human-machine interfaces play an important part. This paper describes the second release of iDriver, an iPad software solution developed to navigate and remote-control autonomous cars and to give access to live sensor data and useful data about the car state, such as current speed, engine state, and gear state. The software was used and evaluated in our two fully autonomous research cars, “Spirit of Berlin” and “Made in Germany”.

    The human eye as human-machine interface

    Eye tracking as an interface to operate a computer has been under research for a while, and new systems are still being developed that provide some encouragement to those bound to illnesses that incapacitate them from using any other form of interaction with a computer. Although these systems use computer-vision processing and a camera, they are usually based on head-mounted technology and are therefore considered contact-type systems. This paper describes the implementation of a human-computer interface based on a fully non-contact eye-tracking vision system, intended to allow people with tetraplegia to interface with a computer. As an assistive technology, a graphical user interface with special features was developed, including a virtual keyboard for user communication, fast access to pre-stored phrases and multimedia, and even internet browsing. The system was developed with a focus on low cost, user-friendly functionality, and user independency and autonomy. The authors would like to thank the important contributions of Mr. Abel, his wife, and Mr. Sampaio to the success of this work. This work was supported by the Automation and Robotics Laboratory of the Algoritmi Research Center at the University of Minho in Guimaraes. This work is funded by FEDER through the Operational Competitiveness Programme — COMPETE — and by national funds through the Foundation for Science and Technology — FCT — in the scope of project FCOMP-01-0124-FEDER-022674.
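    A common building block of such gaze-driven virtual keyboards is dwell-time selection: a "click" fires when the gaze stays on one spot long enough. The paper does not publish its selection logic, so the following is only a generic sketch, with the fixation radius and dwell duration as assumed parameters:

```python
import math

class DwellSelector:
    """Emit a selection when gaze stays within a small radius long enough."""

    def __init__(self, radius=30.0, dwell_s=1.0):
        self.radius, self.dwell_s = radius, dwell_s   # pixels, seconds
        self._anchor, self._start = None, None

    def update(self, x, y, t):
        """Feed one gaze sample (pixels, seconds); return the anchor point on dwell."""
        if (self._anchor is None or
                math.hypot(x - self._anchor[0], y - self._anchor[1]) > self.radius):
            self._anchor, self._start = (x, y), t     # gaze moved: restart the timer
            return None
        if t - self._start >= self.dwell_s:
            sel, self._anchor, self._start = self._anchor, None, None
            return sel                                # dwell complete: acts as a "click"
        return None
```

    In a real assistive interface the returned point would be hit-tested against the on-screen keyboard, and visual feedback (e.g. a shrinking ring) would show dwell progress.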

    Human Machine Interface Programming and Testing

    Human Machine Interface (HMI) Programming and Testing is about creating graphical displays that mimic mission-critical ground control systems, giving NASA engineers the ability to monitor the health management of these systems in real time. The Health Management System (HMS) is an online, interactive human-machine interface that monitors all Kennedy Ground Control Subsystem (KGCS) hardware in the field. The HMS is essential to NASA engineers because it allows remote control and monitoring of all the Programmable Logic Controllers (PLCs) and associated field devices. KGCS will have equipment installed at the launch pad, the Vehicle Assembly Building, the Mobile Launcher, and the Multi-Purpose Processing Facility. I am designing graphical displays to monitor and control new modules that will be integrated into the HMS. The design of each display screen closely mimics the appearance and functionality of the actual module. Many different field devices are used to monitor health management, and each device has its own unique set of health-management data, so each display must also have its own way to present this data. Once the displays are created, the RSLogix5000 application is used to write software that maps all the required data read from the hardware to the graphical display. Once this data is mapped to its corresponding display item, the graphical display and the hardware device are connected through the same network in order to test all possible scenarios and types of data the graphical display was designed to receive. Test procedures will be written to thoroughly test the displays and ensure that they work correctly before deployment to the field. Additionally, the Kennedy Ground Control Subsystem user manual will be updated to explain to NASA engineers how to use the new module displays.
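    The tag-to-display mapping described above can be pictured as a lookup from PLC tag names to display fields with units and alarm limits. The tag names, units, and limits below are purely hypothetical, and the real system implements this mapping in RSLogix5000 rather than in Python; the sketch only shows the shape of the data flow:

```python
# Hypothetical tag map: PLC tag name -> (display field, unit, (low limit, high limit)).
# These names and limits are invented for illustration only.
TAG_MAP = {
    "KGCS.PLC1.Temp": ("temperature", "degC", (0.0, 85.0)),
    "KGCS.PLC1.Press": ("pressure", "kPa", (90.0, 110.0)),
}

def render_update(tag, value):
    """Map one raw tag reading to a display update, flagging out-of-range values."""
    field, unit, (lo, hi) = TAG_MAP[tag]
    status = "OK" if lo <= value <= hi else "ALARM"
    return {"field": field, "text": f"{value:.1f} {unit}", "status": status}
```

    Driving every display item from a single table like this is what makes it practical to test each module against all the value ranges and alarm scenarios it was designed to receive.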

    Advanced technologies for Mission Control Centers

    Advanced technologies for Mission Control Centers are presented in the form of viewgraphs. The following subject areas are covered: technology needs; current technology efforts at GSFC (human-machine interface development, object-oriented software development, expert systems, knowledge-based software engineering environments, and high-performance VLSI telemetry systems); and test beds.