
    Gloved Human-Machine Interface

    Certain exemplary embodiments can provide a system, machine, device, manufacture, circuit, composition of matter, and/or user interface adapted for and/or resulting from, and/or a method and/or machine-readable medium comprising machine-implementable instructions for, activities that can comprise and/or relate to: tracking movement of a gloved hand of a human; interpreting a gloved finger movement of the human; and/or, in response to interpreting the gloved finger movement, providing feedback to the human.
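    The pipeline the patent abstract describes (track the gloved hand, interpret a finger movement, respond with feedback) can be sketched as follows. This is a minimal illustration, not the patented method: the sensor layout, gesture names, thresholds, and feedback events are all assumptions.

```python
# Hypothetical glove pipeline: five flex-sensor angles (thumb..pinky) are
# interpreted as a coarse gesture, and a feedback event is chosen for it.

BEND_THRESHOLD = 60.0  # degrees; flexion beyond this counts as "curled" (assumed)

def interpret_gesture(flex_angles):
    """Map five flex-sensor angles (thumb..pinky) to a coarse gesture name."""
    curled = [a > BEND_THRESHOLD for a in flex_angles]
    if all(curled):
        return "fist"
    if not any(curled):
        return "open_hand"
    if curled == [True, False, True, True, True]:
        return "point"  # only the index finger is extended
    return "unknown"

def feedback_for(gesture):
    """Pick a feedback event (e.g. a vibration pattern) for a gesture."""
    return {"fist": "long_buzz", "point": "short_buzz"}.get(gesture, "none")
```

In a real device the feedback event would drive a haptic actuator in the glove rather than return a string.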

    Augmented reality device for first response scenarios

    A prototype of a wearable computer system is proposed and implemented using commercial off-the-shelf components. The system is designed to allow the user to access location-specific information about an environment, and to provide capability for user tracking. Areas of applicability primarily include first response scenarios, with possible applications in maintenance or construction of buildings and other structures. Necessary preparation of the target environment prior to the system's deployment is limited to noninvasive labeling using optical fiducial markers. The system relies on computational vision methods for registration of labels and user position. With the system, the user has access to on-demand information relevant to a particular real-world location. Team collaboration is assisted by user tracking and real-time visualizations of team member positions within the environment. The user interface and display methods are inspired by Augmented Reality (AR) techniques, incorporating a video see-through Head Mounted Display (HMD) and a finger-bending sensor glove.
    Note: Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data. At present, most AR research is concerned with the use of live video imagery which is digitally processed and augmented by the addition of computer-generated graphics. Advanced research includes the use of motion-tracking data, fiducial marker recognition using machine vision, and the construction of controlled environments containing any number of sensors and actuators. (Source: Wikipedia)
    This dissertation is a compound document (it contains both a paper copy and a CD). The CD requires Adobe Acrobat, Microsoft Office, and Windows Media Player or RealPlayer.
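    The core localization idea above — fiducial markers at known positions in a prepared environment — reduces to simple map arithmetic once the vision system has detected a marker and estimated the camera's offset from it. A minimal sketch, with marker IDs and coordinates made up for illustration:

```python
# Marker-based localization sketch: each optical fiducial has a known
# position in the building map, so detecting a marker plus the camera's
# offset from it yields the user's position. All values are illustrative.

MARKER_MAP = {
    7:  (12.0, 3.0),   # hypothetical marker 7 at a corridor junction (metres)
    12: (25.5, 3.0),   # hypothetical marker 12 near a stairwell
}

def locate_user(marker_id, offset_xy):
    """User position = known marker position + camera-estimated offset."""
    mx, my = MARKER_MAP[marker_id]
    ox, oy = offset_xy
    return (mx + ox, my + oy)
```

Team-member positions computed this way can then be broadcast for the real-time team visualization the abstract mentions.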

    Servomotor-Linked Articulated Versatile End Effector (SLAVE2)

    A strategy is presented for the design and construction of a large master/slave-controlled, five-finger robotic hand. Each of the five fingers will possess four independent axes, each driven by a brushless DC servomotor, and thus four degrees-of-freedom. It is proposed that commercially available components be utilized as much as possible to fabricate a working laboratory model of the device, with an anticipated overall length of approximately three feet (0.9 m). The fingers are to be designed to accommodate proximity, tactile, or force/torque sensors embedded in their structure. To provide for simultaneous control of the finger axes, the operator wears a specially instrumented glove which produces control signals corresponding to the finger configuration and which is capable of conveying sensor feedback signals to the operator. Two dexterous hand master devices are currently commercially available for this application, with both undergoing continuing development.
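    The master/slave mapping described above — glove joint angles driving twenty servomotor axes (5 fingers × 4 joints) — can be sketched as a clamp-and-scale per joint. The joint range and servo command range below are assumptions for illustration, not SLAVE2 specifications:

```python
# Illustrative glove-to-servo mapping for a 5-finger, 4-axis-per-finger hand.

JOINT_RANGE = (0.0, 90.0)     # assumed mechanical joint limits, degrees
COMMAND_RANGE = (1000, 2000)  # assumed servo pulse width, microseconds

def joint_to_command(angle):
    """Clamp a glove joint angle to the slave range and scale to a servo command."""
    lo, hi = JOINT_RANGE
    clamped = max(lo, min(hi, angle))
    c_lo, c_hi = COMMAND_RANGE
    return int(c_lo + (clamped - lo) / (hi - lo) * (c_hi - c_lo))

def hand_commands(glove_angles):
    """glove_angles: 5 fingers x 4 joint angles -> 5 x 4 servo commands."""
    return [[joint_to_command(a) for a in finger] for finger in glove_angles]
```

Clamping on the master side keeps an out-of-range glove reading from commanding the slave past its mechanical limits.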

    A Human-in-the-Loop Cyber-Physical System for Collaborative Assembly in Smart Manufacturing

    Industry 4.0 rose with the introduction of cyber-physical systems (CPS) and the Internet of Things (IoT) inside manufacturing systems. CPS represent self-controlled physical processes with tight networking capabilities and efficient interfaces for human interaction. The interactive dimension of CPS reaches its maximum when defined in terms of natural human-machine interfaces (NHMI), i.e., those reducing the technological barriers required for the interaction. This paper presents an NHMI that brings human decision-making capabilities inside the cybernetic control loop of a smart manufacturing assembly system. The interface allows the operator to control, coordinate, and cooperate with an industrial cobot during task execution.
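    Keeping the human decision inside the control loop, as described above, amounts to gating each cobot task on an operator decision. A hypothetical sketch of one such gated control step (task names and decision vocabulary are invented for illustration):

```python
# Human-in-the-loop gating sketch: the cobot executes the next assembly
# task only when the operator approves it; otherwise it waits or skips.

def next_action(task_queue, human_decision):
    """Return (cobot action, remaining task queue) for one control step."""
    if not task_queue:
        return ("idle", task_queue)
    if human_decision == "approve":
        task, *rest = task_queue
        return (f"execute:{task}", rest)
    if human_decision == "skip":
        _, *rest = task_queue
        return ("await_decision", rest)
    return ("await_decision", task_queue)  # hold until the operator decides
```

In the paper's setting the decision would come from the natural human-machine interface rather than a string argument.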

    The "Federica" hand: a simple, very efficient prosthesis

    Hand prostheses partially restore hand appearance and functionalities. Not everyone can afford expensive prostheses and many low-cost prostheses have been proposed. In particular, 3D printers have provided great opportunities by simplifying the manufacturing process and reducing costs. Generally, active prostheses use multiple motors for finger movement and are controlled by electromyographic (EMG) signals. The "Federica" hand is a single-motor prosthesis, equipped with an adaptive grasp and controlled by a force-myographic signal. The "Federica" hand is 3D printed and has an anthropomorphic morphology with five fingers, each consisting of three phalanges. The movement generated by a single servomotor is transmitted to the fingers by inextensible tendons that form a closed chain; practically, no springs are used for passive hand opening. A differential mechanical system simultaneously distributes the motor force in predefined portions on each finger, regardless of their actual positions. Proportional control of hand closure is achieved by measuring the contraction of residual limb muscles by means of a force sensor, replacing the EMG. The electrical current of the servomotor is monitored to provide the user with a sensory feedback of the grip force, through a small vibration motor. A simple Arduino board was adopted as processing unit. The differential mechanism guarantees an efficient transfer of mechanical energy from the motor to the fingers and a secure grasp of any object, regardless of its shape and deformability. The force sensor, being extremely thin, can be easily embedded into the prosthesis socket and positioned on both muscles and tendons; it offers some advantages over the EMG as it does not require any electrical contact or signal processing to extract information about the muscle contraction intensity.
    The grip speed is high enough to allow the user to grab objects on the fly: from the muscle trigger to complete hand closure, "Federica" takes about half a second. The cost of the device is about 100 US$. Preliminary tests carried out on a patient with transcarpal amputation showed high performance in controlling the prosthesis after a very rapid training session. The "Federica" hand turned out to be a lightweight, low-cost and extremely efficient prosthesis. The project is intended to be open-source: all the information needed to produce the prosthesis (e.g. CAD files, circuit schematics, software) can be downloaded from a public repository, thus allowing anyone to use the "Federica" hand and customize or improve it.
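    The two control mappings the abstract describes — force-sensor contraction driving proportional closure, and servomotor current driving vibration feedback — can be sketched as simple normalizations. The calibration constants below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the "Federica" control idea: force-myographic reading
# -> proportional closure command; motor current -> vibration intensity.

FORCE_REST, FORCE_MAX = 0.5, 8.0  # assumed sensor readings at rest / full contraction
CURRENT_MAX = 1.2                 # assumed motor current (A) at maximum grip

def closure_from_force(force):
    """Map muscle-contraction force to a 0..1 hand-closure command."""
    span = (force - FORCE_REST) / (FORCE_MAX - FORCE_REST)
    return max(0.0, min(1.0, span))

def vibration_from_current(current):
    """Map servomotor current to a 0..1 vibration-motor intensity."""
    return max(0.0, min(1.0, current / CURRENT_MAX))
```

On the actual device this logic would run on the Arduino board, with the closure command driving the single servomotor and the intensity driving the small vibration motor.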

    Interaction Methods for Smart Glasses: A Survey

    Since the launch of Google Glass in 2014, smart glasses have mainly been designed to support micro-interactions. The ultimate goal for them to become an augmented reality interface has not yet been attained due to an encumbrance of controls. Augmented reality involves superimposing interactive computer graphics images onto physical objects in the real world. This survey reviews current research issues in the area of human-computer interaction for smart glasses. The survey first studies the smart glasses available in the market and afterwards investigates the interaction methods proposed in the wide body of literature. The interaction methods can be classified into hand-held, touch, and touchless input. This paper mainly focuses on the touch and touchless input. Touch input can be further divided into on-device and on-body, while touchless input can be classified into hands-free and freehand. Next, we summarize the existing research efforts and trends, in which touch and touchless input are evaluated by a total of eight interaction goals. Finally, we discuss several key design challenges and the possibility of multi-modal input for smart glasses.

    Real-Time Immersive Human-Computer Interaction Based on Tracking and Recognition of Dynamic Hand Gestures

    With the fast development and ever-growing use of computer-based technologies, human-computer interaction (HCI) plays an increasingly pivotal role. In virtual reality (VR), HCI technologies provide not only a better understanding of three-dimensional shapes and spaces, but also sensory immersion and physical interaction. With hand-based HCI being a key HCI modality for object manipulation and gesture-based communication, challenges are presented in providing users with a natural, intuitive, effortless, precise, and real-time method for HCI based on dynamic hand gestures, due to the complexity of hand postures formed by multiple joints with high degrees-of-freedom, the speed of hand movements with highly variable trajectories and rapid direction changes, and the precision required for interaction between hands and objects in the virtual world. Presented in this thesis is the design and development of a novel real-time HCI system based on a unique combination of a pair of data gloves based on fibre-optic curvature sensors to acquire finger joint angles, a hybrid tracking system based on inertia and ultrasound to capture hand position and orientation, and a stereoscopic display system to provide an immersive visual feedback. The potential and effectiveness of the proposed system is demonstrated through a number of applications, namely, hand gesture based virtual object manipulation and visualisation, hand gesture based direct sign writing, and hand gesture based finger spelling. For virtual object manipulation and visualisation, the system is shown to allow a user to select, translate, rotate, scale, release and visualise virtual objects (presented using graphics and volume data) in three-dimensional space using natural hand gestures in real-time.
For direct sign writing, the system is shown to be able to display immediately the corresponding SignWriting symbols signed by a user using three different signing sequences and a range of complex hand gestures, which consist of various combinations of hand postures (with each finger open, half-bent, closed, adducted and abducted), eight hand orientations in horizontal/vertical planes, three palm facing directions, and various hand movements (which can have eight directions in horizontal/vertical planes, and can be repetitive, straight/curved, clockwise/anti-clockwise). The development includes a special visual interface to give not only a stereoscopic view of hand gestures and movements, but also a structured visual feedback for each stage of the signing sequence. An excellent basis is therefore formed to develop a full HCI based on all human gestures by integrating the proposed system with facial expression and body posture recognition methods. Furthermore, for finger spelling, the system is shown to be able to recognise five vowels signed by two hands using the British Sign Language in real time.
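    The posture element above — each finger classified as open, half-bent, or closed from glove joint angles — can be sketched as a simple threshold classifier. This is a simplification of the thesis's fibre-optic glove data, with thresholds chosen for illustration:

```python
# Sketch of per-finger posture classification from data-glove joint angles.

def finger_state(joint_angles):
    """Classify one finger from its mean joint flexion (degrees, assumed thresholds)."""
    mean = sum(joint_angles) / len(joint_angles)
    if mean < 20.0:
        return "open"
    if mean < 60.0:
        return "half-bent"
    return "closed"

def hand_posture(fingers):
    """Reduce a hand (list of fingers, each a list of joint angles) to a posture tuple."""
    return tuple(finger_state(f) for f in fingers)
```

A real recogniser would combine such posture states with the tracked hand orientation and movement trajectory to identify a full dynamic gesture.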

    Addressing the Problem of Interaction in Fully Immersive Virtual Environments: From Raw Sensor Data to Effective Devices

    Immersion into Virtual Reality is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system with images, sound or other stimuli that provide an engrossing total environment. The use of technological devices such as stereoscopic cameras, head-mounted displays, tracking systems and haptic interfaces allows for user experiences providing a physical feeling of being in a realistic world, and the term "immersion" is a metaphoric use of the experience of submersion applied to representation, fiction or simulation. One of the main peculiarities of fully immersive virtual reality is that it enhances simple passive viewing of a virtual environment with the ability to manipulate virtual objects inside it. This thesis investigates such interfaces and metaphors for the interaction and manipulation tasks. In particular, the research activity conducted allowed the design of a thimble-like interface that can be used to recognize, in real time, the human hand's orientation and infer a simplified but effective model of the relative hand's motion and gesture. Inside the virtual environment, users equipped with the developed system will therefore be able to operate with natural hand gestures in order to interact with the scene; for example, they could perform positioning tasks by moving, rotating and resizing existing objects, or create new ones from scratch. This approach is particularly suitable when there is the need for the user to operate in a natural way, performing smooth and precise movements. Possible applications of the system to industry are immersive design, in which the user can perform Computer-Aided Design (CAD) totally immersed in a virtual environment, and operator training, in which the user can be trained on a 3D model in assembling or disassembling complex mechanical machinery, following predefined sequences.
The thesis has been organized around the following project plan:
- Collection of the relevant state of the art
- Evaluation of design choices and alternatives for the interaction hardware
- Development of the necessary embedded firmware
- Integration of the resulting devices in a complex interaction test-bed
- Development of demonstrative applications implementing the device
- Implementation of advanced haptic feedback
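    A common manipulation metaphor for thimble-like finger interfaces of the kind described above is the pinch: when the tracked thumb and index tips come close enough, the nearest virtual object is grabbed. The sketch below illustrates that metaphor under assumed names and distances; it is not the thesis's actual implementation.

```python
# Pinch-based grab sketch for a tracked two-thimble interface.
import math

PINCH_DISTANCE = 0.03  # metres; assumed thumb-index threshold for a pinch

def is_pinching(thumb_pos, index_pos):
    """True when the thumb and index thimble tips are close enough to pinch."""
    return math.dist(thumb_pos, index_pos) < PINCH_DISTANCE

def grab_object(pinch_point, objects):
    """Return the id of the virtual object closest to the pinch point."""
    return min(objects, key=lambda oid: math.dist(objects[oid], pinch_point))
```

Positioning tasks then follow naturally: while the pinch is held, the grabbed object tracks the hand's motion and orientation.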