
    Design of a six degree-of-freedom haptic hybrid platform manipulator

    Thesis (Master)--Izmir Institute of Technology, Mechanical Engineering, Izmir, 2010. Includes bibliographical references (leaves 97-103). Text in English; abstract in Turkish and English. xv, 115 leaves.
    The word "haptic", derived from the ancient Greek haptikos, means related to touch. As an area of robotics, haptics technology provides the sense of touch for robotic applications that involve interaction with a human operator and the environment. The sense of touch, accompanied by visual feedback, is enough to gather most of the information about a given environment. It increases the precision of teleoperation and the level of sensation in virtual reality (VR) applications by conveying physical properties of the environment such as forces, motions, and textures. Currently, haptic devices find use in many VR and teleoperation applications. The objective of this thesis is to design a novel six degree-of-freedom (DOF) haptic desktop device with a new structure that has the potential to increase precision in haptics technology. First, previously developed haptic devices and manipulator structures are reviewed. Following this, conceptual designs are formed, and a hybrid-structured haptic device is designed, manufactured, and tested. The developed haptic device's control algorithm and VR application are developed in MATLAB Simulink. Integration of the mechanism with its mechanical, electromechanical, and electronic components and the initial tests of the system are carried out, and the results are presented. Based on these results, the performance of the developed device is discussed and future work is outlined.
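
    The thesis develops its control algorithm and VR application in MATLAB Simulink, which is not reproduced here. As a loosely related illustration only, the following minimal Python sketch shows a generic virtual-wall force-rendering law of the kind desktop haptic devices commonly implement; the one-dimensional wall model, the stiffness and damping gains, and the function name are all assumptions, not the author's design.

    # Minimal illustrative sketch (not the thesis's controller): a generic
    # one-axis virtual-wall rendering law. Gains are assumed example values.
    def render_wall_force(x: float, v: float,
                          wall_pos: float = 0.0,
                          k: float = 800.0,          # assumed wall stiffness [N/m]
                          b: float = 2.5) -> float:  # assumed damping [N*s/m]
        """Return the force to command on one axis of the haptic end-effector."""
        penetration = x - wall_pos
        if penetration <= 0.0:
            return 0.0                         # free space: render no force
        return -(k * penetration + b * v)      # spring-damper reaction back toward free space

    # Example: 2 mm penetration while moving into the wall at 0.05 m/s -> about -1.7 N.
    print(render_wall_force(0.002, 0.05))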

    A Soft touch: wearable dielectric elastomer actuated multi-finger soft tactile displays

    PhD thesis. The haptic modality in human-computer interfaces is significantly underutilised compared to vision and sound. A potential reason for this is the difficulty of turning computer-generated signals into realistic sensations of touch. Moreover, wearable solutions that can be mounted onto multiple fingertips whilst still allowing free, dexterous movement of the user's hand bring an even higher level of complexity. In order to be wearable, such devices should not only be compact, lightweight, and energy efficient, but also able to render compelling tactile sensations. Current solutions are unable to meet these criteria, typically due to the actuation mechanisms employed. Aimed at addressing these needs, this work presents research into non-vibratory multi-finger wearable tactile displays, based on an improved configuration of a dielectric elastomer actuator. The described displays render forces through a soft bubble-like interface worn on the fingertip. Owing to the improved design, forces of up to 1 N can be generated in a form factor of 20 x 12 x 23 mm, at a weight of only 6 g, demonstrating a significant increase in force output and wearability over existing tactile rendering systems. Furthermore, it is shown how these compact wearable devices can be used in conjunction with low-cost commercial optical hand-tracking sensors to deliver simple yet accurate tactile interactions within virtual environments using affordable instrumentation. The whole system makes it possible for users to interact with virtually generated soft-body objects with programmable tactile properties. Through a 15-participant study, the system was validated for three distinct types of touch interaction, including palpation and pinching of virtual deformable objects. Through this investigation, it is believed that this approach could have a significant impact on virtual and augmented reality interaction for medical simulation, professional training, and improved tactile feedback in telerobotic control systems. Engineering and Physical Sciences Research Council (EPSRC) Doctoral Training Centre EP/G03723X/
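
    As an illustration of how such a display might be driven from an optical hand tracker, the sketch below maps the tracked fingertip's penetration into a virtual soft body to a force command saturated at the roughly 1 N the device is reported to produce. The linear spring model, the stiffness value, and the set_display_force driver call are hypothetical, not the thesis's actual rendering pipeline.

    # Illustrative sketch under assumed names and values, not the thesis's code.
    def fingertip_force(penetration_m: float,
                        stiffness_n_per_m: float = 400.0,   # assumed soft-body stiffness
                        max_force_n: float = 1.0) -> float:  # reported device maximum
        """Map penetration depth into a virtual soft body to a display force."""
        if penetration_m <= 0.0:
            return 0.0
        return min(stiffness_n_per_m * penetration_m, max_force_n)

    def update_fingertip(tracked_depth_m: float) -> float:
        """One control tick: convert tracked penetration depth to a force command."""
        force = fingertip_force(tracked_depth_m)
        # set_display_force(force)   # hypothetical driver call for the wearable display
        return force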

    Remotely operated telepresent robotics

    Remotely operated robots with the ability to perform specific tasks are often used in hazardous environments in place of humans to prevent injury or death. Modern remotely operated robots suffer from limited accuracy, primarily due to the lack of depth perception and unintuitive hardware controls. This research project proposes an alternative method of vision and control to increase a user's operational performance with remotely controlled robots. The Oculus Rift Development Kit 2.0 is a low-cost device originally developed for the electronic entertainment industry which allows users to experience virtual reality through a head-mounted display. This technology can be adapted to different uses and is used here primarily to provide real-world stereoscopic 3D vision for the user. Additionally, a wearable controller was trialled with the goal of allowing a robotic arm to mimic the position of the user's arm via a master/slave setup. By incorporating the stated vision and control methods, any possible improvements in accuracy and speed for users were investigated through experimentation and a conducted study. Results indicated that using the Oculus Rift for stereoscopic vision improved the user's ability to judge distances remotely but was detrimental to the user's ability to operate the robot. The research was conducted under the supervision of the University of Southern Queensland (USQ) and provides useful information for the area of remotely operated telepresent robotics.
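
    The master/slave idea described above can be illustrated with a small sketch: joint angles measured by the wearable controller are clamped to the robot arm's limits and sent as servo targets. The joint names, limits, and the send_servo_targets call are assumptions made for illustration, not the hardware interface used in the project.

    # Illustrative sketch of a master/slave arm mapping; joint names and limits are assumed.
    ASSUMED_JOINT_LIMITS = {            # degrees, per robot joint
        "shoulder_pitch": (-90.0, 90.0),
        "elbow":          (0.0, 135.0),
        "wrist_roll":     (-180.0, 180.0),
    }

    def mirror_arm(operator_angles_deg: dict) -> dict:
        """Clamp the operator's measured joint angles to the slave arm's range."""
        targets = {}
        for joint, (lo, hi) in ASSUMED_JOINT_LIMITS.items():
            angle = operator_angles_deg.get(joint, 0.0)
            targets[joint] = max(lo, min(hi, angle))
        return targets

    # send_servo_targets(mirror_arm({"shoulder_pitch": 30.0, "elbow": 150.0}))
    # -> the elbow command clamps to 135 degrees, staying inside the robot's workspace.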

    Novel Actuation Methods for High Force Haptics


    A Human-Embodied Drone for Dexterous Aerial Manipulation

    Current drones perform a wide variety of tasks in surveillance, photography, agriculture, package delivery, etc. However, these tasks are performed passively, without human interaction. Aerial manipulation shifts this paradigm and equips drones with robotic arms that allow interaction with the environment rather than simply sensing it. For example, in construction, aerial manipulation in conjunction with human interaction could allow operators to perform several tasks, such as hosing decks, drilling into surfaces, and sealing cracks via a drone. This integration with drones will henceforth be known as dexterous aerial manipulation. Our recent work integrated the worker's experience into aerial manipulation using haptic technology. The net effect was that such a system could enable the worker to leverage drones and complete tasks at the task site remotely while utilizing haptics. However, the tasks were completed within the operator's line of sight. Until now, immersive AR/VR frameworks have rarely been integrated into aerial manipulation. Yet such a framework allows the drone to embody and transport the operator's senses, actions, and presence to a remote location in real time. As a result, the operator can both physically interact with the environment and socially interact with actual workers on the worksite. This dissertation presents a human-embodied drone interface for dexterous aerial manipulation. Using VR/AR technology, the interface allows the operator to leverage their intelligence to collaboratively perform desired tasks anytime, anywhere with a drone that possesses great dexterity.

    Biomechatronics: Harmonizing Mechatronic Systems with Human Beings

    This eBook provides a comprehensive treatise on modern biomechatronic systems centred around human applications. Particular emphasis is given to exoskeleton designs for assistance and training with advanced interfaces for human-machine interaction. Some of these designs are validated with experimental results, which the reader will find very informative as building blocks for designing such systems. This eBook is ideally suited to those researching in the biomechatronics area with bio-feedback applications or those involved in high-end research on man-machine interfaces. It may also serve as a textbook for biomechatronic design at the postgraduate level.

    A proposal to improve wearables development time and performance: software and hardware approaches

    Graduate Program in Computer Science, Department of Computer Science, Institute of Exact and Biological Sciences, Universidade Federal de Ouro Preto.
    Wearable devices are a trending topic in both commercial and academic areas. Increasing demand for innovation has raised the number of research projects and products, addressing brand-new challenges and creating profitable opportunities. Current wearable devices can be employed to solve problems in a wide variety of areas. Such coverage generates a relevant number of requirements and variables that influence a solution's performance. It is common to build specific wearable versions to fit each targeted application niche, which requires time and resources. Currently, the related literature does not present ways to treat the hardware/software in a manner generic enough to allow reuse of both parts. This manuscript proposes two components, focused on hardware and software respectively, that allow the reuse of different parts of a wearable solution. A platform for wearables development is outlined as a viable way to recycle an existing organization and architecture. The platform's use was demonstrated through the creation of a wearable device enabled for use in different contexts of the mining industry. On the software side, a development and customization tool for specific operating systems is demonstrated. This tool aims not only to reuse standard software components but also to provide improved performance at the same time. A real prototype was designed and created as a means to validate the concepts. In the results, the comparison between the operating system generated by the tool and a conventional operating system allows the improvement rate to be quantified. The generated operating system showed approximate performance gains of 100% in processing tasks, 150% in memory consumption and I/O operations, and a reduction of approximately 20% in energy consumption. Finally, the performance analysis allows inferring that the proposals presented here contribute to this area, easing the development and reuse of wearable solutions as a whole.
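
    The hardware/software reuse idea can be illustrated with a short sketch: if every sensor driver implements one common interface, the application code can stay unchanged while the driver layer is swapped between wearable variants. The interface, the gas-sensor example, and the mining-oriented naming are assumptions for illustration, not the platform proposed in the manuscript.

    # Illustrative sketch of component reuse through a common driver interface.
    from abc import ABC, abstractmethod

    class Sensor(ABC):
        """Interface every hardware driver on a reusable wearable platform could implement."""

        @abstractmethod
        def read(self) -> float:
            """Return one calibrated sample in SI units."""

    class GasSensor(Sensor):
        """Hypothetical driver for a mining-context gas sensor."""
        def read(self) -> float:
            return 0.0   # placeholder: a real driver would query the ADC / I2C bus here

    def sample_all(sensors: list) -> list:
        """Application-side code that stays the same when the hardware is swapped."""
        return [s.read() for s in sensors]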

    Touching on elements for a non-invasive sensory feedback system for use in a prosthetic hand

    Hand amputation results in the loss of motor and sensory functions, impacting activities of daily life and quality of life. Commercially available prosthetic hands restore the motor function but lack sensory feedback, which is crucial for receiving real-time information about the state of the prosthesis when interacting with the external environment. As a supplement to the missing sensory feedback, the amputee needs to rely on visual and audio cues to operate the prosthetic hand, which can be mentally demanding. This thesis revolves around finding potential solutions contributing to an intuitive non-invasive sensory feedback system that could be less cognitively burdensome and enhance the sense of embodiment (the feeling that an artificial limb belongs to one's own body), increasing acceptance of wearing a prosthesis.
    A sensory feedback system contains sensors to detect signals applied to the prosthesis. The signals are encoded via signal processing to resemble the detected sensation and delivered by actuators on the skin. There is a challenge in implementing commercial sensors in a prosthetic finger: due to the prosthetic finger's curvature and the fact that some prosthetic hands use a covering rubber glove, the sensor response would be inaccurate. This thesis shows that a pneumatic touch sensor integrated into a rubber glove eliminates these errors. The sensor provides a consistent reading independent of the incident angle of the stimulus, with a sensitivity of 0.82 kPa/N, a hysteresis error of 2.39±0.17%, and a linearity error of 2.95±0.40%.
    For intuitive tactile stimulation, it has been suggested that the feedback stimulus should be modality-matched, with the intention of providing a sensation that can easily be associated with the real touch on the prosthetic hand; e.g., pressure on the prosthetic finger should produce pressure on the residual limb. A stimulus should also be spatially matched (e.g., in position, size, and shape). Electrotactile stimulation can provide various sensations because it has several adjustable parameters, which makes this type of stimulus a good candidate for the discrimination of textures. A microphone can detect texture-elicited vibrations to be processed, and by varying, e.g., the median frequency of the electrical stimulation, the signal can be presented on the skin. Participants in a study using electrotactile feedback showed a median accuracy of 85% in differentiating between four textures.
    During active exploration, electrotactile and vibrotactile feedback provide spatially matched modality stimulation, giving continuous feedback with either a displaced sensation or a sensation spread over a larger area. When commonly used stimulation modalities are evaluated using the Rubber Hand Illusion, those that resemble the intended sensation provide a more vivid illusion of ownership of the rubber hand.
    For potentially more intuitive sensory feedback, the stimulation can be somatotopically matched, where the stimulus is experienced as being applied at a site corresponding to the missing hand. This is possible for amputees who experience referred sensation on their residual stump. However, not all amputees experience referred sensations. Nonetheless, after a structured training period, it is possible to learn to associate touch with specific fingers, and this effect persisted after two weeks. The effect was evaluated on participants with intact limbs, so it remains to be evaluated for amputees.
    In conclusion, this thesis offers suggestions for sensory feedback systems that could be helpful in future prosthetic hands to (1) reduce their complexity and (2) enhance the sense of body ownership and thereby the overall sense of embodiment, as an addition to an intuitive control system.
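
    The texture pipeline sketched in the abstract (a microphone picks up texture-elicited vibrations, and the median frequency of the signal steers the electrotactile stimulation) can be illustrated as follows. The window length, frequency bounds, and the set_stimulation_frequency call are assumptions for illustration, not the thesis's actual processing chain.

    # Illustrative sketch of mapping texture vibrations to an electrotactile frequency.
    import numpy as np

    def median_frequency(window: "np.ndarray", fs: float) -> float:
        """Frequency below which half of the window's spectral power lies."""
        power = np.abs(np.fft.rfft(window)) ** 2
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        cumulative = np.cumsum(power)
        idx = np.searchsorted(cumulative, 0.5 * cumulative[-1])
        return float(freqs[idx])

    def texture_to_stimulation(window: "np.ndarray", fs: float = 8000.0,
                               f_min: float = 5.0, f_max: float = 400.0) -> float:
        """Map a short vibration window to an electrotactile pulse rate in Hz."""
        stim_hz = float(np.clip(median_frequency(window, fs), f_min, f_max))
        # set_stimulation_frequency(stim_hz)   # hypothetical stimulator driver call
        return stim_hz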