
    Contactless Haptic Display Through Magnetic Field Control

    Haptic rendering enables people to touch, perceive, and manipulate virtual objects in a virtual environment. This paper proposes and develops an untethered haptic interface based on magnetic field control, using six cascaded identical hollow-disk electromagnets and a small permanent magnet attached to an operator's finger. The concentric hole inside the six cascaded electromagnets provides the workspace, where the 3D position of the permanent magnet is tracked with a Microsoft Kinect sensor. The driving currents of the six electromagnets are computed in real time to generate the desired magnetic force. Offline data from a finite element analysis (FEA) based simulation determine the relationship between the magnetic force, the driving currents, and the position of the permanent magnet. A set of experiments, including a virtual object recognition experiment, a virtual surface identification experiment, and a user perception evaluation, was conducted to demonstrate the proposed system, with Microsoft HoloLens holographic glasses used for visual rendering. The proposed magnetic haptic display provides an untethered, non-contact interface for natural haptic rendering, overcoming the constraints of the mechanical linkages found in traditional tool-based haptic devices.
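
    The real-time current computation described above amounts to inverting a position-dependent linear map from coil currents to force. A minimal sketch, assuming a linear force model and a hypothetical `force_map` lookup standing in for the paper's offline FEA data:

```python
import numpy as np

def coil_currents(force_map, position, desired_force):
    """Solve for the six coil currents that best reproduce `desired_force`
    at `position`, assuming the magnetic force is linear in each coil's
    current for a fixed magnet position.

    `force_map(position)` returns a 3x6 matrix J whose column i is the
    force per unit current from coil i at that position (in the paper,
    this relationship is derived from offline FEA simulation data).
    """
    J = force_map(position)                            # 3x6 force/current map
    currents, *_ = np.linalg.lstsq(J, desired_force, rcond=None)
    return currents                                    # minimum-norm solution
```

    With six coils and a 3D force, the system is underdetermined, so `lstsq` returns the minimum-norm current vector; a real controller would additionally enforce current limits on each electromagnet.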

    Robotically assisted eye surgery: a haptic master console

    Vitreo-retinal surgery encompasses the surgical procedures performed on the vitreous humor and the retina. A procedure typically consists of the removal of the vitreous humor, the peeling of a membrane, and/or the repair of a retinal detachment. Operations are performed with needle-shaped instruments that enter the eye through scleral openings made by the surgeon. An instrument is moved by hand in four degrees of freedom (three rotations and one translation) through this opening: two rotations provide lateral movement of the instrument tip, while the other two DoFs are the translation (z) and rotation along the instrument axis. Actuation of, for example, a forceps can be considered a fifth DoF. Characteristically, the manipulation of delicate, micrometer-thick intraocular tissue is required. Today, eye surgery is performed with at most two instruments simultaneously. The surgeon relies on visual feedback only, since instrument forces are below the human detection limit. A microscope provides the visual feedback, which forces the surgeon to work in a static, non-ergonomic body posture. Although the surgeon's proficiency improves throughout his career, hand tremor may become a problem around his mid-fifties. Robotically assisted surgery with a master-slave system enhances dexterity. The slave, with its instrument manipulators, is placed over the eye. The surgeon controls the instrument manipulators via haptic interfaces at the master. The master and slave are connected by electronic hardware and control software. Tremor filtering in the control software and downscaling of the hand motion allow prolongation of the surgeon's career. Furthermore, tasks such as intraocular cannulation, which cannot be done in manually performed surgery, become possible. This thesis focuses on the master console. Eye surgery procedures were observed in the operating rooms of different hospitals to gain insight into the requirements for the master.
The master console as designed has an adjustable frame, a 3D display, and two haptic interfaces, each with a coarse adjustment arm. The console is mounted at the head of the operating table and is combined with the slave. It is compact, easy to place, and allows the surgeon a direct view of and physical contact with the patient. Furthermore, it fits in today's manual surgery arrangement. Each haptic interface has the same five degrees of freedom as the instrument inside the eye. Through these interfaces, the surgeon can feel the augmented instrument forces. Downscaling of the hand motion results in more accurate instrument movement than in manually performed surgery. Together with the visual feedback, it is as if the surgeon grasps the instrument near its tip inside the eye. The similarity between the hand motion and the motion of the instrument tip as seen on the display results in intuitive manipulation. Pre-adjustment of the interface is done via the coarse adjustment arm. Mode switching makes it possible to control three or more instrument manipulators with only two interfaces. Two one-degree-of-freedom master-slave systems with force feedback were built to derive the requirements for the haptic interface. Hardware-in-the-loop testing provides valuable insights and shows the possibility of force feedback without the use of force sensors. Two five-DoF haptic interfaces were realized for bimanual operation. Each DoF has a position encoder and a force feedback motor. A correct representation of the upscaled instrument forces is only possible if the disturbance forces are low. Actuators are therefore mounted to the fixed world or near the pivoting point for a low contribution to the inertia. The use of direct drive for the two lateral rotation DoFs and low-geared, backdrivable transmissions for the other three DoFs minimizes friction. Disturbance forces are further reduced by a proper cable layout and actuator-amplifier combinations without torque ripple.
The similarity in DoFs between vitreo-retinal eye surgery and minimally invasive surgery (MIS) enables the system to be used for MIS as well. Experiments with a slave robot for laparoscopic and thoracoscopic surgery show that an instrument can be manipulated in a comfortable and intuitive way. User experience from surgeons and others was used to improve the haptic interface further. A parallel instead of a serial actuation concept for the two lateral rotation DoFs reduces the inertia, eliminates the flexible cable connection between frame and motor, and allows the heat of the motor to be transferred directly to the frame. A newly designed module for the translation and rotation along the instrument axis combines the actuation and suspension of the hand-held part of the interface and has a z range three times larger than in the first design of the haptic interface.
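
    The tremor filtering and motion downscaling mentioned above can be sketched as a first-order low-pass filter on the hand position followed by a scale factor. The cutoff frequency, sample rate, and scale values here are illustrative assumptions, not parameters from the thesis:

```python
import math

class MasterSlaveFilter:
    """Toy master-slave command pipeline: a low-pass tremor filter
    followed by motion downscaling. Physiological tremor sits roughly
    around 8-12 Hz, so a cutoff of a few Hz suppresses it while
    passing deliberate hand motion."""

    def __init__(self, scale=0.1, cutoff_hz=2.0, sample_hz=1000.0):
        self.scale = scale            # hand-motion downscaling factor
        a = 2 * math.pi * cutoff_hz / sample_hz
        self.alpha = a / (a + 1)      # first-order low-pass coefficient
        self.y = 0.0                  # filter state

    def step(self, hand_pos):
        # Low-pass the hand position to attenuate high-frequency tremor,
        # then scale it down for finer instrument motion.
        self.y += self.alpha * (hand_pos - self.y)
        return self.scale * self.y
```

    Run per control tick per axis: a slow, deliberate 1 cm hand motion becomes a 1 mm instrument motion, while a 10 Hz tremor component is strongly attenuated before scaling.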

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Doctor of Philosophy

    Virtual reality is becoming a common technology with applications in fields such as medical training, product development, and entertainment. Providing haptic (sense of touch) information along with visual and audio information can create an immersive vi…

    A Soft touch: wearable dielectric elastomer actuated multi-finger soft tactile displays

    The haptic modality in human-computer interfaces is significantly underutilised compared to vision and sound. A potential reason for this is the difficulty of turning computer-generated signals into realistic sensations of touch. Moreover, wearable solutions that can be mounted onto multiple fingertips while still allowing free, dexterous movement of the user's hand bring an even higher level of complexity. To be wearable, such devices should not only be compact, lightweight, and energy efficient, but also able to render compelling tactile sensations. Current solutions are unable to meet these criteria, typically because of the actuation mechanisms employed. Aimed at addressing these needs, this work presents research into non-vibratory multi-finger wearable tactile displays, through the use of an improved configuration of a dielectric elastomer actuator. The described displays render forces through a soft, bubble-like interface worn on the fingertip. Owing to the improved design, forces of up to 1 N can be generated in a form factor of 20 x 12 x 23 mm at a weight of only 6 g, a significant increase in force output and wearability over existing tactile rendering systems. Furthermore, it is shown how these compact wearable devices can be used in conjunction with low-cost commercial optical hand-tracking sensors to enable simple yet accurate tactile interactions within virtual environments using affordable instrumentation. The whole system makes it possible for users to interact with virtually generated soft-body objects with programmable tactile properties. Through a 15-participant study, the system has been validated for three distinct types of touch interaction, including palpation and pinching of virtual deformable objects.
Through this investigation, it is believed that this approach could have a significant impact within virtual and augmented reality interaction for purposes of medical simulation, professional training, and improved tactile feedback in telerobotic control systems. Engineering and Physical Sciences Research Council (EPSRC) Doctoral Training Centre EP/G03723X/
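
    The force such a dielectric elastomer actuator can exert is commonly estimated from the standard equivalent electrostatic (Maxwell) pressure on the membrane; the voltage, thickness, and permittivity below are illustrative values, not the device's measured parameters:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def maxwell_pressure(voltage, thickness, eps_r=3.0):
    """Equivalent electrostatic pressure p = eps_r * eps0 * (V/t)**2
    squeezing a dielectric elastomer membrane of thickness t (m)
    held at voltage V (V); eps_r is the relative permittivity."""
    return eps_r * EPS0 * (voltage / thickness) ** 2

# e.g. 3 kV across a 50 um membrane yields roughly 96 kPa, which over a
# few square millimetres of fingertip contact is on the order of 0.1-1 N.
```

    The quadratic dependence on the field V/t is why DEA-based displays use thin membranes and kilovolt-range drive electronics despite drawing very little power.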

    inFORM: Dynamic Physical Affordances and Constraints through Shape and Object Actuation

    Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities. National Science Foundation (U.S.) Graduate Research Fellowship (Grant 1122374); Swedish Research Council (Fellowship); Blanceflor Foundation (Scholarship)

    Development and Evaluation of a Learning-based Model for Real-time Haptic Texture Rendering

    Current Virtual Reality (VR) environments lack the rich haptic signals that humans experience during real-life interactions, such as the sensation of texture during lateral movement on a surface. Adding realistic haptic textures to VR environments requires a model that generalizes to variations of a user's interaction and to the wide variety of existing textures in the world. Current methodologies for haptic texture rendering exist, but they usually develop one model per texture, resulting in low scalability. We present a deep learning-based action-conditional model for haptic texture rendering and evaluate its perceptual performance in rendering realistic texture vibrations through a multi-part human user study. This model is unified over all materials and uses data from a vision-based tactile sensor (GelSight) to render the appropriate surface conditioned on the user's action in real time. For rendering texture, we use a high-bandwidth vibrotactile transducer attached to a 3D Systems Touch device. The results of our user study show that our learning-based method creates high-frequency texture renderings with comparable or better quality than state-of-the-art methods, without the need to learn a separate model per texture. Furthermore, we show that the method is capable of rendering previously unseen textures using a single GelSight image of their surface. Comment: 10 pages, 8 figures.
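
    The interface of an action-conditional renderer like the one described can be sketched as a function from a per-texture representation plus the user's action (scan speed, normal force) to a vibration waveform. Everything below — the embedding, the noise carrier, the amplitude law — is a deterministic toy stand-in for the learned model, not the paper's network:

```python
import numpy as np

def render_vibration(texture_embedding, speed, force, duration_s=0.05, fs=8000):
    """Map a texture representation and the user's action to a
    vibrotactile waveform sampled at `fs` Hz. Toy model: a
    texture-specific noise carrier whose amplitude grows with
    scan speed and normal force."""
    # Derive a deterministic per-texture seed so the same texture
    # always produces the same carrier signal.
    seed = int(abs(texture_embedding.sum()) * 1e6) % (2**32)
    rng = np.random.default_rng(seed)
    n = int(duration_s * fs)
    carrier = rng.standard_normal(n)     # texture-specific "spectrum"
    amplitude = 0.01 * force * speed     # faster/harder scan -> stronger vibration
    return amplitude * carrier
```

    A real renderer would replace the noise carrier and linear amplitude law with the learned network's output, streamed in short windows to the vibrotactile transducer as the tracked tool position updates.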

    Design and Implementation of an Interactive Surface System with Controllable Shape and Softness

    The physical constraints of conventional displays, which are flat and rigid, impose various limitations when users handle data with three-dimensional shape or interact with data that carries tactile information. Moreover, viewing and modeling complex 3D shapes on a flat display requires GUI operations involving frequent viewpoint changes and intricate vertex manipulation. To address these problems, prior work has incorporated non-planar, flexible materials such as sand and clay into interactive surfaces, enabling interactions that conventional displays cannot offer; however, displays that can express different physical properties within a single device have received little attention. This work implements a variable-stiffness display based on fine particles and stiffness control through air-pressure manipulation. Stiffness control allows the surface to be deformed while soft and to retain its shape as the task requires. To explore the display's potential, a modeling application using stiffness control was developed: the user can select an appropriate stiffness for each modeling operation, and once the model is complete, the display can be hardened to preserve its shape. In addition, a depth camera enables painting by touch input, and the created model can be scanned, saved as CAD data, and even printed on a 3D printer. Although this system makes conventional modeling operations more intuitive, it cannot generate shapes on its own. This work therefore also proposes a shape-actuation method for the display based on particle transport: by generating the rough shape of a model, it lets the user freely customize the details. Like the stiffness-control technique, this method uses particles and pneumatic actuation, so it can be realized as a low-cost, simple system. 電気通信大学 (The University of Electro-Communications), 201

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications.