
    A Fabric-based Approach for Softness Rendering

    In this chapter we describe a softness display based on the contact area spread rate (CASR) paradigm. This device uses a stretchable fabric as a substrate that can be touched by users, while the contact area is directly measured via an optical system. By varying the stretching state of the fabric, different stiffness values can be conveyed to users. We describe a first technological implementation of the display and compare its performance in rendering various levels of stiffness with that exhibited by a pneumatic CASR-based device. Psychophysical experiments are reported and discussed. Afterwards, we present a new technological implementation of the fabric-based display, with reduced dimensions and faster actuation, which enables rapid changes in the fabric stretching state. These changes are mandatory to properly track typical force/area curves of real materials. System performance in mimicking force-area curves obtained from real objects exhibits a high degree of reliability, also in eliciting overall discriminable levels of softness.
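The force-area curves central to the CASR paradigm are often summarised by a simple empirical model. As a minimal sketch (the power-law form and log-log least-squares fit are illustrative assumptions, not the chapter's exact model), one could characterise a material like this:

```python
import math

def fit_force_area(forces, areas):
    """Fit a power law A = c * F**b to force/contact-area samples.

    The CASR paradigm characterises softness by how contact area spreads
    with applied force; a softer material spreads area faster at low
    forces. This fit is done by linear regression in log-log space.
    """
    xs = [math.log(f) for f in forces]
    ys = [math.log(a) for a in areas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - b * mx)
    return c, b

# Invented sample data (force in N, contact area in mm^2).
c, b = fit_force_area([1.0, 2.0, 4.0, 8.0], [50.0, 75.0, 112.5, 168.75])
```

A display tracking a target material would then drive the fabric stretch so that the measured area at each force follows the fitted curve.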

    Robotic manipulators for single access surgery

    This thesis explores the development of cooperative robotic manipulators for enhancing surgical precision and patient outcomes in single-access surgery and, specifically, Transanal Endoscopic Microsurgery (TEM). During these procedures, surgeons manipulate a heavy set of instruments via a mechanical clamp inserted in the patient’s body through a surgical port, resulting in imprecise movements, increased patient risk, and longer operating times. Therefore, an articulated robotic manipulator with passive joints is initially introduced, featuring built-in position and force sensors in each joint and electronic joint brakes for instant lock/release capability. The articulated manipulator concept is further improved with motorised joints, evolving into an active tool holder. The joints allow the incorporation of advanced robotic capabilities such as ultra-lightweight gravity compensation and hands-on kinematic reconfiguration, which can optimise the placement of the tool holder in the operating theatre. Owing to these enhanced sensing capabilities, the application of the active robotic manipulator was further explored in conjunction with advanced image guidance approaches such as endomicroscopy. Recent advances in probe-based optical imaging, such as confocal endomicroscopy, are making inroads into clinical use. However, the challenging manipulation of imaging probes hinders their practical adoption. Therefore, a combination of the fully cooperative robotic manipulator with a high-speed scanning endomicroscopy instrument is presented, simplifying the incorporation of optical biopsy techniques into routine surgical workflows. Finally, another embodiment of a cooperative robotic manipulator is presented as an input interface to control a highly articulated robotic instrument for TEM.
This master-slave interface alleviates the drawbacks of traditional master-slave devices, e.g., using clutching mechanisms to compensate for the mismatch between slave and master workspaces, and the lack of intuitive manipulation feedback, e.g., joint limits, to the user. To address those drawbacks, a joint-space robotic manipulator is proposed that emulates the kinematic structure of the flexible robotic instrument under control.
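The gravity-compensation capability mentioned above amounts to computing, at each instant, the joint torques that exactly cancel the weight of the links. A minimal sketch for a planar two-link arm (the masses, lengths, and two-link model are illustrative assumptions, not the thesis's actual manipulator):

```python
import math

def gravity_compensation(q1, q2, m1=2.0, m2=1.5, l1=0.4, lc1=0.2, lc2=0.15, g=9.81):
    """Joint torques (N*m) cancelling gravity for a planar 2-link arm.

    q1, q2 are joint angles measured from the horizontal; m* are link
    masses (kg), l1 the first link length, lc* the centre-of-mass
    offsets (m). Applying these torques makes the arm feel weightless
    to the surgeon's hand.
    """
    tau2 = m2 * lc2 * g * math.cos(q1 + q2)
    tau1 = (m1 * lc1 + m2 * l1) * g * math.cos(q1) + tau2
    return tau1, tau2
```

With the arm pointing straight up, both torques vanish, as expected; the torques are largest with the links horizontal.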

    B:Ionic Glove: A Soft Smart Wearable Sensory Feedback Device for Upper Limb Robotic Prostheses

    Upper limb robotic prosthetic devices currently lack adequate sensory feedback, contributing to a high rejection rate. Incorporating affective sensory feedback into these devices reduces phantom limb pain and increases control and acceptance. To address the lack of sensory feedback, we present the B:Ionic glove, worn over a robotic hand, which contains sensing, computation, and actuation on board. It uses shape memory alloy (SMA) actuators integrated into an armband to gently squeeze the user's arm when pressure is sensed in novel electro-fluidic fingertip sensors and decoded through soft matter logic. We found that a circular electro-fluidic sensor cavity generated the most sensitive fingertip sensor, and considered a computational configuration to convey different information from robot to user. A user study was conducted to characterise the tactile interaction capabilities of the device. No significant difference was found between the skin sensitivity thresholds of participants' lower and upper arms. Participants found it easier to distinguish stimulation locations than strengths. Finally, we demonstrate a proof-of-concept of the complete device, illustrating how it could be used to grip an object solely from the affective tactile feedback provided by the B:Ionic glove. The B:Ionic glove is a step towards the integration of natural, soft sensory feedback into robotic prosthetic devices.
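The robot-to-user coding problem, and the finding that locations are easier to distinguish than strengths, can be sketched as a one-to-one mapping from fingertip sensors to armband SMA segments. The threshold value and the mapping below are illustrative assumptions, not the paper's soft-matter-logic implementation:

```python
def decode_grip(pressures_kpa, threshold_kpa=8.0):
    """Map fingertip sensor readings to armband SMA squeeze commands.

    Each fingertip sensor drives the SMA segment at the matching armband
    location, so contact location (not strength) carries the information.
    Returns {finger: True/False} indicating which segments contract.
    """
    return {finger: p >= threshold_kpa for finger, p in pressures_kpa.items()}

# Example: thumb and index sense contact, so only their segments squeeze.
cmd = decode_grip({"thumb": 12.0, "index": 9.5, "middle": 2.0})
```

Encoding strength as well would require graded SMA activation, which the user study suggests is harder for wearers to resolve.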

    ESPRESS.0: Eustachian Tube-Inspired Tactile Sensor Exploiting Pneumatics for Range Extension and SenSitivity Tuning

    Optimising the sensitivity of a tactile sensor to a specific range of stimulus magnitudes usually compromises the sensor’s widespread usage. This paper presents a novel soft tactile sensor capable of dynamically tuning its stiffness for enhanced sensitivity across a range of applied forces, taking inspiration from the Eustachian tube in the mammalian ear. The sensor exploits an adjustable pneumatic back pressure to control the effective stiffness of its 20 mm diameter elastomer interface. An internally translocated fluid is coupled to the membrane and optically tracked to measure physical interactions at the interface. The sensor can be actuated by pneumatic pressure to dynamically adjust its stiffness. It is demonstrated to detect forces as small as 0.012 N, and to be sensitive to a difference of 0.006 N in the force range of 35 to 40 N. The sensor is demonstrated to be capable of detecting tactile cues on the surface of objects at the sub-millimetre scale. It is able to adapt its compliance to increase its ability to distinguish between stimuli with similar stiffnesses (0.181 N/mm difference) over a large range (0.1 to 1.1 N/mm) from only a 0.6 mm deep palpation. The sensor is intended to interact comfortably with skin, and the feasibility of its use in palpating tissue in search of hard inclusions is demonstrated by locating and estimating the size of a synthetic hard node embedded 20 mm deep in a soft silicone sample. The results suggest that the sensor is a good candidate for tactile tasks involving unpredictable or unknown stimuli.
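The range-extension idea can be sketched as choosing a back pressure that places the expected stimulus force at a membrane deflection the optical tracker resolves well. The linear stiffness model and all constants below are illustrative assumptions, not the paper's calibration:

```python
def choose_back_pressure(expected_force_n, target_defl_mm=2.0,
                         k0=1.0, alpha=0.05, p_min=0.0, p_max=60.0):
    """Pick a pneumatic back pressure (kPa) for a target membrane deflection.

    Assumes the effective interface stiffness grows linearly with back
    pressure, k_eff = k0 + alpha * p (N/mm). Solving
    expected_force / k_eff = target_defl for p gives the pressure that
    keeps the deflection in the optically trackable range; the result is
    clamped to the valve's limits.
    """
    k_needed = expected_force_n / target_defl_mm   # required stiffness, N/mm
    p = (k_needed - k0) / alpha                    # invert the linear model
    return min(max(p, p_min), p_max)
```

Light expected forces yield a low (soft, sensitive) setting; heavy ones stiffen the interface so the membrane does not saturate.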

    Haptic perception in virtual reality in sighted and blind individuals

    The incorporation of the sense of touch into virtual reality is an exciting development. However, research into this topic is in its infancy. This experimental programme investigated both the perception of virtual object attributes by touch and the parameters that influence touch perception in virtual reality with a force feedback device called the PHANTOM™ (www.sensable.com). The thesis had three main foci. Firstly, it aimed to provide an experimental account of the perception of the attributes of roughness, size, and angular extent by touch via the PHANTOM™ device. Secondly, it aimed to contribute to the resolution of a number of other issues important in developing an understanding of the parameters that exert an influence on touch in virtual reality. Finally, it aimed to compare touch in virtual reality between sighted and blind individuals. This thesis comprises six experiments. Experiment one examined the perception of the roughness of virtual textures with the PHANTOM™ device. The effect of the following factors was addressed: the groove width of the textured stimuli; the endpoint used (stylus or thimble) with the PHANTOM™; the specific device used (PHANTOM™ vs. IE3000); and the visual status (sighted or blind) of the participants. Experiment two extended the findings of experiment one by addressing the impact of an exploration-related factor on perceived roughness, that of the contact force an individual applies to a virtual texture. The interaction between this variable and the factors of groove width, endpoint, and visual status was also addressed. Experiment three examined the perception of the size and angular extent of virtual 3-D objects via the PHANTOM™. With respect to the perception of virtual object size, the effect of the following factors was addressed: the size of the object (2.7, 3.6, 4.5 cm); the type of virtual object (cube vs.
sphere); the mode in which the virtual objects were presented; the endpoint used with the PHANTOM™; and the visual status of the participants. With respect to the perception of virtual object angular extent, the effect of the following factors was addressed: the angular extent of the object (18, 41, and 64°); the endpoint used with the PHANTOM™; and the visual status of the participants. Experiment four examined the perception of the size and angular extent of real counterparts to the virtual 3-D objects used in experiment three. Experiment four manipulated the conditions under which participants examined the real objects. Participants were asked to give judgements of object size and angular extent via the deactivated PHANTOM™, a stylus probe, a bare index finger, and without any constraints on their exploration. In addition to the above exploration-type factor, experiment four examined the impact of the same factors on perceived size and angular extent in the real world as had been examined in virtual reality. Experiments five and six examined the consistency of the perception of linear extent across the 3-D axes in virtual space. Both experiments manipulated the following factors: line extent (2.7, 3.6, and 4.5 cm); line dimension (x, y, and z axes); movement type (active vs. passive movement); and visual status. Experiment six additionally manipulated the direction of movement within the 3-D axes. Perceived roughness was assessed by the method of magnitude estimation. The perceived size and angular extent of the various virtual stimuli and their real counterparts were assessed by the method of magnitude reproduction. This technique was also used to assess perceived extent across the 3-D axes. Touch perception via the PHANTOM™ was found to be broadly similar for sighted and blind participants. Touch perception in virtual reality was also found to be broadly similar between two different 3-D force feedback devices (the PHANTOM™ and the IE3000).
However, the endpoint used with the PHANTOM™ device was found to exert significant, but inconsistent, effects on the perception of virtual object attributes. Touch perception with the PHANTOM™ across the 3-D axes was found to be anisotropic in a similar way to the real world, with the illusion that radial extents were perceived as longer than equivalent tangential extents. The perception of 3-D object size and angular extent was found to be comparable between virtual reality and the real world, particularly under conditions where the participants' exploration of the real objects was constrained to a single point of contact. An intriguing touch illusion, whereby virtual objects explored from the inside were perceived to be larger than the same objects perceived from the outside, was found to occur widely in virtual reality, in addition to the real world. This thesis contributes to knowledge of touch perception in virtual reality. The findings have interesting implications for theories of touch perception, both virtual and real.
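The magnitude-reproduction method lends itself to a simple per-axis analysis: compare reproduced extents to the physical ones, so the radial-tangential anisotropy shows up as a reproduction ratio above 1 for radial movements. A minimal sketch with invented numbers (not the thesis's data):

```python
def reproduction_stats(physical_cm, reproduced_cm):
    """Constant error and reproduction ratio for magnitude-reproduction data.

    constant_error is the mean signed difference (reproduced - physical);
    ratio is total reproduced extent over total physical extent, so
    ratio > 1 means extents along that axis were overestimated.
    """
    n = len(physical_cm)
    errors = [r - p for p, r in zip(physical_cm, reproduced_cm)]
    constant_error = sum(errors) / n
    ratio = sum(reproduced_cm) / sum(physical_cm)
    return constant_error, ratio

# Invented radial-axis data for the 2.7, 3.6, 4.5 cm extents.
ce, ratio = reproduction_stats([2.7, 3.6, 4.5], [3.0, 4.0, 5.0])
```

Comparing ratios between radial and tangential axes would quantify the anisotropy the thesis reports.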

    Haptics Rendering and Applications

    There has been significant progress in haptic technologies, but the incorporation of haptics into virtual environments is still in its infancy. A wide range of human activities, including communication, education, art, entertainment, commerce, and science, would change forever if we learned how to capture, manipulate, and reproduce haptic sensory stimuli that are nearly indistinguishable from reality. For the field to move forward, many commercial and technological barriers need to be overcome. By rendering how objects feel through haptic technology, we communicate information that might reflect a desire to speak a physically based language that has never been explored before. Thanks to constant improvement in haptic technology and increasing levels of research into and development of haptics-related algorithms, protocols, and devices, there is a belief that haptics technology has a promising future.

    Feasibility and performance analysis in 3D printing of artworks using laser scanning microprofilometry

    We investigated optical scanning microprofilometry and conoscopic holography sensors as nondestructive testing and evaluation tools in archeology for obtaining an accurate 3D-printed reproduction of the data. The modular microprofilometer prototype allows versatile acquisition of different materials and shapes, producing a high-quality dataset that enables surface modelling at micrometric scales, from which a "scientific" replica can be obtained through 3D printing technologies. As an exemplar case study, an archeological amphora was acquired and 3D printed. In order to test the feasibility and performance of the whole process chain, from acquisition to reproduction, we propose a statistical multiscale analysis of the surface signal of the object and replica based on metrological parameters. This approach allows us to demonstrate that the accuracy of the 3D printing process preserves the range of spatial wavelengths that characterizes the surface features of interest within the technology's capabilities. This work extends the usefulness of replicas from museum exhibition to scientific applications.
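The metrological comparison of object and replica typically rests on areal surface texture parameters such as Sa and Sq. A minimal sketch of those two parameters (a real multiscale analysis would first band-pass filter the height data per spatial-wavelength band, which is omitted here):

```python
import math

def areal_roughness(heights_um):
    """Compute Sa and Sq texture parameters from surface height samples (um).

    Sa is the arithmetic mean of absolute deviations from the mean plane;
    Sq is the root-mean-square deviation. Comparing these between the
    scanned object and its 3D-printed replica, band by band, shows which
    spatial wavelengths the printing process preserved.
    """
    n = len(heights_um)
    mean = sum(heights_um) / n            # mean-plane (here: mean-line) removal
    dev = [h - mean for h in heights_um]
    sa = sum(abs(d) for d in dev) / n
    sq = math.sqrt(sum(d * d for d in dev) / n)
    return sa, sq
```

Sq weights large deviations more heavily than Sa, so a replica that matches Sa but not Sq likely smooths out the sharpest surface features.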

    Characterising a novel interface for event-based haptic grasping


    Designing multi-sensory displays for abstract data

    The rapid increase in available information has led to many attempts to automatically locate patterns in large, abstract, multi-attributed information spaces. These techniques are often called data mining and have met with varying degrees of success. An alternative approach to automatic pattern detection is to keep the user in the exploration loop by developing displays for perceptual data mining. This approach allows a domain expert to search the data for useful relationships and can be effective when automated rules are hard to define. However, designing models of the abstract data and defining appropriate displays are critical tasks in building a useful system. Designing displays of abstract data is especially difficult when multi-sensory interaction is considered. New technology, such as Virtual Environments, enables such multi-sensory interaction. For example, interfaces can be designed that immerse the user in a 3D space and provide visual, auditory and haptic (tactile) feedback. It has been a goal of Virtual Environments to use multi-sensory interaction in an attempt to increase the human-to-computer bandwidth. This approach may assist the user to understand large information spaces and find patterns in them. However, while the motivation is simple enough, actually designing appropriate mappings between the abstract information and the human sensory channels is quite difficult. Designing intuitive multi-sensory displays of abstract data is complex and needs to carefully consider human perceptual capabilities, yet we interact with the real world everyday in a multi-sensory way. Metaphors can describe mappings between the natural world and an abstract information space. This thesis develops a division of the multi-sensory design space called the MS-Taxonomy. The MS-Taxonomy provides a concept map of the design space based on temporal, spatial and direct metaphors. The detailed concepts within the taxonomy allow for discussion of low level design issues.
Furthermore, the concepts abstract to higher levels, allowing general design issues to be compared and discussed across the different senses. The MS-Taxonomy provides a categorisation of multi-sensory design options. However, to design effective multi-sensory displays requires more than a thorough understanding of design options. It is also useful to have guidelines to follow, and a process to describe the design steps. This thesis uses the structure of the MS-Taxonomy to develop the MS-Guidelines and the MS-Process. The MS-Guidelines capture design recommendations and the problems associated with different design choices. The MS-Process integrates the MS-Guidelines into a methodology for developing and evaluating multi-sensory displays. A detailed case study is used to validate the MS-Taxonomy, the MS-Guidelines and the MS-Process. The case study explores the design of multi-sensory displays within a domain where users wish to explore abstract data for patterns. This area is called Technical Analysis and involves the interpretation of patterns in stock market data. Following the MS-Process and using the MS-Guidelines, some new multi-sensory displays are designed for pattern detection in stock market data. The outcome from the case study includes some novel haptic-visual and auditory-visual designs that are prototyped and evaluated.
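The core mapping problem the MS-Taxonomy organises, routing each abstract data attribute to a sensory channel, can be sketched as a declarative table. The attribute names and routing rules below are invented for illustration and are not the thesis's designs:

```python
def build_display_mapping(attributes):
    """Route abstract data attributes to sensory channels.

    A toy stock-market example: a spatial attribute goes to vision, a
    temporal one to audition, and a direct/material one to haptics,
    echoing the taxonomy's spatial/temporal/direct metaphor split.
    Unlisted attributes default to the visual channel.
    """
    channel_for = {
        "trend": "visual",        # spatial metaphor: line rising in 3D space
        "volume": "auditory",     # temporal metaphor: pitch over time
        "volatility": "haptic",   # direct metaphor: surface roughness
    }
    return {a: channel_for.get(a, "visual") for a in attributes}

mapping = build_display_mapping(["trend", "volatility", "volume"])
```

Making such a table explicit is what lets the MS-Guidelines flag conflicts, e.g. two attributes competing for the same channel.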

    Investigation of the use of meshfree methods for haptic thermal management of design and simulation of MEMS

    This thesis presents a novel approach to using haptic sensing technology combined with a virtual environment (VE) for the thermal management of Micro-Electro-Mechanical-Systems (MEMS) design. The goal is to reduce the development cycle by avoiding the costly iterative prototyping procedure. In this regard, we use haptic feedback with virtual prototyping along with an immersive environment. We also aim to improve the productivity and capability of the designer to better grasp the phenomena operating at the micro-scale level, as well as to augment computational steering through haptic channels. To validate the concept of haptic thermal management, we have implemented a demonstrator with a user-friendly interface which allows the user to intuitively "feel" the temperature field through our concept of haptic texturing. The temperature field in a simple MEMS component is modeled using the finite element method (FEM) or the finite difference method (FDM), and the user is able to feel thermal expansion using a combination of different haptic feedback. In haptic applications, the force rendering loop needs to be updated at a frequency of 1 kHz in order to maintain continuity in the user's perception. When using FEM or FDM for our three-dimensional model, the computational cost increases rapidly as the mesh size is reduced to ensure accuracy. Hence, it constrains the complexity of the physical model used to approximate the temperature or stress field solution. It would also be difficult to generate or refine the mesh in real time for the CAD process. In order to circumvent the limitations of conventional mesh-based techniques and to avoid the bothersome task of generating and refining the mesh, we investigate the potential of meshfree methods in the context of our haptic application. We review and compare the different meshfree formulations against the FEM mesh-based technique. We have implemented the different methods for benchmarking thermal conduction and elasticity problems.
The main work of this thesis is to determine the relevance of the meshfree option in terms of flexibility of design and computational cost for the haptic physical model.
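The appeal of meshfree methods for a 1 kHz force loop can be sketched with one of the simplest meshfree interpolants, Shepard inverse-distance weighting: the field is evaluated directly from scattered nodes, with no mesh to generate or refine. This is an illustrative stand-in; the thesis compares more elaborate meshfree formulations:

```python
def idw_temperature(query, nodes, temps, power=2.0):
    """Meshfree temperature lookup by Shepard inverse-distance weighting.

    query is an (x, y, z) probe position, nodes a list of scattered node
    positions, temps the nodal temperatures. Each node contributes with
    weight 1/d**power; a query landing exactly on a node returns that
    node's value. Cost is O(n) per evaluation, cheap enough to call
    inside a 1 kHz haptic rendering loop for modest node counts.
    """
    num = den = 0.0
    for (x, y, z), t in zip(nodes, temps):
        d2 = (query[0] - x) ** 2 + (query[1] - y) ** 2 + (query[2] - z) ** 2
        if d2 == 0.0:
            return t                       # query coincides with a node
        w = d2 ** (-power / 2.0)
        num += w * t
        den += w
    return num / den
```

Adding or moving nodes during a CAD edit requires no remeshing step, which is exactly the flexibility the thesis investigates.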