
    Digital fabrication of custom interactive objects with rich materials

    As ubiquitous computing becomes reality, people interact with an increasing number of computer interfaces embedded in physical objects. Today, interaction with those objects largely relies on integrated touchscreens. In contrast, humans are capable of rich interaction with physical objects and their materials through sensory feedback and dexterous manipulation skills. However, developing physical user interfaces that offer versatile interaction and leverage these capabilities is challenging. It requires novel technologies for prototyping interfaces with custom interactivity that support the rich materials of everyday objects. Moreover, such technologies need to be accessible to empower a wide audience of researchers, makers, and users. This thesis investigates digital fabrication as a key technology to address these challenges. It contributes four novel design and fabrication approaches for interactive objects with rich materials. The contributions enable easy, accessible, and versatile design and fabrication of interactive objects with custom stretchability, input and output on complex geometries and diverse materials, tactile output on 3D object geometries, and the capability to change their shape and material properties. Together, the contributions of this thesis advance the fields of digital fabrication, rapid prototyping, and ubiquitous computing towards the larger goal of establishing interactive objects with rich materials as a new generation of physical interfaces.

    Fine-grained Haptics: Sensing and Actuating Haptic Primary Colours (force, vibration, and temperature)

    This thesis discusses the development of a multimodal, fine-grained visual-haptic system for teleoperation and robotic applications. The system comprises two complementary components: an input device, the HaptiTemp sensor (combining “Haptics” and “Temperature”), a novel thermosensitive GelSight-like sensor, and an output device, an untethered multimodal fine-grained haptic glove. The HaptiTemp sensor is a visuotactile sensor that can sense the haptic primary colours: force, vibration, and temperature. It has novel switchable UV markers that can be made visible using UV LEDs. These switchable markers are a key novelty of the HaptiTemp because they allow the analysis of tactile information from gel deformation without impairing the ability to classify or recognise images, resolving the trade-off between marker density and capturing high-resolution images with a single sensor. The HaptiTemp sensor measures vibrations by counting the number of blobs or pulses detected per unit time using a blob detection algorithm. For the first time, temperature detection was incorporated into a GelSight-like sensor, making the HaptiTemp a haptic primary colours sensor. It also achieves rapid temperature sensing, with a 643 ms response time over the 31°C to 50°C temperature range. This fast temperature response is comparable to the withdrawal reflex response in humans; to the authors' knowledge, this is the first sensor in the robotics community that can trigger a sensory impulse mimicking a human reflex. The HaptiTemp sensor can also perform simultaneous temperature sensing and image classification using a machine vision camera (the OpenMV Cam H7 Plus), a capability that has not previously been reported or demonstrated by any tactile sensor.
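The blob-counting vibration measurement described above can be sketched as follows. This is a minimal illustration of the principle (pulses counted over a capture window divided by its duration), not the thesis's actual pipeline; the frame rate and counts are invented for the example.

```python
def vibration_frequency(pulse_counts, frame_rate_hz):
    """Estimate vibration frequency from per-frame pulse (blob) counts.

    pulse_counts: blob count detected in each camera frame
    frame_rate_hz: camera frame rate

    Total pulses over the capture window divided by the window duration
    gives pulses per second (Hz).
    """
    if not pulse_counts or frame_rate_hz <= 0:
        raise ValueError("need at least one frame and a positive frame rate")
    duration_s = len(pulse_counts) / frame_rate_hz
    return sum(pulse_counts) / duration_s

# 60 frames at 30 fps (a 2 s window) containing 96 pulses -> 48 Hz
counts = [2, 1, 2, 2, 1] * 12
print(vibration_frequency(counts, 30.0))  # 48.0
```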
The HaptiTemp sensor can be used in teleoperation because it can transmit tactile analysis and image classification results over wireless communication. In tactile sensing, tactile pattern recognition, and rapid temperature response, the HaptiTemp sensor is the closest yet to human skin. To feel at a distance what the HaptiTemp sensor is touching, a corresponding output device, an untethered multimodal haptic hand wearable, was developed to actuate the haptic primary colours sensed by the HaptiTemp sensor. This wearable communicates wirelessly and provides fine-grained cutaneous feedback for feeling the edges or surfaces of the tactile images captured by the HaptiTemp sensor. It also has gradient kinesthetic force feedback that can restrict finger movements based on the force estimated by the HaptiTemp sensor: a retractable string from an ID badge holder, equipped with mini-servos that control the stiffness of the wire, is attached to each fingertip. Vibrations detected by the HaptiTemp sensor can be actuated by the tapping motion of the tactile pins or by a buzzing mini-vibration motor. A small annular Peltier device, or ThermoElectric Generator (TEG), paired with a mini-vibration motor forms thermo-vibro feedback in the palm area that can be activated by a ‘hot’ or ‘cold’ signal from the HaptiTemp sensor. The haptic primary colours can also be embedded in a VR environment and actuated by the multimodal hand wearable. A VR application was developed to demonstrate rapid tactile actuation of edges, allowing the user to feel the contours of virtual objects: collision detection scripts activate the corresponding actuator in the haptic hand wearable whenever the tactile matrix simulator or hand avatar in VR collides with a virtual object, and the TEG warms or cools depending on the virtual object touched.
Tests were conducted to explore virtual objects in 2D and 3D environments using a Leap Motion controller and a VR headset (Oculus Quest 2). Moreover, fine-grained cutaneous feedback was developed to feel the edges or surfaces of a tactile image, such as those captured by the HaptiTemp sensor, or to actuate tactile patterns on 2D or 3D virtual objects. The prototype resembles an exoskeleton glove with 16 tactile actuators (tactors) on each fingertip, 80 tactile pins in total, made from commercially available P20 Braille cells. Each tactor can be controlled individually, enabling the user to feel the edges or surfaces of images such as the high-resolution tactile images captured by the HaptiTemp sensor. The hand wearable can be used to enhance immersion in a virtual reality environment. The tactors can be actuated in a tapping manner, creating a form of vibration feedback distinct from the buzzing vibration produced by a mini-vibration motor, and the tactile pin height can be varied, creating a gradient of pressure on the fingertip. Finally, the integration of the high-resolution HaptiTemp sensor and the untethered multimodal, fine-grained haptic hand wearable is presented, forming a visuotactile system for sensing and actuating haptic primary colours. Force, vibration, and temperature sensing tests, with corresponding actuation tests, demonstrated a unified visual-haptic system. Beyond sensing and actuating haptic primary colours, touching the edges or surfaces of the tactile images captured by the HaptiTemp sensor was carried out using the fine-grained cutaneous feedback of the haptic hand wearable.
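Rendering a tactile image on a per-fingertip tactor grid, as the glove above does with its 16 pins per fingertip, amounts to downsampling the image into a small binary pin pattern. The sketch below assumes a 4x4 grid, block-mean downsampling, and an arbitrary 0.5 threshold; none of these specifics are taken from the thesis hardware.

```python
def image_to_pin_pattern(image, rows=4, cols=4, threshold=0.5):
    """Downsample a grayscale tactile image (values in [0, 1]) to a
    rows x cols binary pin pattern for a fingertip tactor array.

    A pin is raised (1) when the mean intensity of its image block
    exceeds the threshold.
    """
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    pattern = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(1 if sum(block) / len(block) > threshold else 0)
        pattern.append(row)
    return pattern

# An 8x8 image whose left half is bright raises the two left pin columns.
img = [[1.0] * 4 + [0.0] * 4 for _ in range(8)]
print(image_to_pin_pattern(img))  # [[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0]]
```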

    The evaluation of a novel haptic machining VR-based process planning system using an original process planning usability method

    This thesis provides an original contribution to knowledge by creating a new process planning system: Haptic Aided Process Planning (HAPP). The system is based on the combination of haptics and virtual reality (VR). HAPP creates a simulated machining environment in which process plans are automatically generated from real-time logging of a user's interaction. Further, through the application of a novel usability test methodology, a deeper study of how this approach compares to conventional process planning was undertaken. An abductive research approach was selected, and an iterative and incremental development methodology chosen. Three development cycles were undertaken, with evaluation studies carried out at the end of each. Each study (pre-pilot, pilot, and industrial) identified progressive refinements to both the usability of HAPP and the usability evaluation method itself. HAPP provided process planners with an environment similar to that with which they are already familiar. Visual images were used to represent tools and material, whilst a haptic interface enabled their movement and positioning by an operator in a manner comparable to their native setting. In this way an intuitive interface was developed that allowed users to plan the machining of parts consisting of features that can be machined on a pillar drill, 2œ-axis milling machine, or centre lathe. The planning activities included single or multiple set-ups, fixturing, and sequencing of cutting operations. The logged information was parsed and output to a process plan, including route sheets, operation sheets, tool lists, and costing information, in a human-readable format.
The system evaluation revealed that HAPP, from an expert planner's perspective, is perceived to be 70% more satisfying to use and 66% more efficient in completing process plans, primarily due to the reduced cognitive load; it is also more effective, producing a higher-quality output of information, and is 20% more learnable than a traditional process planning approach.
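The core transformation in HAPP, turning a time-ordered interaction log into a human-readable process plan with tool lists and costing, can be sketched roughly as below. The field names, the flat per-minute costing rate, and the output layout are illustrative assumptions, not HAPP's actual schema.

```python
from dataclasses import dataclass

@dataclass
class LoggedOperation:
    machine: str      # e.g. "pillar drill", "centre lathe"
    tool: str
    feature: str
    minutes: float

def route_sheet(log, rate_per_minute=1.5):
    """Turn a time-ordered log of machining interactions into a simple
    route sheet with a tool list and a cost estimate."""
    lines = []
    total_minutes = 0.0
    for step, op in enumerate(log, start=1):
        total_minutes += op.minutes
        lines.append(f"{step:>2}. {op.machine}: {op.feature} with {op.tool} ({op.minutes:.1f} min)")
    tools = sorted({op.tool for op in log})
    lines.append(f"Tools: {', '.join(tools)}")
    lines.append(f"Estimated cost: {total_minutes * rate_per_minute:.2f}")
    return "\n".join(lines)

plan = route_sheet([
    LoggedOperation("pillar drill", "6mm twist drill", "hole A", 2.0),
    LoggedOperation("centre lathe", "roughing tool", "turn OD", 5.0),
])
print(plan)
```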

    Haptic Interface for the Simulation of Endovascular Interventions

    Endovascular interventions are minimally invasive surgical procedures performed to diagnose and treat vascular diseases. These interventions use a combination of long, flexible instruments known as the guidewire and catheter. A popular method of developing the skills required to manipulate these instruments successfully is the use of virtual reality (VR) simulators. However, the interfaces of current VR simulators have several shortcomings due to limitations in the design of their instrument tracking and haptic feedback systems. A major challenge in developing physics-based training simulations of endovascular interventional procedures is unobtrusively accessing the central, co-axial guidewire for tracking and haptics. This work sets out to explore the state of the art, to identify and develop novel solutions to this concentric occlusion problem, and to validate a proof-of-concept prototype. The multi-port haptic interface prototype has been integrated with a 3-D virtual environment and features novel instrument tracking and haptic feedback actuation systems. The former uses an optical sensor to detect guidewire movements through a clear catheter, whereas the latter places a customised electromagnetic actuator within the catheter hub. During the proof-of-concept validation process, both systems received positive reviews. Whilst the haptic interface prototype designed in this work has met its original objectives, important aspects still need to be addressed to improve its content and face validity. With further development, the prototype has the potential to evolve into a significant improvement over the haptic interfaces that exist today.
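Optical tracking of a guidewire through a clear catheter, as described above, is conceptually similar to an optical mouse sensor: relative count deltas are integrated into insertion depth and roll. The sketch below assumes one sensor axis maps to translation and the other to rotation, with made-up calibration constants; the actual sensor layout in the thesis may differ.

```python
def integrate_guidewire_motion(sensor_deltas, counts_per_mm=100.0, counts_per_degree=10.0):
    """Accumulate (dx, dy) count deltas from an optical sensor viewing the
    guidewire through a clear catheter into insertion depth (mm) and
    roll (degrees). dx is taken as translation along the wire, dy as roll.
    """
    depth_counts = sum(dx for dx, _ in sensor_deltas)
    roll_counts = sum(dy for _, dy in sensor_deltas)
    return depth_counts / counts_per_mm, roll_counts / counts_per_degree

# Five sensor reads advancing the wire while twisting it slightly:
deltas = [(40, 2), (35, 3), (25, 0), (50, 5), (50, 0)]
print(integrate_guidewire_motion(deltas))  # (2.0, 1.0) -> 2 mm advance, 1 degree roll
```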

    Digital Fabrication Approaches for the Design and Development of Shape-Changing Displays

    Interactive shape-changing displays enable dynamic representations of data and information through physically reconfigurable geometry. The actuated physical deformations of these displays can be utilised in a wide range of new application areas, such as dynamic landscape and topographical modelling, architectural design, physical telepresence, and object manipulation. Traditionally, shape-changing displays have a high development cost in mechanical complexity, technical skills, and the time and finances required for fabrication. There is still only a limited number of robust shape-changing displays that go beyond one-off prototypes, and limited focus on low-cost, accessible design and development approaches involving digital fabrication (e.g. 3D printing). To address this challenge, this thesis presents accessible digital fabrication approaches that support the development of shape-changing displays with a range of application examples, such as physical terrain modelling and interior design artefacts. Both laser cutting and 3D printing methods have been explored to ensure generalisability and accessibility for a range of potential users. The first design-led content generation explorations show that novice users, from the general public, can successfully design and present their own application ideas using the physical animation features of the display. By engaging domain experts in designing shape-changing content to represent data specific to their work domains, the thesis demonstrates the utility of shape-changing displays beyond novel systems and describes practical use-case scenarios and applications through rapid prototyping methods. The thesis then demonstrates new ways of designing and building shape-changing displays that go beyond current implementation examples (e.g. pin arrays and continuous-surface shape-changing displays).
To achieve this, the thesis demonstrates how laser cutting and 3D printing can be utilised to rapidly fabricate deformable surfaces for shape-changing displays with embedded electronics. The thesis concludes with a discussion of research implications and future directions for this work.
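Driving a pin-array display from terrain data, as in the physical terrain modelling example above, reduces to mapping a normalised heightmap onto per-pin actuator extensions. This is a minimal sketch; the 40 mm actuator stroke is an assumed value, not one taken from the thesis hardware.

```python
def heightmap_to_pin_heights(heightmap, max_travel_mm=40.0):
    """Scale a normalised terrain heightmap (values in [0, 1]) to per-pin
    actuator extensions (mm) for a pin-array shape-changing display."""
    return [[round(h * max_travel_mm, 2) for h in row] for row in heightmap]

terrain = [[0.0, 0.5],
           [0.25, 1.0]]
print(heightmap_to_pin_heights(terrain))  # [[0.0, 20.0], [10.0, 40.0]]
```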

    HaptiTemp: A Next-Generation Thermosensitive GelSight-like Visuotactile Sensor

    This study describes the creation of a new type of compact, skin-like, silicone-based thermosensitive visuotactile sensor based on GelSight technology. The easy integration of this novel sensor into a complex visuotactile system capable of very rapid detection of temperature change (30°C/s) is unique in providing a system that parallels the withdrawal reflex of the human autonomic system to extreme heat. To the best of the authors' knowledge, this is the first sensor in the robotics community that can trigger a sensory impulse resembling the human withdrawal reflex. To attain this, we applied blue, orange, and black thermochromic pigments, with thresholds of 31°C, 43°C, and 50°C respectively, to the gel material. Each pigment becomes translucent when its temperature threshold is reached, making it possible to stack thermochromic pigments of different colours and thresholds. The pigments were air-brushed onto a low-cost, commercially available transparent silicone sponge. We used MobileNetV2 and transfer learning to simulate tactile preprocessing and recognise five different objects, achieving 97.3% tactile image classification accuracy. Our novel thermosensitive visuotactile sensor could benefit material texture analysis, telerobotics, space exploration, and medical applications.
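The stacked-pigment principle above implies a simple decoding rule: the set of pigments that are still visible (opaque) brackets the surface temperature between the highest cleared threshold and the lowest remaining one. The function below is a sketch of that logic only, not the sensor's actual vision pipeline.

```python
# Threshold stack from the abstract: each pigment turns translucent at its threshold.
THRESHOLDS_C = {"blue": 31, "orange": 43, "black": 50}

def temperature_band(visible_pigments):
    """Infer a coarse (lower, upper) temperature range in °C from which
    stacked thermochromic pigments are still visible. None means the
    bound is open (below all thresholds / above the top threshold)."""
    cleared = [name for name in THRESHOLDS_C if name not in visible_pigments]
    if not cleared:
        return (None, THRESHOLDS_C["blue"])          # all pigments opaque: below 31°C
    lower = max(THRESHOLDS_C[n] for n in cleared)    # hottest threshold already passed
    remaining = [THRESHOLDS_C[n] for n in visible_pigments]
    upper = min(remaining) if remaining else None    # next threshold not yet reached
    return (lower, upper)

# Blue has cleared but orange and black are still visible: 31-43°C band.
print(temperature_band({"orange", "black"}))  # (31, 43)
```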

    AeroVR: immersive visualization system for aerospace design and digital twinning in virtual reality

    One of today’s most promising immersive technologies is virtual reality (VR). The term is colloquially associated with headsets that transport users to a bespoke, built-for-purpose immersive 3D virtual environment. It has given rise to immersive analytics, a new field of research that aims to use immersive technologies to enhance and empower data analytics. However, in developing such a new set of tools, one has to ask whether the move from a standard hardware setup to a fully immersive 3D environment is justified, both in terms of efficiency and development costs. To this end, in this paper we present AeroVR, an immersive aerospace design environment whose objective is to aid the component aerodynamic design process by interactively visualizing performance and geometry. We decompose the design of such an environment into function structures, identify the primary and secondary tasks, present an implementation of the system, and verify the interface in terms of usability and expressiveness. We deploy AeroVR in a prototypical design study of a compressor blade for an engine.

    Introducing liquid haptics in high bandwidth human computer interfaces

    Thesis (M.S.)--Massachusetts Institute of Technology, Program in Media Arts & Sciences, 1998. Includes bibliographical references (leaves 86-91). Author: Tom White.
    • 
