Digital Fabrication Approaches for the Design and Development of Shape-Changing Displays
Interactive shape-changing displays enable dynamic representations of data and information through physically reconfigurable geometry. The actuated physical deformations of these displays can be utilised in a wide range of new application areas, such as dynamic landscape and topographical modelling, architectural design, physical telepresence and object manipulation. Traditionally, shape-changing displays have a high development cost in mechanical complexity, technical skills and the time/finances required for fabrication. There is still a limited number of robust shape-changing displays that go beyond one-off prototypes. Specifically, there is limited focus on low-cost/accessible design and development approaches involving digital fabrication (e.g. 3D printing). To address this challenge, this thesis presents accessible digital fabrication approaches that support the development of shape-changing displays with a range of application examples – such as physical terrain modelling and interior design artefacts. Both laser cutting and 3D printing methods have been explored to ensure generalisability and accessibility for a range of potential users. The first design-led content generation explorations show that novice users, from the general public, can successfully design and present their own application ideas using the physical animation features of the display. By engaging with domain experts in designing shape-changing content to represent data specific to their work domains, the thesis was able to demonstrate the utility of shape-changing displays beyond novel systems and to describe practical use-case scenarios and applications through rapid prototyping methods. This thesis then demonstrates new ways of designing and building shape-changing displays that go beyond current implementation examples (e.g. pin arrays and continuous surface shape-changing displays). To achieve this, the thesis demonstrates how laser cutting and 3D printing can be utilised to rapidly fabricate deformable surfaces for shape-changing displays with embedded electronics. The thesis concludes with a discussion of research implications and future directions for this work.
Real-time hybrid cutting with dynamic fluid visualization for virtual surgery
It is widely accepted that a reform in medical teaching must be made to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulations integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.
3D Printed Deformable Surfaces for Shape-Changing Displays
We use interlinked 3D printed panels to fabricate deformable surfaces that are specifically designed for shape-changing displays. Our exploration of 3D printed deformable surfaces, as a fabrication technique for shape-changing displays, shows new and diverse forms of shape output, visualizations, and interaction capabilities. This article describes our general design and fabrication approach, the impact of varying surface design parameters, and a demonstration of possible application examples. We conclude by discussing current limitations and future directions for this work.
Touch and deformation perception of soft manipulators with capacitive e-skins and deep learning
Tactile sensing in soft robots remains particularly challenging because of the coupling between contact and deformation information which the sensor is subject to during actuation and interaction with the environment. This often results in severe interference and makes disentangling tactile sensing and geometric deformation difficult. To address this problem, this paper proposes a soft capacitive e-skin with a sparse electrode distribution and deep learning for information decoupling. Our approach successfully separates tactile sensing from geometric deformation, enabling touch recognition on a soft pneumatic actuator subject to both internal (actuation) and external (manual handling) forces. Using a multi-layer perceptron, the proposed e-skin achieves 99.88% accuracy in touch recognition across a range of deformations. When complemented with prior knowledge, a transformer-based architecture effectively tracks the deformation of the soft actuator. The average distance error in positional reconstruction of the manipulator is as low as 2.905 ± 2.207 mm, even under operative conditions with different inflation states and physical contacts which lead to additional signal variations and consequently interfere with deformation tracking. These findings represent a tangible way forward in the development of e-skins that can endow soft robots with proprioception and exteroception.
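The touch-recognition step described above — an MLP classifying touch location from multi-channel capacitive readings despite actuation-induced interference — can be illustrated with a minimal NumPy sketch. Everything here is a hypothetical stand-in, not the paper's implementation: the channel count, the synthetic per-location signatures, the common-mode "inflation drift" used to mimic actuation interference, and the tiny one-hidden-layer network trained by batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 sparse capacitive channels; a touch at one of
# 3 locations shifts a characteristic subset of channels. A common-mode
# drift stands in for actuation (inflation) interference the MLP must ignore.
N_CH, N_CLASSES, N = 8, 3, 600
patterns = rng.normal(0, 1, (N_CLASSES, N_CH))     # per-location signatures
labels = rng.integers(0, N_CLASSES, N)
drift = rng.normal(0, 0.5, (N, 1))                 # actuation interference
X = patterns[labels] + drift + rng.normal(0, 0.1, (N, N_CH))
Y = np.eye(N_CLASSES)[labels]                      # one-hot targets

# One-hidden-layer MLP, softmax cross-entropy, full-batch gradient descent.
W1 = rng.normal(0, 0.3, (N_CH, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.3, (16, N_CLASSES)); b2 = np.zeros(N_CLASSES)
lr = 0.5
for _ in range(800):
    H = np.tanh(X @ W1 + b1)                       # hidden activations
    Z = H @ W2 + b2                                # class logits
    P = np.exp(Z - Z.max(1, keepdims=True)); P /= P.sum(1, keepdims=True)
    G = (P - Y) / N                                # dLoss/dZ for cross-entropy
    GH = (G @ W2.T) * (1 - H**2)                   # backprop through tanh
    W2 -= lr * H.T @ G;  b2 -= lr * G.sum(0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)

acc = (P.argmax(1) == labels).mean()
print(f"training accuracy: {acc:.3f}")
```

On this easily separable synthetic data the classifier learns to key on the relative channel pattern rather than the common-mode drift, which is the same decoupling intuition the abstract describes (the real system additionally handles external handling forces and reports 99.88% across deformations).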