The Analysis of design and manufacturing tasks using haptic and immersive VR - Some case studies
The use of virtual reality in interactive design and manufacture has been researched extensively, but the practical application of this technology in industry is still very much in its infancy. This is surprising, as one would have expected that, after some 30 years of research, commercial applications of interactive design or manufacturing planning and analysis would be widespread throughout the product design domain. One of the major but less well-known advantages of VR technology is that logging the user yields a great deal of rich data which can be used to automatically generate designs or manufacturing instructions, analyse design and manufacturing tasks, map engineering processes and, tentatively, acquire expert knowledge. The authors feel that the benefits of VR in these areas have not been fully disseminated to the wider industrial community and - with the advent of cheaper PC-based VR solutions - perhaps a wider appreciation of the capabilities of this type of technology may encourage companies to adopt VR solutions for some of their product design processes. With this in mind, this paper describes in detail applications of haptics in assembly, demonstrating how user task logging can lead to the analysis of design and manufacturing tasks at a level of detail not previously possible, as well as giving usable engineering outputs. The haptic 3D VR study involves the use of a Phantom and 3D system to analyse and compare this technology against real-world user performance. This work demonstrates that the detailed logging of tasks in a virtual environment gives considerable potential for understanding how virtual tasks can be mapped onto their real-world equivalents, as well as showing how haptic process plans can be generated in a similar manner to the conduit design and assembly planning HMD VR tool reported in PART A.
The paper concludes with a view as to how the authors feel the use of VR systems in product design and manufacturing should evolve in order to enable the industrial adoption of this technology in the future.
A haptic-enabled multimodal interface for the planning of hip arthroplasty
Multimodal environments help fuse a diverse range of sensory modalities, which is particularly important when integrating the complex data involved in surgical preoperative planning. The authors apply a multimodal interface for preoperative planning of hip arthroplasty, with a user interface that integrates immersive stereo displays and haptic modalities. This article overviews this multimodal application framework and discusses the benefits of incorporating the haptic modality in this area.
Interactive Chemical Reactivity Exploration
Elucidating chemical reactivity in complex molecular assemblies of a few hundred atoms is, despite the remarkable progress in quantum chemistry, still a major challenge. Black-box search methods to find intermediates and transition-state structures might fail in such situations because of the high dimensionality of the potential energy surface. Here, we propose the concept of interactive chemical reactivity exploration to effectively introduce the chemist's intuition into the search process. We employ a haptic pointer device with force feedback to allow the operator direct manipulation of structures in three dimensions, along with simultaneous perception of the quantum mechanical response to structure modification as forces. We elaborate on the details of how such an interactive exploration should proceed and which technical difficulties need to be overcome. All reactivity-exploration concepts developed for this purpose have been implemented in the SAMSON programming environment.
Comment: 36 pages, 14 figures
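The core loop the abstract describes - the operator displaces a structure and immediately feels the molecular response as a force - can be sketched as follows. This is a minimal illustration, not the paper's implementation: a Lennard-Jones potential stands in for the quantum mechanical force computation, and the function names and scaling constants are assumptions for the sketch.

```python
import numpy as np

def lj_force(r_vec, epsilon=1.0, sigma=1.0):
    """Force on a probe atom at displacement r_vec from a fixed atom,
    from a Lennard-Jones potential (a cheap stand-in for the
    quantum-mechanical response computed in the paper)."""
    r = np.linalg.norm(r_vec)
    # V(r) = 4*eps*((sigma/r)^12 - (sigma/r)^6); force = -dV/dr along r_hat
    magnitude = 24 * epsilon * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r
    return magnitude * (r_vec / r)

def haptic_feedback_step(probe_pos, scale=0.1, clip=5.0):
    """One cycle of the feedback loop: map the molecular force to a
    device force, clipped so the command stays within the haptic
    device's output range (scale and clip are illustrative)."""
    f = scale * lj_force(probe_pos)
    norm = np.linalg.norm(f)
    if norm > clip:
        f = f * (clip / norm)
    return f
```

At the potential minimum the rendered force vanishes, and inside the repulsive wall the clipped force pushes the operator's hand back out - the two behaviours that make such an exploration feel physically plausible.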
VISIO-HAPTIC DEFORMABLE MODEL FOR HAPTIC DOMINANT PALPATION SIMULATOR
Vision and haptics are the two most important modalities in a medical simulation. While visual cues let one see one's actions when performing a medical procedure, haptic cues enable feeling the object being manipulated during the interaction. Despite their importance in computer simulation, the combination of the two modalities has not been adequately assessed, especially in a haptic-dominant environment. This results in poor resource-allocation management in terms of the effort spent rendering the two modalities for simulators with realistic real-time interactions. Addressing this problem requires an investigation into whether a single modality (haptic) or a combination of visual and haptic cues is better for learning skills in a haptic-dominant environment such as a palpation simulator. However, before such an investigation can take place, one main technical implementation issue in visio-haptic rendering needs to be addressed.
Multi-touch 3D Exploratory Analysis of Ocean Flow Models
Modern ocean flow simulations are generating increasingly complex, multi-layer 3D ocean flow models. However, most researchers are still using traditional 2D visualizations to view these models one slice at a time. Properly designed 3D visualization tools can be highly effective for revealing the complex, dynamic flow patterns and structures present in these models. However, the transition from visualizing ocean flow patterns in 2D to 3D presents many challenges, including occlusion and depth ambiguity. Further complications arise from the interaction methods required to navigate, explore, and interact with these 3D datasets. We present a system that employs a combination of stereoscopic rendering, to best reveal and illustrate 3D structures and patterns, and multi-touch interaction, to allow for natural and efficient navigation and manipulation within the 3D environment. Exploratory visual analysis is facilitated through the use of a highly interactive toolset which leverages a smart particle system. Multi-touch gestures allow users to quickly position dye-emitting tools within the 3D model. Finally, we illustrate the potential applications of our system through examples of real-world significance.
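The dye-emitting particle tools described above ultimately reduce to advecting particles through a sampled velocity field each frame. A minimal sketch of that per-frame update, using a midpoint (RK2) integration step and a toy analytic flow field in place of the paper's ocean-model data (all names here are illustrative assumptions):

```python
import numpy as np

def advect(positions, velocity_field, dt=0.1):
    """Advance dye particles one frame through a flow field using a
    midpoint (RK2) step, the kind of update a particle system performs
    each frame. velocity_field(p) returns flow velocities at an
    (N, 2) array of positions."""
    k1 = velocity_field(positions)
    k2 = velocity_field(positions + 0.5 * dt * k1)
    return positions + dt * k2

def swirl(p):
    """Toy 2D rotational field: velocity perpendicular to position,
    standing in for interpolated ocean-model velocities."""
    return np.stack([-p[:, 1], p[:, 0]], axis=1)
```

A midpoint step is a common compromise here: a plain Euler step visibly spirals particles outward in rotational flow, while RK2 keeps streak lines close to the true circular trajectories at interactive frame rates.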
Design and Experimental Evaluation of a Haptic Robot-Assisted System for Femur Fracture Surgery
In the face of challenges encountered during femur fracture surgery, such as the high rates of malalignment and X-ray exposure to operating personnel, robot-assisted surgery has emerged as an alternative to conventional state-of-the-art surgical methods. This paper introduces the development of Robossis, a haptic system for robot-assisted femur fracture surgery. Robossis comprises a 7-DOF haptic controller and a 6-DOF surgical robot. A unilateral control architecture is developed to address the kinematic mismatch and the motion transfer between the haptic controller and the Robossis surgical robot. A real-time motion control pipeline is designed to address the motion transfer and is evaluated through experimental testing. The analysis illustrates that the Robossis surgical robot can adhere to the desired trajectory from the haptic controller with an average translational error of 0.32 mm and a rotational error of 0.07 deg. Additionally, a haptic rendering pipeline is developed to resolve the kinematic mismatch by constraining the haptic controller (user hand) movement within the permissible joint limits of the Robossis surgical robot. Lastly, in a cadaveric lab test, the Robossis system assisted surgeons during a mock femur fracture surgery. The results show that Robossis can provide an intuitive solution for surgeons to perform femur fracture surgery.
Comment: This paper is to be submitted to an IEEE journal