
    A randomised trial of observational learning from 2D and 3D models in robotically assisted surgery

    This is the final version of the article, available from the publisher via the DOI in this record.

    BACKGROUND: Advances in 3D technology mean that both robotic surgical devices and surgical simulators can now incorporate stereoscopic viewing capabilities. While depth information may benefit robotic surgical performance, it is unclear whether 3D viewing also aids skill acquisition when learning from observing others. As observational learning plays a major role in surgical skills training, this study aimed to evaluate whether 3D viewing provides learning benefits in a robotically assisted surgical task.

    METHODS: In a randomised parallel design, 90 medical students were assigned to (1) 2D observation or (2) 3D observation of a consultant surgeon performing a training task on the da Vinci S robotic system, or (3) a no-observation control. Performance and instrument movement metrics were assessed immediately after observation and again at one-week retention.

    RESULTS: Both the 2D and 3D groups outperformed the no-observation controls following the observation intervention (ps < 0.05), but there was no difference between the 2D and 3D groups at any timepoint. There was also no difference in movement parameters between groups.

    CONCLUSIONS: While 3D viewing systems may have beneficial effects on surgical performance, these results suggest that depth information has limited utility during observational learning of surgical skills in novices. The task constraints and end goals may provide more important information for learning than the relative motion of surgical instruments in 3D space.

    This research was supported by an Intuitive Surgical grant awarded to Dr G Buckingha

    Exploring the use of sensors to measure behavioral interactions: An experimental evaluation of using hand trajectories

    Humans appear to be sensitive to relatively small changes in their surroundings. These changes are often initially perceived as irrelevant, yet they can cause significant changes in behavior. How exactly people's behavior changes, however, is often hard to quantify. Addressing such a question requires a reliable and valid tool, ideally one that measures an important point of interaction, such as the hand. Wearable body-sensor systems can be used to obtain valuable behavioral information, and are particularly useful for assessing functional interactions between the endpoints of the upper limbs and our surroundings.

    A new method is explored that computes hand position from a wearable sensor system and validates it against a gold-standard reference measurement (an optical tracking device). Initial outcomes related well to the gold-standard measurements (r = 0.81), with an acceptable average root mean square error of 0.09 meters. The approach was then further investigated by measuring differences in motor behavior in response to a changing environment: three subjects performed a water-pouring task with three slightly different containers. Wavelet analysis was introduced to assess how motor consistency was affected by these small environmental changes. Results showed that behavioral motor adjustments to a variable environment could be assessed by applying wavelet coherence techniques. Applying these procedures in everyday life, combined with appropriate research methodologies, can help quantify how environmental changes alter our motor behavior. © 2014 Bergmann et al.
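    The validation step described above (agreement between a wearable-derived hand trajectory and an optical gold standard, summarised by a correlation coefficient and a positional RMSE) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the abstract does not specify which signal the reported r = 0.81 was computed on, so here the Pearson r is taken between the two position magnitudes, and the RMSE over the paired 3D Euclidean distances.

    ```python
    import numpy as np

    def validate_trajectory(wearable_xyz, optical_xyz):
        """Compare a wearable-sensor hand trajectory (N x 3, meters)
        against a time-aligned optical gold standard of the same shape.
        Returns (pearson_r, rmse_meters)."""
        wearable_xyz = np.asarray(wearable_xyz, dtype=float)
        optical_xyz = np.asarray(optical_xyz, dtype=float)

        # RMSE over the Euclidean distance between paired 3D samples
        dist = np.linalg.norm(wearable_xyz - optical_xyz, axis=1)
        rmse = np.sqrt(np.mean(dist ** 2))

        # Pearson r between the two position magnitudes (one plausible
        # summary of agreement; an assumption, not the paper's definition)
        a = np.linalg.norm(wearable_xyz, axis=1)
        b = np.linalg.norm(optical_xyz, axis=1)
        r = np.corrcoef(a, b)[0, 1]
        return r, rmse
    ```

    With time-aligned recordings from both systems, a high r and an RMSE on the order of centimeters (the abstract reports 0.09 m) would indicate acceptable agreement for behavioral use.
    
    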