    Computer surgery 3D simulations for a new teaching-learning model

    Using 3D computer simulations to train surgeons is not new, and neither is using e-learning to improve students' knowledge acquisition. What we propose is to use 3D computer simulations in such a versatile way that they can act as learning objects designed directly by those who hold the experience we want transmitted. To achieve this goal, it is necessary to create a model in charge of communication between the learning objects and the simulation. This model ensures, on the one hand, that the simulation offers an interface to the learning process stable enough not to be affected by every small change, and, on the other hand, an interface complete enough to accommodate any change in the learning process. The key to resolving this tension is to move the behaviour of the simulation objects out of their own control, leaving in them only their most basic behaviour. This paper presents the problem and the design proposed to solve it in a more detailed way
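    A minimal sketch of the pattern this abstract describes may help: simulation objects keep only primitive behaviour, while a mediating layer owned by the learning side decides which primitives to invoke, so the simulation interface stays stable as lessons change. All names here (SimObject, Mediator, the "lesson.incision" event) are illustrative assumptions, not the authors' actual design.

    ```python
    class SimObject:
        """A simulation object exposing only primitive operations."""
        def __init__(self, name):
            self.name = name
            self.position = (0.0, 0.0, 0.0)

        def move_to(self, position):
            # Primitive behaviour only; no lesson logic lives here.
            self.position = position


    class Mediator:
        """Routes learning-object events to simulation primitives, keeping
        the simulation interface stable while lessons change freely."""
        def __init__(self):
            self._objects = {}
            self._handlers = {}

        def register_object(self, obj):
            self._objects[obj.name] = obj

        def on(self, event, handler):
            self._handlers.setdefault(event, []).append(handler)

        def emit(self, event, **kwargs):
            for handler in self._handlers.get(event, []):
                handler(self._objects, **kwargs)


    # A learning object scripts behaviour without touching simulation internals.
    mediator = Mediator()
    mediator.register_object(SimObject("scalpel"))
    mediator.on("lesson.incision", lambda objs, at: objs["scalpel"].move_to(at))
    mediator.emit("lesson.incision", at=(0.1, 0.0, 0.2))
    ```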

    Proof of concept of a workflow methodology for the creation of basic canine head anatomy veterinary education tool using augmented reality

    Neuroanatomy can be challenging to both teach and learn within the undergraduate veterinary medicine and surgery curriculum. Traditional techniques have been used for many years, but there has now been a progression towards alternative digital models and interactive 3D models to engage the learner. However, digital innovations in the curriculum have typically involved the medical rather than the veterinary curriculum. We therefore aimed to create a simple workflow methodology demonstrating how straightforward it is to create a mobile augmented reality application of basic canine head anatomy. Using canine CT and MRI scans and widely available software programs, we demonstrate how to create an interactive model of head anatomy. This was deployed as an augmented reality application on a popular Android mobile device to demonstrate its user-friendly interface. Here we present the processes, challenges and resolutions involved in creating a highly accurate, data-based anatomical model that could potentially be used in the veterinary curriculum. This proof-of-concept study provides an excellent framework for the creation of augmented reality training products for veterinary education. The lack of similar resources within this field provides the ideal platform to extend this work into other areas of veterinary education and beyond
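    One common route from a CT volume to an AR-ready surface mesh is sketched below. This is an assumption for illustration; the abstract names only "widely available software programs", and the filename and the 300 HU bone threshold are placeholders. The sketch reads a CT scan, extracts a bone isosurface with marching cubes, and writes a Wavefront OBJ that AR tools such as Unity can import.

    ```python
    import SimpleITK as sitk
    import numpy as np
    from skimage import measure

    # Hypothetical input file; any CT volume readable by SimpleITK works.
    image = sitk.ReadImage("canine_head_ct.nii.gz")
    volume = sitk.GetArrayFromImage(image)  # (z, y, x) voxel array

    # Extract an isosurface at a bone-like Hounsfield threshold.
    verts, faces, normals, _ = measure.marching_cubes(
        volume.astype(np.float32),
        level=300.0,                        # placeholder bone threshold
        spacing=image.GetSpacing()[::-1],   # match physical voxel size
    )

    # Write a Wavefront OBJ (faces are 1-indexed) for the AR pipeline.
    with open("canine_head.obj", "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for tri in faces:
            f.write(f"f {tri[0] + 1} {tri[1] + 1} {tri[2] + 1}\n")
    ```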

    A new mini-navigation tool allows accurate component placement during anterior total hip arthroplasty.

    Introduction: Computer-assisted navigation systems have been explored in total hip arthroplasty (THA) to improve component positioning. While these systems traditionally rely on anterior pelvic plane registration, variance in soft tissue thickness overlying anatomical landmarks can lead to registration error, and the supine coronal plane has instead been proposed. The purpose of this study was to evaluate the accuracy of a novel navigation tool, using registration of the anterior pelvic plane or supine coronal plane during simulated anterior THA. Methods: Benchtop phantoms and target measurement values commonly seen in surgery were used for analysis. Measurements of acetabular component anteversion and inclination, and of changes in leg length and offset, were recorded by the navigation tool and compared with the known target values of the simulation using Pearson's correlation. Results: The device accurately measured cup position and leg length to within 1° and 1 mm of the known target values, respectively. Across all simulations, there was a strong, positive relationship between the values obtained by the device and the known target values. Conclusion: The preliminary findings of this study suggest that the novel navigation tool tested is a potentially viable tool to improve the accuracy of component placement during THA using the anterior approach
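    The accuracy comparison described here reduces to correlating device readings against known targets and checking per-measurement error. A minimal illustration with SciPy follows; the numbers are fabricated placeholders, not data from the study.

    ```python
    from scipy.stats import pearsonr

    # Hypothetical simulation targets and device readings, in degrees.
    target_inclination = [40.0, 42.0, 45.0, 38.0, 41.0]
    device_inclination = [40.4, 41.7, 45.3, 38.2, 40.9]

    # Pearson's correlation between device values and known targets.
    r, p_value = pearsonr(target_inclination, device_inclination)

    # Per-simulation absolute error, analogous to the within-1-degree check.
    errors = [abs(d - t) for d, t in zip(device_inclination, target_inclination)]

    print(f"Pearson r = {r:.3f} (p = {p_value:.4f})")
    print(f"max absolute error = {max(errors):.1f} degrees")
    ```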

    From ‘hands up’ to ‘hands on’: harnessing the kinaesthetic potential of educational gaming

    Traditional approaches to distance learning and the student learning journey have focused on closing the gap between the experience of off-campus students and their on-campus peers. While many initiatives have sought to embed a sense of community, create virtual learning environments and even build collaborative spaces for team-based assessment and presentations, they are limited by technological innovation in terms of the types of learning styles they support and develop. Mainstream gaming development – such as with the Xbox Kinect and Nintendo Wii – has a strong element of kinaesthetic learning, from early attempts to simulate impact, recoil, velocity and other environmental factors to more sophisticated movement-based games that create a sense of almost total immersion and allow untethered (in a technical sense) interaction with the games' objects, characters and other players. Likewise, the gamification of learning has become a critical focus for engaging learners, and for its commercialisation, especially through products such as the Wii Fit. As this technology matures, there are strong opportunities for universities to use gaming consoles to embed levels of kinaesthetic learning into the student experience – a learning style which has been largely neglected in the distance education sector. This paper explores the potential impact of these technologies and broadly imagines the possibilities for future innovation in higher education

    Immersive Visualization in Biomedical Computational Fluid Dynamics and Didactic Teaching and Learning

    Virtual reality (VR) can stimulate active learning, critical thinking, decision making and improved performance. It requires a medium in which to show virtual content, called a virtual environment (VE). The MARquette Visualization Lab (MARVL) is an example of a VE. Robust processes and workflows that allow for the creation of content for use within MARVL further increase the user base for this valuable resource. A workflow was created to display biomedical computational fluid dynamics (CFD) results and complementary data in a wide range of VEs. This allows a researcher to study a simulation in its natural three-dimensional (3D) morphology. It is also an exciting way to extract more information from CFD results by taking advantage of improved depth cues, a larger display canvas, custom interactivity, and an immersive approach that surrounds the researcher. The CFD-to-VR workflow was designed to be simple enough for a novice user, and it is also used as a tool to foster collaboration between engineers and clinicians. The workflow was designed to support results from common CFD software packages and across clinical research areas. ParaView, Blender and Unity were used to take standard CFD files and process them for viewing in VR, with designated scripts written to automate the steps implemented in each software package. The workflow was successfully completed across multiple biomedical vessels, scales and applications, including the aorta with application to congenital cardiovascular disease, the Circle of Willis with respect to cerebral aneurysms, and the airway for surgical treatment planning. The workflow was completed by novice users in approximately an hour. Bringing VR further into didactic teaching within academia allows students to be fully immersed in their subject matter, increasing their sense of presence, understanding and enthusiasm. MARVL is a space for collaborative learning that also offers an immersive, virtual experience. A second workflow was created to view PowerPoint presentations in 3D using MARVL. The resulting Immersive PowerPoint workflow uses PowerPoint, Unity and other open-source software packages to display presentations in 3D, and it can be completed in under thirty minutes
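    The ParaView stage of such a pipeline can be scripted with ParaView's Python API. The sketch below is an assumption about what one of the automation scripts might look like (the abstract states only that designated scripts automated each package); the input filename and output format are placeholders. Run with ParaView's pvpython interpreter, it loads a CFD result, extracts the outer surface, and saves a mesh that Blender and Unity can ingest on the way to the VR display.

    ```python
    from paraview.simple import OpenDataFile, ExtractSurface, SaveData

    # Hypothetical CFD result file (VTK unstructured grid).
    cfd = OpenDataFile("aorta_results.vtu")

    # Reduce the volumetric result to its polygonal outer surface.
    surface = ExtractSurface(Input=cfd)

    # Save in a mesh format Blender can import for the Unity/VR stages.
    SaveData("aorta_surface.ply", proxy=surface)
    ```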