
    Sonification as a Reliable Alternative to Conventional Visual Surgical Navigation

    Despite the undeniable advantages of image-guided surgical assistance systems in terms of accuracy, such systems have not yet fully met surgeons' needs or expectations regarding usability, time efficiency, and integration into the surgical workflow. Perceptual studies have shown, however, that presenting independent but causally correlated information via multimodal feedback involving different sensory modalities can improve task performance. This article investigates an alternative method for computer-assisted surgical navigation, introduces a novel sonification methodology for navigated pedicle screw placement, and discusses advanced solutions based on multisensory feedback. The proposed method comprises a sonification solution for alignment tasks in four degrees of freedom based on frequency-modulation (FM) synthesis. We compared the accuracy and execution time of the proposed sonification method with those of visual navigation, currently considered the state of the art. In a phantom study, 17 surgeons executed a pedicle screw placement task in the lumbar spine, guided by either the proposed sonification-based method or traditional visual navigation. The results demonstrate that the proposed method is as accurate as the state of the art while reducing the surgeon's need to attend to navigation displays, preserving the natural focus on surgical tools and the targeted anatomy during task execution.
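    The core idea of FM-synthesis sonification for an alignment task can be sketched in code. The mapping below is a hypothetical illustration, not the authors' published design: the error-to-parameter ranges, the 4-DoF error vector, and the 440 Hz base pitch are all assumptions. Overall misalignment drives the carrier pitch, modulator rate, and modulation index, so a perfectly aligned tool yields a steady pure tone.

```python
import math

def fm_sample(t, carrier_hz, mod_hz, mod_index):
    """One sample of an FM-synthesized tone: the carrier's phase is
    perturbed by a sinusoidal modulator (classic Chowning FM)."""
    return math.sin(2 * math.pi * carrier_hz * t
                    + mod_index * math.sin(2 * math.pi * mod_hz * t))

def alignment_to_fm(errors, max_error=10.0):
    """Map a 4-DoF alignment error vector (e.g. two translations and
    two rotations) onto FM parameters. Ranges are invented for
    illustration: perfect alignment gives a steady 440 Hz pure tone."""
    # Overall misalignment magnitude, normalized to [0, 1].
    mag = min(math.sqrt(sum(e * e for e in errors)) / max_error, 1.0)
    carrier_hz = 440.0 + 440.0 * mag   # pitch rises with error
    mod_hz = 2.0 + 18.0 * mag          # faster wobble when far off target
    mod_index = 5.0 * mag              # rougher timbre when misaligned
    return carrier_hz, mod_hz, mod_index
```

    Streaming `fm_sample` at audio rate with these parameters would produce the audible cue; a full implementation would likely map each degree of freedom to its own synthesis parameter rather than collapsing all four into a single magnitude.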

    Mixed Reality in Medical Simulation: A Comprehensive Design Methodology

    In the medical education field, the use of highly sophisticated simulators and extended reality (XR) simulations allows trainees to practice complex procedures and acquire new knowledge and attitudes. XR is considered useful for enhancing healthcare education; however, several issues need further research. The main aim of this study is to define a comprehensive method to design and optimize any kind of simulator and simulation, integrating all the relevant elements of scenario design and prototype development. A complete framework for the design of any kind of advanced clinical simulation is proposed, and it has been applied to realize a mixed reality (MR) prototype for the simulation of rachicentesis (lumbar puncture). The purpose of the MR application is to immerse trainees in a more realistic environment and to put them under pressure during the simulation, as in real practice. The application was tested with two different devices: the Vox Gear Plus smartphone headset and the Microsoft HoloLens. Eighteen sixth-year students of the Medicine and Surgery course were enrolled in the study. Results compare the user experience across the two devices and the simulation performance using the HoloLens.

    Augmented Reality: Mapping Methods and Tools for Enhancing the Human Role in Healthcare HMI

    Background: Augmented Reality (AR) represents an innovative technology to improve data visualization and strengthen human perception. Among Human–Machine Interaction (HMI) domains, medicine can benefit most from the adoption of these digital technologies. In this perspective, the literature on AR-based orthopedic surgery techniques was evaluated, focusing on identifying the limitations and challenges of AR-based healthcare applications, to support research and the development of further studies. Methods: Studies published from January 2018 to December 2021 were analyzed after a comprehensive search of the PubMed, Google Scholar, Scopus, IEEE Xplore, Science Direct, and Wiley Online Library databases. To improve the review reporting, the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines were used. Results: The authors selected sixty-two articles meeting the inclusion criteria, which were categorized according to the purpose of the study (intraoperative, training, rehabilitation) and to the surgical procedure involved. Conclusions: AR has the potential to improve orthopedic training and practice by providing an increasingly human-centered clinical approach. This review also points to open problems for further research: hardware limitations, the lack of accurate registration and tracking systems, and the absence of security protocols.

    The Brainstem Auditory Evoked Response in Old Versus Young Horses

    A brainstem auditory evoked response (BAER) is an objective test that measures changes in voltage in the ongoing electroencephalogram (EEG) in response to the presentation of acoustic stimuli, observed as a waveform with a known series of peaks. While the gold standard for measuring hearing sensitivity in humans is behavioral testing, during which the listener provides a behavioral response to a sound (e.g., raising a hand or pressing a button), objective testing allows clinicians to estimate behavioral hearing sensitivity when behavioral testing is not possible. BAER testing has therefore been used as a tool to measure hearing in various animal species. In humans, aging typically affects the brainstem auditory evoked response, producing waveforms with decreased amplitudes, wave V thresholds at higher stimulus intensities, and increased latency of wave V responses. The purpose of this study was to evaluate the effects of aging on the BAER in equines. The following research questions were investigated: Can brainstem auditory evoked responses be identified and replicated for older horses? If so, are there differences in response characteristics between young and old horses with no known noise exposure, in a convenience sample? It was hypothesized that the older group of horses would exhibit poorer thresholds, poorer morphology, increased latencies, and decreased wave amplitudes compared to the younger group. Data were collected and analyzed from ten test subjects: five old (>20 years) and four young (<7 years) horses. Data obtained from one of the older horses were excluded from the analysis because excessive noise from tooth grinding obscured the waveforms. BAER testing was performed and waveforms were identified and replicated. Peak amplitudes, peak latencies, and thresholds were analyzed descriptively and statistically.
    There were no statistically significant differences between the two test groups. It is unclear why age-related differences were not observed. It is possible that what is considered age-related hearing loss (presbycusis) in humans is the cumulative effect of noise, ototoxicity, and other environmental factors. It is also possible that the click stimulus did not elicit responses from the frequency range most sensitive to age-related changes in horses. Future research should explore using other stimuli, such as high-frequency tone bursts, to evaluate other frequency ranges; determine the feasibility of sedating subjects to reduce artifact; evaluate the test-retest reliability of the brainstem auditory evoked response in horses; and include BAER testing on a group of horses with known noise exposure for comparison with the results of this study.
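    The peak measures analyzed above (latency and amplitude) can be extracted from a digitized waveform with a simple peak picker. This is a minimal sketch under assumed conditions: a uniformly sampled response and a naive local-maximum rule, not the clinical wave I–V labeling conventions used in BAER practice.

```python
def peak_latency_amplitude(waveform, sample_rate_hz):
    """Find the largest positive peak in an evoked-response waveform
    and report its latency (ms) and amplitude (input units).
    A peak is a sample strictly greater than both neighbours."""
    best = None
    for i in range(1, len(waveform) - 1):
        if waveform[i] > waveform[i - 1] and waveform[i] > waveform[i + 1]:
            if best is None or waveform[i] > waveform[best]:
                best = i
    if best is None:
        return None  # flat or monotonic trace: no identifiable peak
    latency_ms = best / sample_rate_hz * 1000.0
    return latency_ms, waveform[best]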

    Interactive Virtual Suturing Simulations: Enhancement of Student Learning in Veterinary Medicine

    A capstone submitted in partial fulfillment of the requirements for the degree of Doctor of Education in the College of Education at Morehead State University by Christine B. Boyd and Amy J. Staton on December 29, 2013.

    Augmented reality (AR) for surgical robotic and autonomous systems: State of the art, challenges, and solutions

    Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus of most devices remains on improving end-effector dexterity and precision, as well as access to minimally invasive surgeries. This paper aims to provide a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced issues with low accuracy in tool placement around complex trajectories, pose estimation, and depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We outline the shortcomings of current optimization algorithms for surgical robots (such as YOLO and LSTM) while providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. Results in robot end-effector collision avoidance and reduced occlusion remain promising within the scope of our research, supporting the case for the surgical clearance of ever-expanding AR technology in the future.

    Motor learning induced neuroplasticity in minimally invasive surgery

    Technical skills in surgery have become more complex and challenging to acquire since the introduction of technological aids, particularly in the arena of Minimally Invasive Surgery (MIS). Additional challenges posed by reforms to surgical careers and increased public scrutiny have propelled the identification of methods to assess and acquire MIS technical skills. Although validated objective assessments have been developed to assess the motor skills requisite for MIS, they offer little insight into how expertise develops. Motor skill learning is an internal process, observable only indirectly, that leads to relatively permanent changes in the central nervous system. Advances in functional neuroimaging permit direct interrogation of the evolving patterns of brain function that accompany motor learning through neuroplasticity, and have been used with surgeons to identify the neural correlates of technical skills acquisition and the impact of new technology. However, significant gaps exist in understanding the neuroplasticity underlying the learning of complex bimanual MIS skills. In this thesis, the available evidence on applying functional neuroimaging to assessing and enhancing operative performance in the field of surgery has been synthesized. The purpose of this thesis was to evaluate frontal lobe neuroplasticity associated with learning a complex bimanual MIS skill using functional near-infrared spectroscopy (fNIRS), an indirect neuroimaging technique. Laparoscopic suturing and knot-tying, a technically challenging bimanual skill, is selected to demonstrate learning-related reorganisation of cortical behaviour within the frontal lobe, marked by shifts in activation from the prefrontal cortex (PFC), which subserves attention, to primary and secondary motor centres (premotor cortex, supplementary motor area, and primary motor cortex) in which motor sequences are encoded and executed.
    In the cross-sectional study, participants of varying expertise demonstrate frontal lobe neuroplasticity commensurate with motor learning. The longitudinal study tracks the evolution of cortical behaviour in novices in response to eight hours of distributed training over a fortnight. Despite novices achieving expert-like performance and stabilisation on the technical task, this study demonstrates that novices displayed persistent PFC activity. It establishes that, for complex bimanual tasks, improvements in technical performance are not accompanied by a reduced reliance on attention to support performance. Finally, a least-squares support vector machine is used to classify expertise based on frontal lobe functional connectivity. The findings of this thesis demonstrate the value of interrogating cortical behaviour for assessing MIS skills development and credentialing.
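    The expertise classifier mentioned in the final step can be sketched as a least-squares SVM, which replaces the standard SVM's inequality constraints with equalities so that training reduces to a single linear solve. The sketch below is illustrative only: the toy "connectivity" features, the linear kernel, and the `gamma` value are assumptions, not the thesis's actual data or model.

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0):
    """Train a least-squares SVM with a linear kernel by solving the
    LS-SVM linear system (regression-form applied to +/-1 labels):
        [ 0   1^T         ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma ] [ alpha ] = [ y ]
    """
    n = len(y)
    K = X @ X.T                        # linear kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma  # ridge term from the LS loss
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)      # one linear solve = training
    return sol[0], sol[1:]             # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, x):
    """Sign of the decision function f(x) = sum_i alpha_i <x_i, x> + b."""
    return 1 if alpha @ (X_train @ x) + b > 0 else -1

# Toy "connectivity" features: two well-separated groups standing in for
# expert vs novice frontal-lobe connectivity vectors (values invented).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])   # +1 = expert, -1 = novice
b, alpha = lssvm_train(X, y)
preds = [lssvm_predict(X, b, alpha, x) for x in X]
```

    Because training is one linear system, LS-SVMs are convenient for small neuroimaging datasets; the trade-off is that the dual weights lose the sparsity of a standard SVM.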

    Sensory substitution for force feedback recovery: A perception experimental study

    Robotic-assisted surgeries are commonly used today as a more efficient alternative to traditional surgical options. Both surgeons and patients benefit from these systems, which offer many advantages, including less trauma and blood loss, fewer complications, and better ergonomics. However, a remaining limitation of currently available surgical systems is the lack of force feedback, because the teleoperation setting prevents direct interaction with the patient. Once the force information is obtained, either by a sensing device or indirectly through vision-based force estimation, a question arises: how should this information be transmitted to the surgeon? An attractive alternative is sensory substitution, which transcodes information from one sensory modality so it can be presented in a different sensory modality. In the current work, we used visual feedback to convey interaction forces to the surgeon. Our overarching goal was to address the following question: How should interaction forces be displayed to support efficient comprehension by the surgeon without interfering with the surgeon's perception and workflow during surgery? Until now, the use of the visual modality for force feedback has not been carefully evaluated. For this reason, we conducted an experimental study with two aims: (1) to demonstrate the potential benefits of using this modality and (2) to understand surgeons' perceptual preferences. The results of our study of 28 surgeons revealed strong acceptance (96%) of this modality among users. Moreover, we found that for surgeons to interpret the information easily, their mental model must be considered, meaning that the design of the visualizations should fit the perceptual and cognitive abilities of the end user. To our knowledge, this is the first time these principles have been analyzed for exploring sensory substitution in medical robotics. Finally, we provide user-centered recommendations for the design of visual displays for robotic surgical systems.
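    A force-to-visual substitution of the kind studied here can be sketched as a simple mapping from measured force to an on-screen cue. The thresholds, colours, and force ranges below are invented for illustration; the paper's actual display design is not specified in the abstract.

```python
def force_to_visual(force_n, safe_n=2.0, max_n=5.0):
    """Map a tool-tissue interaction force (newtons) to a visual cue:
    a bar fill level in [0, 1] plus a traffic-light colour band, so the
    force can be read at a glance without a numeric display."""
    level = max(0.0, min(force_n / max_n, 1.0))
    if force_n < safe_n:
        colour = "green"   # comfortably below the safety margin
    elif force_n < max_n:
        colour = "amber"   # approaching the limit
    else:
        colour = "red"     # at or above the maximum allowed force
    return level, colour
```

    Keeping the cue pre-attentive (position and colour rather than digits) is one way to fit the surgeon's mental model, in line with the paper's emphasis on matching visualizations to the end user's perceptual abilities.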

    The Effect of an Unconscious Auditory Stimulus on Pilot Performance under Varying Instrument Flying Conditions

    Human error remains a significant contributing factor in accidents in civil air transportation. It is therefore crucial to establish avenues by which performance on the flight deck can be enhanced under conditions of distress. The purpose of this study was to examine whether an unconscious auditory stimulus (UAS) could enhance pilot performance under varying instrument flight rules (IFR) conditions on the aircraft flight deck. Forty IFR student pilots underwent two eight-minute simulated flights, during which they were presented with different IFR weather conditions. During the trial, the experimental group listened to a UAS, whereas the control group listened to white noise (WN). Performance was measured as the deviation from the localizer (LOC), the glide slope (GS), and the target airspeed (AS). It was hypothesized that the UAS would help enhance pilot performance under varying IFR weather conditions, and that good weather conditions would degrade performance less than poor weather conditions. The results of this experiment did not support the hypotheses. Possible explanations are presented in the discussion section.
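    Deviation-based performance measures like those above are commonly summarized as a root-mean-square error over the trial. This is a minimal sketch under assumed conditions (uniformly sampled flight parameters and a fixed target); the study's exact scoring method is not given in the abstract.

```python
import math

def rms_deviation(samples, target):
    """Root-mean-square deviation of flight-parameter samples (e.g.
    localizer or glide-slope needle position, or airspeed) from the
    target value: a standard summary of tracking performance."""
    return math.sqrt(sum((s - target) ** 2 for s in samples) / len(samples))
```

    One such score per parameter (LOC, GS, AS) and per flight would give the per-condition performance values compared between the UAS and white-noise groups.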