63 research outputs found
Tactile signatures and hand motion intent recognition for wearable assistive devices
Within the field of robotics and autonomous systems where there is a human in the loop, intent recognition plays an important role. This is especially true for wearable assistive devices used for rehabilitation, particularly post-stroke recovery. This paper reports results on the use of tactile patterns to detect weak muscle contractions in the forearm while simultaneously associating these patterns with the muscle synergies of different grips. To investigate this concept, a series of experiments with healthy participants was carried out using a tactile arm brace (TAB) on the forearm while performing four different types of grip. The expected force patterns were established by analysing the muscle synergies of the four grip types and the forearm physiology. The results showed that the tactile signatures of the forearm recorded by the TAB align with the anticipated force patterns. Furthermore, the data were found to be linearly separable across all four grip types. Using the TAB data, machine learning algorithms achieved 99% classification accuracy. The TAB results were highly comparable to those of a similar commercial intent recognition system based on surface electromyography (sEMG) sensing.
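The reported linear separability means that even a very simple classifier can distinguish the four grips. As an illustration only, a minimal nearest-centroid sketch (the grip names, the 8-cell tactile layout, and all data below are synthetic assumptions, not the paper's dataset):

```python
import random

random.seed(0)

# Hypothetical setup: 4 grip classes, each producing a distinct mean
# pressure pattern across 8 tactile cells (names/values illustrative).
GRIPS = ["power", "precision", "lateral", "spherical"]
centres = {g: [random.uniform(0, 1) for _ in range(8)] for g in GRIPS}

def sample(grip, noise=0.05):
    """Simulate one TAB reading: class centre plus small sensor noise."""
    return [c + random.gauss(0, noise) for c in centres[grip]]

def fit_centroids(data):
    """Nearest-centroid training: average the samples of each class."""
    return {g: [sum(col) / len(ss) for col in zip(*ss)]
            for g, ss in data.items()}

def predict(model, x):
    """Classify by squared Euclidean distance to the nearest centroid."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda g: dist(model[g], x))

train = {g: [sample(g) for _ in range(20)] for g in GRIPS}
model = fit_centroids(train)
test = [(g, sample(g)) for g in GRIPS for _ in range(10)]
accuracy = sum(predict(model, x) == g for g, x in test) / len(test)
```

On linearly separable, low-noise data such a baseline already approaches perfect accuracy, which is consistent with the 99% figure reported for the stronger learners used in the paper.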
Reassembly of fractured object using fragment topology
This work presents our results on the reassembly of broken objects using a newly developed fragment topology and feature extraction methodology. The reassembly of broken objects is a common problem in several domains, including computer-aided bone fracture reduction and the reassembly of broken artefacts. The new fragment topology combines information from intact and fractured region boundaries to reduce the number of possible correspondences between fragments and to optimise our iterative matching process. Experiments performed on different multi-fragment objects show that the proposed topology can be applied effectively, completing the process in a small number of iterations and with an average alignment error of 0.12 mm.
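The iterative matching process described above can be sketched as a greedy loop: score the remaining candidate fragment pairs, merge the best-matching pair, and repeat until one piece remains. This is only a structural sketch; `score_pair` and `merge` stand in for the paper's topology-based compatibility measure and alignment step, which are assumptions here:

```python
def reassemble(fragments, score_pair, merge):
    """Greedy iterative reassembly over a list of fragment objects.

    score_pair(a, b) -> float   higher = better boundary match (assumed)
    merge(a, b)      -> fragment for the aligned union of a and b (assumed)
    Returns the final assembled piece and the number of iterations used.
    """
    pieces = list(fragments)
    iterations = 0
    while len(pieces) > 1:
        # Evaluate all remaining candidate correspondences.
        a, b = max(
            ((p, q) for i, p in enumerate(pieces) for q in pieces[i + 1:]),
            key=lambda pq: score_pair(*pq),
        )
        pieces.remove(a)
        pieces.remove(b)
        pieces.append(merge(a, b))  # merged piece re-enters the pool
        iterations += 1
    return pieces[0], iterations
```

For n fragments the loop always terminates in n-1 merges; the topology's role in the paper is to make each `score_pair` evaluation cheap and discriminative so the greedy choice is reliable.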
A novel mechanical design of a wearable fingertip haptic device for remote meniscus palpation
A pneumatic model of a fingertip haptic device (FHD) was previously tested in virtual reality, allowing the perception of different materials with promising results. However, this design had several drawbacks, including bulky size, limited portability, and discomfort. In this paper, the FHD is redesigned to provide haptic feedback for human meniscus palpation. A user study was performed to evaluate the effectiveness of the redesigned FHD. The study showed that the redesigned model could provide a more distinct perception of different stiffness levels, but it compromised user comfort when mounted on the finger for long periods.
FORCE-TORQUE MEASUREMENT SYSTEM FOR FRACTURE SURGERY
One of the more difficult tasks in surgery is applying the optimal instrument forces and torques necessary to conduct an operation without damaging the patient's tissue. This is especially problematic in surgical robotics, where force feedback is absent altogether. Force-sensing instruments therefore emerge as a critical need for improving safety and surgical outcomes. We propose a new measurement system that can be used in real fracture surgeries to generate quantitative knowledge of the forces and torques applied by the surgeon on tissues. We instrumented a periosteal elevator with a 6-DOF load cell in order to measure the forces/torques applied by surgeons on live tissue during fracture surgeries. Acquisition software was developed in LabView to record force/torque data together with synchronised visual information (USB camera) of the tip interacting with the tissue, and a voice recording (microphone) of the surgeon describing the actual procedure. The measurement system and surgical protocol were designed according to patient safety and sterilisation standards. The developed technology was tested in a pilot study during a real orthopaedic operation (removal of a metal plate from the femoral shaft of a patient) and proved reliable and usable. As demonstrated by subsequent data analysis, coupling force/torque data with video and audio information produced quantitative knowledge of the forces/torques applied by the surgeon during the operation. The outlined approach will be used to perform extensive force measurements during orthopaedic surgeries. The generated quantitative knowledge will be used to design a force controller and optimised actuators for a robot-assisted fracture surgery system under development at the Bristol Robotics Laboratory.
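The synchronisation of force/torque samples with camera frames is central to the analysis described above. The paper's system does this in LabView; as a language-neutral illustration of the idea, a nearest-timestamp join over two sorted streams (the timestamps below are invented):

```python
import bisect

def nearest_sync(force_ts, frame_ts):
    """For each camera-frame timestamp, find the index of the closest
    force/torque sample. Both lists are sorted timestamps in seconds.
    Returns a list of (frame_time, force_sample_index) pairs."""
    pairs = []
    for t in frame_ts:
        i = bisect.bisect_left(force_ts, t)
        # The nearest sample is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(force_ts)]
        j = min(candidates, key=lambda k: abs(force_ts[k] - t))
        pairs.append((t, j))
    return pairs
```

Because the force stream is typically sampled much faster than the video, this kind of nearest-neighbour alignment lets each frame (and each audio annotation) be tagged with the force/torque state at that instant.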
Towards Robot-Assisted Fracture Surgery For Intra-Articular Joint Fractures
Background: Treating fractures is expensive and involves long post-operative care. Intra-articular fractures are often treated with open surgery, which requires massive soft-tissue incisions, entails long healing times, and is often accompanied by deep wound infections. Minimally invasive surgery (MIS) is an alternative, but when performed manually under X-ray guidance it does not achieve the required accuracy of surgical treatment. Methods: The functional and non-functional requirements of the system were established by conducting interviews with orthopaedic surgeons and attending fracture surgeries at Bristol Royal Infirmary to gain first-hand experience of the complexities involved. A robot-assisted fracture surgery (RAFS) system has been designed and built for distal femur fractures but can generally serve as a platform for other fracture types. Results: The RAFS system has been tested at BRL, and the individual robots can achieve the required reduction positional accuracy (less than 1 mm translational and 5 degrees rotational). The system can move two fragments simultaneously. The positioning tests were performed on Sawbones. Conclusions: The proposed approach provides an optimal solution by merging the orthopaedic surgeon's fracture reduction knowledge with the robotic system's precision in 3D.
Robot-Bone Attachment Device for Robot-Assisted Percutaneous Bone Fragment Manipulation
The treatment of joint fractures is a common task in orthopaedic surgery, incurring considerable health costs and causing patient disabilities. Percutaneous techniques have been developed to mitigate the problems related to open surgery (e.g. soft-tissue damage), although their application to joint fractures is limited by sub-optimal intra-operative imaging (2D fluoroscopy) and by the high forces involved. Our earlier research toward improving the percutaneous reduction of intra-articular fractures resulted in a robotic system prototype, the RAFS (Robot-Assisted Fracture Surgery) system.
We propose a robot-bone attachment device for percutaneous bone manipulation, which can be anchored to the bone fragment through one small incision, ensuring the required stability and reducing the “biological cost” of the procedure. The device has been evaluated through the reduction of 9 distal femur fractures on human cadavers using the RAFS system.
Image-Based Robotic System for Enhanced Minimally Invasive Intra-Articular Fracture Surgeries
Abstract: Robotic assistance can bring significant improvements to orthopedic fracture surgery: it can facilitate more accurate repositioning of fracture fragments without open access and obviate problems related to current minimally invasive fracture surgery techniques, providing better clinical outcomes, reduced recovery time, and lower health-related costs. This paper presents a new design of the robot-assisted fracture surgery (RAFS) system developed at Bristol Robotics Laboratory, featuring a new robotic architecture and real-time 3D imaging of the fractured anatomy. The technology presented in this paper focuses on distal femur fractures but can be adapted to the larger domain of fracture surgeries, improving the state of the art in robot assistance in orthopedics. To demonstrate the enhanced performance of the RAFS system, 10 reductions of a distal femur fracture were performed using the system on a bone model. The experimental results demonstrate the accuracy, effectiveness, and safety of the new RAFS system. The system allows the surgeon to reduce the fractures precisely, with a reduction accuracy of 1.15 mm and 1.3°, meeting the clinical requirements for this procedure.
Mapping surgeons' hand/finger movements to surgical tool motion during conventional microsurgery using machine learning
Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technologies, enhancing robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties. However, current human–robot interfaces lack intuitive teleoperation and cannot mimic the surgeon's hand/finger sensing required for fine-motion micro-surgeries. These limitations make teleoperated robotic surgery less suitable for, e.g., cardiac surgery, and difficult for established surgeons to learn. We report a pilot study showing an intuitive way of recording and mapping a surgeon's gross hand motion and fine synergic motion during cardiac micro-surgery as a way to enhance future intuitive teleoperation.
Methods: We set out to develop a prototype system able to train a deep neural network (DNN) by mapping wrist, hand, and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart micro-surgery procedures. The trained network was used to estimate the tool's pose from refined hand joint angles. The outputs of the network were the surgical tool orientation and jaw angle, acquired by an optical motion capture system.
Results: Based on the surgeon's feedback during mock micro-surgery, the developed wearable system with lightweight motion-tracking sensors did not interfere with the surgery or instrument handling. The wearable motion-tracking system used 12 finger/thumb/wrist joint angle sensors to generate meaningful datasets as inputs to the DNN, with new hand joint angles added as necessary based on comparing the estimated tool pose against the measured tool pose. The DNN architecture was optimised for the highest estimation accuracy and the ability to determine the tool pose with the least mean squared error. This novel approach showed that the surgical instrument's pose, an essential requirement for teleoperation, can be accurately estimated from recorded surgeon hand/finger movements with a mean squared error (MSE) of less than 0.3%.
Conclusion: We have developed a system to capture the fine movements of the surgeon's hand during micro-surgery that could enhance future remote teleoperation of similar surgical tools. More work is needed to refine this approach and confirm its potential role in teleoperation.
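The core regression task above, mapping 12 joint angles to a tool pose, can be illustrated with a much simpler stand-in than the paper's DNN: a single linear layer trained by stochastic gradient descent on synthetic data. The layer sizes match the abstract (12 inputs; 4 outputs for 3 orientation angles plus jaw angle), but the data, learning rate, and linear model are assumptions for illustration only:

```python
import random

random.seed(1)

N_IN, N_OUT = 12, 4  # 12 joint angles -> 3 orientation angles + jaw angle

def forward(W, x):
    """One linear layer: y_o = sum_i W[o][i] * x[i]."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# Synthetic ground truth: a random linear map generates the "measured"
# tool poses that the optical capture system would provide.
true_W = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_OUT)]
data = []
for _ in range(200):
    x = [random.uniform(-1, 1) for _ in range(N_IN)]
    data.append((x, forward(true_W, x)))

# Train by per-sample gradient descent on squared error.
W = [[0.0] * N_IN for _ in range(N_OUT)]
lr = 0.05
for epoch in range(200):
    for x, y in data:
        pred = forward(W, x)
        for o in range(N_OUT):
            err = pred[o] - y[o]
            for i in range(N_IN):
                W[o][i] -= lr * err * x[i]

mse = sum((p - t) ** 2
          for x, y in data
          for p, t in zip(forward(W, x), y)) / (len(data) * N_OUT)
```

Real hand-to-tool mappings are nonlinear, which is why the paper uses a DNN; the sketch only shows the shape of the supervised problem (sensor angles in, measured pose out, MSE loss).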
Computer assisted rapid nondestructive method for evaluation of meat freshness
In this study a technique was developed to predict meat freshness decay by employing a nondestructive visible-imaging method and computer-assisted analysis. The technique uses opto-magnetic imaging spectroscopy and machine learning algorithms to detect freshness during storage. The opto-magnetic spectra of meat samples were acquired at 0, 12 and 24 hours of refrigerated storage using a specially developed imaging system and a computer image processing algorithm. The obtained spectra were related to the storage time of the samples, and several machine learning classification algorithms were tested on the resulting dataset. The best freshness prediction results for chicken and beef were achieved using the lazy IB1 classifier, with an accuracy of 97.47% for chicken and 98.23% for beef. Since the method is based on detecting changes in the state of water in tissues, the freshness decay period was estimated from changes in meat hydration properties.
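The "lazy" IB1 classifier mentioned above is instance-based learning with k = 1, i.e. 1-nearest-neighbour: training just stores the labelled spectra, and prediction returns the label of the closest stored example. A minimal sketch (the feature vectors and storage-time labels below are synthetic, not the study's spectra):

```python
def ib1_predict(train, x):
    """IB1 / 1-nearest-neighbour prediction.

    train: list of (feature_vector, label) pairs stored at training time.
    x:     query feature vector.
    Returns the label of the nearest stored example (squared Euclidean).
    """
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(train, key=lambda s: dist(s[0], x))[1]
```

It is called "lazy" because all work is deferred to prediction time; there is no model fitting step, which suits small spectral datasets like the one described.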
Head tracking using an optical soft tactile sensing surface
This research proposes a sensor for tracking the motion of a human head via optical tactile sensing. It uses a fibrescope, a non-metal alternative to a webcam. Previous works have included robotic grippers that mimic the sensory features of human skin using monochrome and depth cameras. Tactile sensing has shown advantages in feedback-based interactions between robots and their environment. The methodology in this paper tracks the motion of objects in physical contact with these sensors, replacing external camera-based motion capture systems. Our immediate application is the detection of human head motion during radiotherapy procedures. The motion was analysed in two degrees of freedom relative to the tactile sensor (translational along the z-axis and rotational around the y-axis) to produce repeatable and accurate results. The movements were generated by a robot arm, which also provided ground-truth values from its end-effector. The fibrescope was used to ensure the device's compatibility with electromagnetic waves. The cameras and the ground-truth values were time-synchronised using Robot Operating System (ROS) tools. Image processing methods were compared between grayscale and binary image sequences, followed by motion tracking estimation using deterministic approaches: Lucas-Kanade optical flow and simple blob detection, implemented in OpenCV. The results showed that grayscale image processing combined with the Lucas-Kanade algorithm produces better tracking, although further work is required to improve the accuracy.
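The Lucas-Kanade method used above estimates small translations from spatial and temporal image gradients; OpenCV's tracker applies this per feature point with pyramids. To show the core idea without OpenCV, a pure-Python single-patch version on a synthetic blob (the blob images and shift are invented for illustration):

```python
import math

def lucas_kanade(I, J):
    """Estimate translation (u, v) from frame I to frame J over one patch.

    Builds the 2x2 normal equations from spatial gradients (Ix, Iy) and
    the temporal gradient (It = J - I), then solves for the shift.
    """
    h, w = len(I), len(I[0])
    Sxx = Sxy = Syy = Sxt = Syt = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (I[y][x + 1] - I[y][x - 1]) / 2.0   # spatial gradient x
            iy = (I[y + 1][x] - I[y - 1][x]) / 2.0   # spatial gradient y
            it = J[y][x] - I[y][x]                   # temporal gradient
            Sxx += ix * ix; Sxy += ix * iy; Syy += iy * iy
            Sxt += ix * it; Syt += iy * it
    det = Sxx * Syy - Sxy * Sxy
    # Solve A [u, v]^T = -[Sxt, Syt]^T for the 2x2 system A.
    u = (-Sxt * Syy + Syt * Sxy) / det
    v = (-Syt * Sxx + Sxt * Sxy) / det
    return u, v

def blob(cx, cy, size=17):
    """Synthetic smooth blob centred at (cx, cy), standing in for a frame."""
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 18.0)
             for x in range(size)] for y in range(size)]
```

Shifting the blob by half a pixel along x and calling `lucas_kanade` recovers a shift close to (0.5, 0); the grayscale-versus-binary comparison in the paper matters precisely because this method relies on smooth intensity gradients, which thresholding destroys.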