6,594 research outputs found
Accuracy and repeatability of wrist joint angles in boxing using an electromagnetic tracking system
© 2019, The Author(s). The hand-wrist region is reported as the most common injury site in boxing. Boxers are at risk because of the wrist motions that occur when impacting training equipment or opponents, yet we know relatively little about these motions. This paper describes a new method for quantifying wrist motion in boxing using an electromagnetic tracking system. A surrogate testing procedure, utilising a polyamide hand and forearm shape, and an in vivo testing procedure, involving 29 elite boxers, were used to assess the accuracy and repeatability of the system. 2D kinematic analysis was used to calculate wrist angles using photogrammetry, whilst the data from the electromagnetic tracking system were processed with Visual3D software. The electromagnetic tracking system agreed with the video-based system (paired t tests) in both the surrogate and in vivo testing (ICCs > 0.9). In the punch testing, for both repeated jab and hook shots, the electromagnetic tracking system showed good reliability (ICCs > 0.8) for flexion-extension angles and substantial reliability (ICCs > 0.6) for radial-ulnar deviation angles. The results indicate that wrist kinematics during punching activities can be measured using an electromagnetic tracking system.
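The repeatability figures quoted above are intraclass correlation coefficients (ICCs). As a hedged illustration of how such values are typically computed, the sketch below implements a two-way ANOVA-based ICC(2,1) in plain Python; the wrist-angle numbers are invented for illustration and are not the study's data.

```python
# Minimal sketch (not the authors' pipeline): two-way random-effects ICC(2,1),
# the agreement statistic typically behind "ICCs > 0.8" reliability claims.

def icc_2_1(data):
    """data: list of rows (subjects) x columns (repeated trials)."""
    n = len(data)          # number of subjects
    k = len(data[0])       # number of repeated trials
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares from a two-way ANOVA decomposition.
    ms_rows = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_cols = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = (ss_total
              - k * sum((m - grand) ** 2 for m in row_means)
              - n * sum((m - grand) ** 2 for m in col_means))
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical flexion-extension angles (deg) for 5 boxers, 3 repeated jabs.
angles = [
    [12.1, 12.4, 11.9],
    [25.3, 24.8, 25.6],
    [18.0, 18.5, 17.7],
    [30.2, 29.9, 30.6],
    [21.4, 21.0, 21.8],
]
print(round(icc_2_1(angles), 3))
```

With distinct between-subject angles and small within-subject variation, the ICC lands near 1, mirroring the "good reliability" interpretation used in the abstract.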
Projected Augmented Reality to Drive Osteotomy Surgery: Implementation and Comparison With Video See-Through Technology
In recent years, the spread of visual augmented reality as an effective tool in image-guided surgery has stimulated the research community to investigate the use of commercial augmented reality headsets in a broad range of potential applications. This has aroused enthusiasm among clinicians about the potential of augmented reality, but has also revealed some technological and human-factor limitations that still hinder its routine adoption in the operating room. In this work, we propose an alternative to head-mounted displays based on projected augmented reality. Projected augmented reality completely preserves the surgeon's natural view of the operating field, because it requires no perspective conversion or optical mediation. We selected a cranio-maxillofacial surgery application as a benchmark to test the proposed system and compared its accuracy with that obtained with a video see-through system. Augmented reality overlay accuracy was evaluated by measuring the distance between a virtual osteotomy line and its real counterpart. The experimental tests showed that the accuracy of the two augmented reality modes is similar, with a median error discrepancy of about 0.3 mm for the projected augmented reality mode. The results suggest that projected augmented reality can be a valuable alternative to standard see-through head-mounted displays for supporting in-situ visualisation of medical imaging data as surgical guidance.
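The overlay-accuracy metric described above (the distance between a virtual osteotomy line and its real counterpart) reduces to point-to-line distances, with the median reported as a summary. A minimal stdlib sketch, assuming the virtual line is known in parametric form and the real cut is digitised as sample points; the line and points below are invented for illustration.

```python
# Hedged sketch of the overlay-accuracy metric: distances from digitised
# points on the real osteotomy to the projected virtual line. Toy data only.
import math

def point_line_distance(p, a, d):
    """Distance from 3D point p to the line through a with unit direction d."""
    v = [p[i] - a[i] for i in range(3)]
    vd = sum(v[i] * d[i] for i in range(3))
    # Component of v orthogonal to the line direction.
    orth = [v[i] - vd * d[i] for i in range(3)]
    return math.sqrt(sum(c * c for c in orth))

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

# Virtual line: through the origin along the x-axis (unit direction).
a, d = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
# Points digitised along the real osteotomy cut (mm), slightly off the line.
real_pts = [(5.0, 0.2, 0.1), (10.0, -0.3, 0.0),
            (15.0, 0.1, -0.25), (20.0, 0.3, 0.0)]
errors = [point_line_distance(p, a, d) for p in real_pts]
print(f"median overlay error: {median(errors):.2f} mm")
```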
Image-guided Breast Biopsy of MRI-visible Lesions with a Hand-mounted Motorised Needle Steering Tool
A biopsy is the only diagnostic procedure for accurate histological confirmation of breast cancer. When sonographic placement is not feasible, a Magnetic Resonance Imaging (MRI)-guided biopsy is often preferred. The lack of real-time imaging information and the deformations of the breast make it challenging to bring the needle precisely towards the tumour detected in pre-interventional Magnetic Resonance (MR) images. The current manual MRI-guided biopsy workflow is inaccurate and would benefit from a technique that allows real-time tracking and localisation of the tumour lesion during needle insertion. This paper proposes a robotic setup and software architecture to assist the radiologist in targeting MR-detected suspicious tumours. The approach benefits from image fusion of preoperative images with intraoperative optical tracking of markers attached to the patient's skin. A hand-mounted biopsy device has been constructed with an actuated needle base to drive the tip in the desired direction. The steering commands may be provided both by user input and by computer guidance. The workflow is validated through phantom experiments. On average, the suspicious breast lesion is targeted within a radius of 2.3 mm. The results suggest that robotic systems taking breast deformations into account have the potential to tackle this clinical challenge.
Comment: Submitted to the 2021 International Symposium on Medical Robotics (ISMR).
Motion analysis report
Human motion analysis is the task of converting actual human movements into computer-readable data. Such movement information may be obtained through active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing decouples the position-measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-dimensional tracking systems, and image processing systems based on multiple views and photogrammetric calculations.
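The "photogrammetric calculations" behind multi-view image processing systems reduce, at their core, to triangulating a marker from two or more calibrated views. A minimal sketch using the midpoint of the common perpendicular between two viewing rays; the camera positions and ray directions are toy values, not from any real calibration.

```python
# Hedged sketch of two-view triangulation: recover a marker's 3D position as
# the midpoint of the closest points on the two viewing rays.

def triangulate_midpoint(c1, d1, c2, d2):
    """3D midpoint of the closest points on rays c1 + s*d1 and c2 + t*d2."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    w = [c1[i] - c2[i] for i in range(3)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    det = b * b - a * c
    if abs(det) < 1e-12:
        raise ValueError("rays are parallel")
    # Closed-form least-squares ray parameters s, t.
    s = (c * dot(d1, w) - b * dot(d2, w)) / det
    t = (b * dot(d1, w) - a * dot(d2, w)) / det
    p1 = [c1[i] + s * d1[i] for i in range(3)]
    p2 = [c2[i] + t * d2[i] for i in range(3)]
    return [(p1[i] + p2[i]) / 2 for i in range(3)]

# Two cameras 2 m apart, both sighting a marker at (0, 0, 5).
marker = triangulate_midpoint((-1, 0, 0), (1, 0, 5), (1, 0, 0), (-1, 0, 5))
print(marker)  # -> [0.0, 0.0, 5.0]
```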
Robust and Accurate Hand Motion Tracking for Human-Machine Interaction
Thesis (Ph.D.) -- Graduate School of Seoul National University: Department of Mechanical and Aerospace Engineering, College of Engineering, August 2021. Advisor: Dongjun Lee.
A hand-based interface is promising for realising intuitive, natural and accurate human-machine interaction (HMI), as the human hand is the main source of dexterity in our daily activities.
To this end, the thesis begins with a human-perception study on the detection threshold of visuo-proprioceptive conflict (i.e., allowable tracking error) with and without cutaneous haptic feedback, and suggests a tracking-error specification for realistic and fluid hand-based HMI. The thesis then proposes a novel wearable hand tracking module which, to be compatible with cutaneous haptic devices that emit magnetic noise, opportunistically employs heterogeneous sensors (an IMU/compass module and a soft sensor) reflecting the anatomical properties of the human hand, making it suitable for a specific application (i.e., finger-based interaction with fingertip haptic devices).
This hand tracking module, however, loses tracking accuracy when interacting with, or even near, electrical machines or ferromagnetic materials. To address this, the thesis presents its main contribution: a novel visual-inertial skeleton tracking (VIST) framework that provides accurate and robust hand (and finger) motion tracking even in many challenging real-world scenarios and environments for which state-of-the-art technologies are known to fail due to their respective fundamental limitations (e.g., severe occlusions for tracking purely with vision sensors; electromagnetic interference for tracking purely with IMUs (inertial measurement units) and compasses; and mechanical contacts for tracking purely with soft sensors).
The proposed VIST framework comprises a sensor glove with multiple IMUs and passive visual markers, a head-mounted stereo camera, and a tightly-coupled filtering-based visual-inertial fusion algorithm that estimates the hand/finger motion and auto-calibrates hand/glove-related kinematic parameters simultaneously while taking the anatomical constraints of the hand into account.
The VIST framework exhibits good tracking accuracy and robustness, affordable material cost, light hardware and software weight, and ruggedness/durability that even permits washing.
Quantitative and qualitative experiments are also performed to validate the advantages and properties of our VIST framework, clearly demonstrating its potential for real-world applications.
1 Introduction
1.1. Motivation
1.2. Related Work
1.3. Contribution
2 Detection Threshold of Hand Tracking Error
2.1. Motivation
2.2. Experimental Environment
2.2.1. Hardware Setup
2.2.2. Virtual Environment Rendering
2.2.3. HMD Calibration
2.3. Identifying the Detection Threshold of Tracking Error
2.3.1. Experimental Setup
2.3.2. Procedure
2.3.3. Experimental Result
2.4. Enlarging the Detection Threshold of Tracking Error by Haptic Feedback
2.4.1. Experimental Setup
2.4.2. Procedure
2.4.3. Experimental Result
2.5. Discussion
3 Wearable Finger Tracking Module for Haptic Interaction
3.1. Motivation
3.2. Development of Finger Tracking Module
3.2.1. Hardware Setup
3.2.2. Tracking Algorithm
3.2.3. Calibration Method
3.3. Evaluation for VR Haptic Interaction Task
3.3.1. Quantitative Evaluation of FTM
3.3.2. Implementation of Wearable Cutaneous Haptic Interface
3.3.3. Usability Evaluation for VR Peg-in-hole Task
3.4. Discussion
4 Visual-Inertial Skeleton Tracking for Human Hand
4.1. Motivation
4.2. Hardware Setup and Hand Models
4.2.1. Human Hand Model
4.2.2. Wearable Sensor Glove
4.2.3. Stereo Camera
4.3. Visual Information Extraction
4.3.1. Marker Detection in Raw Images
4.3.2. Cost Function for Point Matching
4.3.3. Left-Right Stereo Matching
4.4. IMU-Aided Correspondence Search
4.5. Filtering-based Visual-Inertial Sensor Fusion
4.5.1. EKF States for Hand Tracking and Auto-Calibration
4.5.2. Prediction with IMU Information
4.5.3. Correction with Visual Information
4.5.4. Correction with Anatomical Constraints
4.6. Quantitative Evaluation for Free Hand Motion
4.6.1. Experimental Setup
4.6.2. Procedure
4.6.3. Experimental Result
4.7. Quantitative and Comparative Evaluation for Challenging Hand Motion
4.7.1. Experimental Setup
4.7.2. Procedure
4.7.3. Experimental Result
4.7.4. Performance Comparison with Existing Methods for Challenging Hand Motion
4.8. Qualitative Evaluation for Real-World Scenarios
4.8.1. Visually Complex Background
4.8.2. Object Interaction
4.8.3. Wearing Fingertip Cutaneous Haptic Devices
4.8.4. Outdoor Environment
4.9. Discussion
5 Conclusion
References
Abstract (in Korean)
Acknowledgment
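Section 4.5 of the outline above corresponds to the tightly-coupled fusion described in the abstract, where IMU information drives the prediction step and visual marker information drives the correction step. As a deliberately reduced, hypothetical illustration of that predict/correct pattern (a scalar Kalman filter on a single joint angle, far simpler than the actual multi-IMU VIST estimator; all noise values and measurements are invented):

```python
# Minimal 1D analogue of an IMU-predict / vision-correct loop. Not the VIST
# implementation; just the filtering skeleton the abstract describes.

class Ekf1D:
    def __init__(self, angle=0.0, var=1.0):
        self.x = angle   # joint-angle estimate (rad)
        self.p = var     # estimate variance

    def predict(self, gyro_rate, dt, q=1e-3):
        # Propagate with the inertial rate; process noise q grows uncertainty.
        self.x += gyro_rate * dt
        self.p += q

    def correct(self, marker_angle, r=1e-2):
        # Standard Kalman update against the vision-derived angle.
        k = self.p / (self.p + r)          # Kalman gain
        self.x += k * (marker_angle - self.x)
        self.p *= (1.0 - k)

ekf = Ekf1D()
# (gyro rate rad/s, marker angle rad or None when the marker is occluded)
for rate, seen in [(0.5, 0.051), (0.5, 0.102), (0.5, None)]:
    ekf.predict(rate, dt=0.1)
    if seen is not None:                   # vision available: correct tightly
        ekf.correct(seen)
print(round(ekf.x, 3))
```

When the marker drops out (occlusion), the filter coasts on the IMU alone, which is precisely the robustness argument made for combining the two modalities.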
Optimised Calibration, Registration and Tracking for Image Enhanced Surgical Navigation in ENT Operations
EThOS - Electronic Theses Online Service, United Kingdom
Towards markerless orthopaedic navigation with intuitive optical see-through head-mounted displays
The potential of image-guided orthopaedic navigation to improve surgical outcomes has been well recognised over the last two decades. Based on the tracked pose of the target bone, the anatomical information and preoperative plans are updated and displayed to surgeons, so that they can follow the guidance to reach the goal with higher accuracy, efficiency and reproducibility. Despite their success, current orthopaedic navigation systems have two main limitations: for target tracking, artificial markers have to be drilled into the bone and manually calibrated to it, which introduces the risk of additional harm to patients and increases operating complexity; for guidance visualisation, surgeons have to shift their attention from the patient to an external 2D monitor, which is disruptive and can be mentally stressful.
Motivated by these limitations, this thesis explores the development of an intuitive, compact and reliable navigation system for orthopaedic surgery. To this end, conventional marker-based tracking is replaced by a novel markerless tracking algorithm, and the 2D display is replaced by a 3D holographic optical see-through (OST) head-mounted display (HMD) precisely calibrated to the user's perspective.
Our markerless tracking, facilitated by a commercial RGBD camera, is achieved through deep learning-based bone segmentation followed by real-time pose registration. For robust segmentation, a new network is designed and efficiently augmented with a synthetic dataset. Our segmentation network outperforms the state of the art in occlusion robustness, device-agnostic behaviour, and target generalisability. For reliable pose registration, a novel Bounded Iterative Closest Point (BICP) workflow is proposed. The improved markerless tracking achieves a clinically acceptable error of 0.95 deg and 2.17 mm in a phantom test.
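The BICP workflow itself is the thesis' contribution and is not reproduced here; as a hedged sketch of the classical ICP loop it extends, the block below runs ICP in 2D with brute-force nearest-neighbour matching and a closed-form rigid fit. The "bone contour" point sets are toy data.

```python
# Plain ICP sketch (not the authors' Bounded ICP): alternate nearest-neighbour
# correspondence search with a closed-form least-squares rigid alignment.
import math

def best_rigid_2d(src, dst):
    """Closed-form least-squares rotation + translation mapping paired src -> dst."""
    n = len(src)
    cxs = sum(p[0] for p in src) / n
    cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n
    cyd = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - cxs, sy - cys
        bx, by = dx - cxd, dy - cyd
        num += ax * by - ay * bx   # cross terms -> sine of rotation
        den += ax * bx + ay * by   # dot terms   -> cosine of rotation
    th = math.atan2(num, den)
    c, s = math.cos(th), math.sin(th)
    return th, (cxd - (c * cxs - s * cys), cyd - (s * cxs + c * cys))

def icp(src, dst, iters=10):
    cur = list(src)
    for _ in range(iters):
        # Brute-force nearest-neighbour correspondences.
        pairs = [min(dst, key=lambda q: math.dist(p, q)) for p in cur]
        th, (tx, ty) = best_rigid_2d(cur, pairs)
        c, s = math.cos(th), math.sin(th)
        cur = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in cur]
    return cur

# Toy "bone model" contour and a scan of it rotated 5 deg and shifted.
model = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]
r = math.radians(5)
c0, s0 = math.cos(r), math.sin(r)
scan = [(c0 * x - s0 * y + 0.1, s0 * x + c0 * y - 0.05) for x, y in model]
aligned = icp(scan, model)
err = max(math.dist(p, min(model, key=lambda q: math.dist(p, q))) for p in aligned)
print(f"max residual after ICP: {err:.6f}")
```

The "bounded" variant in the thesis presumably constrains this search; the loop above only shows the baseline it improves on.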
OST displays allow ubiquitous enrichment of the perceived real world with contextually blended virtual aids through semi-transparent glasses. They have been recognised as a suitable visual tool for surgical assistance, since they do not hinder the surgeon's natural eyesight and require no attention shift or perspective conversion. OST calibration is crucial to ensure locationally coherent surgical guidance.
Current calibration methods are either prone to human error or hardly applicable to commercial devices. To this end, we propose an offline camera-based calibration method that is highly accurate yet easy to implement in commercial products, and an online alignment-based refinement that is user-centric and robust against user error. The proposed methods prove superior to similar state-of-the-art (SOTA) methods in calibration convenience and display accuracy.
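The online alignment-based refinement is described only at a high level above; one plausible, hypothetical realisation is to collect point pairs (where each calibration point was drawn on the display vs. where the user had to move it to cover the real target) and fit a closed-form 2D similarity correction. Everything below, including the point data, is an illustrative assumption rather than the thesis' algorithm.

```python
# Hedged sketch: fit a least-squares 2D similarity (scale, rotation, shift)
# from user-alignment pairs, to be applied as a display correction.
import math

def fit_similarity_2d(shown, aligned):
    """Least-squares scale/rotation/translation mapping shown -> aligned."""
    n = len(shown)
    cax = sum(p[0] for p in shown) / n
    cay = sum(p[1] for p in shown) / n
    cbx = sum(p[0] for p in aligned) / n
    cby = sum(p[1] for p in aligned) / n
    num = den = norm = 0.0
    for (x, y), (u, v) in zip(shown, aligned):
        ax, ay = x - cax, y - cay
        bx, by = u - cbx, v - cby
        num += ax * by - ay * bx
        den += ax * bx + ay * by
        norm += ax * ax + ay * ay
    th = math.atan2(num, den)
    c, s = math.cos(th), math.sin(th)
    scale = (c * den + s * num) / norm
    tx = cbx - scale * (c * cax - s * cay)
    ty = cby - scale * (s * cax + c * cay)
    return scale, th, (tx, ty)

# Simulated session: the "true" display error is scale 1.02, rotation 1 deg,
# shift (3, -2) px; the user's alignments reveal it.
true_s, true_th, true_t = 1.02, math.radians(1.0), (3.0, -2.0)
c0, s0 = math.cos(true_th), math.sin(true_th)
shown = [(100.0, 100.0), (500.0, 120.0), (480.0, 400.0), (120.0, 380.0)]
aligned = [(true_s * (c0 * x - s0 * y) + true_t[0],
            true_s * (s0 * x + c0 * y) + true_t[1]) for x, y in shown]
scale, th, t = fit_similarity_2d(shown, aligned)
print(round(scale, 4), round(math.degrees(th), 2))
```

With noise-free pairs the fit recovers the simulated error exactly; in practice the residual after correction would be the "display accuracy" being compared.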
Motivated by the ambition to develop the world's first markerless OST navigation system, we integrated the developed markerless tracking and calibration scheme into a complete navigation workflow designed for femur drilling tasks during knee replacement surgery. We verified the usability of our designed OST system with an experienced orthopaedic surgeon in a cadaver study. Our test validates the potential of the proposed markerless navigation system for surgical assistance, although further improvement is required for clinical acceptance.