6,594 research outputs found

    Accuracy and repeatability of wrist joint angles in boxing using an electromagnetic tracking system

    Β© 2019, The Author(s). The hand-wrist region is reported as the most common injury site in boxing. Boxers are at risk because of the wrist motions involved in impacting training equipment or opponents, yet relatively little is known about these motions. This paper describes a new method for quantifying wrist motion in boxing using an electromagnetic tracking system. A surrogate testing procedure, using a polyamide hand and forearm shape, and an in vivo testing procedure, using 29 elite boxers, were employed to assess the accuracy and repeatability of the system. 2D kinematic analysis with photogrammetry was used to calculate wrist angles, whilst the data from the electromagnetic tracking system were processed with Visual3D software. The electromagnetic tracking system agreed with the video-based system (paired t tests) in both the surrogate and in vivo testing (ICCs > 0.9). In the punch testing, for both repeated jab and hook shots, the electromagnetic tracking system showed good reliability (ICCs > 0.8) for flexion-extension angles and substantial reliability (ICCs > 0.6) for radial-ulnar deviation angles. The results indicate that wrist kinematics during punching activities can be measured using an electromagnetic tracking system.
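
    The core computation in such a system, extracting anatomical joint angles from the relative orientation of a forearm sensor and a hand sensor, can be sketched as follows. The axis conventions, function names and decomposition order are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def rot_x(deg):
    """Rotation about the x-axis (taken here as the flexion-extension axis)."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def wrist_angles(R_forearm, R_hand):
    """Decompose the hand orientation relative to the forearm into
    flexion-extension and radial-ulnar deviation angles (degrees).
    The axis assignment is an assumption for this sketch."""
    R_rel = R_forearm.T @ R_hand          # hand pose expressed in the forearm frame
    flexion = np.degrees(np.arctan2(R_rel[2, 1], R_rel[2, 2]))
    deviation = np.degrees(np.arcsin(-R_rel[2, 0]))
    return flexion, deviation

# A hand flexed 30 deg relative to a neutral forearm:
flex, dev = wrist_angles(np.eye(3), rot_x(30.0))
```

    The same decomposition applies whether the orientations come from an electromagnetic tracker or a video-based system, which is what makes the two modalities directly comparable.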

    Projected Augmented Reality to Drive Osteotomy Surgery: Implementation and Comparison With Video See-Through Technology

    In recent years, the spread of visual augmented reality as an effective tool in image-guided surgery has stimulated the research community to investigate the use of commercial augmented reality headsets for a broad range of potential applications. This has aroused enthusiasm among clinicians about the potential of augmented reality, but has also revealed technological and human-factor limitations that still hinder its routine adoption in the operating room. In this work, we propose an alternative to head-mounted displays based on projected augmented reality. Projected augmented reality completely preserves the surgeon's natural view of the operating field, because it requires no perspective conversion or optical mediation. We selected a cranio-maxillofacial surgery application as a benchmark to test the proposed system and compared its accuracy with that obtained with a video see-through system. The augmented reality overlay accuracy was evaluated by measuring the distance between a virtual osteotomy line and its real counterpart. The experimental tests showed that the accuracy of the two augmented reality modes is similar, with a median error of about 0.3 mm for the projected augmented reality mode. The results suggest that projected augmented reality can be a valuable alternative to standard see-through head-mounted displays for supporting in-situ visualization of medical imaging data as surgical guidance.
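
    The reported overlay-accuracy metric, the distance between a virtual osteotomy line and its real counterpart, can be illustrated with a simple point-to-line computation. Sampling points along the real line and summarising with the median are assumptions for this sketch; the paper's exact measurement protocol may differ:

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Distance from point p to the infinite line through a and b."""
    d = b - a
    return np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d)

def overlay_error(real_points, virt_a, virt_b):
    """Median distance (same units as the input, e.g. mm) from points sampled
    on the real osteotomy line to the displayed virtual line."""
    return float(np.median([point_to_line_distance(p, virt_a, virt_b)
                            for p in real_points]))

# Hypothetical data: a real line traced 0.3 mm off the virtual one.
real = np.array([[0.0, 0.3, 0.0], [5.0, 0.3, 0.0], [10.0, 0.3, 0.0]])
err = overlay_error(real, np.array([0.0, 0.0, 0.0]), np.array([10.0, 0.0, 0.0]))
```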

    Image-guided Breast Biopsy of MRI-visible Lesions with a Hand-mounted Motorised Needle Steering Tool

    A biopsy is the only diagnostic procedure for accurate histological confirmation of breast cancer. When sonographic placement is not feasible, a Magnetic Resonance Imaging (MRI)-guided biopsy is often preferred. The lack of real-time imaging information and the deformation of the breast make it challenging to bring the needle precisely towards the tumour detected in pre-interventional Magnetic Resonance (MR) images. The current manual MRI-guided biopsy workflow is inaccurate and would benefit from a technique that allows real-time tracking and localisation of the tumour lesion during needle insertion. This paper proposes a robotic setup and software architecture to assist the radiologist in targeting MR-detected suspicious tumours. The approach benefits from fusion of preoperative images with intraoperative optical tracking of markers attached to the patient's skin. A hand-mounted biopsy device has been constructed with an actuated needle base to drive the tip in the desired direction. The steering commands may be provided either by user input or by computer guidance. The workflow is validated through phantom experiments. On average, the suspicious breast lesion is targeted within a radius of 2.3 mm. The results suggest that robotic systems that take breast deformation into account have the potential to tackle this clinical challenge.

    Comment: Submitted to the 2021 International Symposium on Medical Robotics (ISMR).
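
    The described image fusion, aligning preoperative MR images with intraoperatively tracked skin markers, typically reduces at its core to a least-squares rigid registration. Below is a minimal sketch using the Kabsch algorithm with hypothetical marker coordinates; the paper's pipeline additionally handles breast deformation, which a rigid transform alone cannot capture:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch): returns R, t with dst ~ R @ src + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical marker positions in the MR image frame and the tracker frame:
mr = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
theta = np.radians(25)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
tracker = mr @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = rigid_register(mr, tracker)
tumour_mr = np.array([3.0, 4.0, 2.0])
tumour_tracker = R @ tumour_mr + t   # lesion position mapped into tracker coordinates
```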

    Motion analysis report

    Human motion analysis is the task of converting actual human movements into computer-readable data. Such movement information may be obtained through active or passive sensing methods. Active methods include physical measuring devices such as goniometers on joints of the body, force plates, and manually operated sensors such as a Cybex dynamometer. Passive sensing de-couples the position-measuring device from actual human contact. Passive sensors include Selspot scanning systems (since there is no mechanical connection between the subject's attached LEDs and the infrared sensing cameras), sonic (spark-based) three-dimensional digitizers, Polhemus six-degree-of-freedom tracking systems, and image processing systems based on multiple views and photogrammetric calculations.

    인간 기계 μƒν˜Έμž‘μš©μ„ μœ„ν•œ κ°•κ±΄ν•˜κ³  μ •ν™•ν•œ μ†λ™μž‘ 좔적 기술 연ꡬ

    Doctoral dissertation -- Graduate School of Seoul National University: College of Engineering, Department of Mechanical and Aerospace Engineering, August 2021. Advisor: Dongjun Lee.

    The hand-based interface is promising for realizing intuitive, natural and accurate human-machine interaction (HMI), as the human hand is the main source of dexterity in our daily activities. The thesis therefore begins with a human perception study on the detection threshold of visuo-proprioceptive conflict (i.e., allowable tracking error) with and without cutaneous haptic feedback, and suggests a tracking-error specification for realistic and fluid hand-based HMI. The thesis then proposes a novel wearable hand tracking module which, to be compatible with cutaneous haptic devices that emit magnetic noise, opportunistically employs heterogeneous sensors (an IMU/compass module and a soft sensor) reflecting the anatomical properties of the human hand, making it suitable for a specific application (finger-based interaction with fingertip haptic devices). This hand tracking module, however, loses its tracking when interacting with, or operating near, electrical machines or ferromagnetic materials. The thesis then presents its main contribution, a novel visual-inertial skeleton tracking (VIST) framework that provides accurate and robust hand (and finger) motion tracking even in many challenging real-world scenarios and environments for which state-of-the-art technologies are known to fail due to their respective fundamental limitations (e.g., severe occlusion for purely vision-based tracking; electromagnetic interference for tracking purely with IMUs (inertial measurement units) and compasses; and mechanical contact for tracking purely with soft sensors).

    The proposed VIST framework comprises a sensor glove with multiple IMUs and passive visual markers, a head-mounted stereo camera, and a tightly-coupled filtering-based visual-inertial fusion algorithm that estimates hand/finger motion while simultaneously auto-calibrating hand/glove-related kinematic parameters and taking the hand's anatomical constraints into account. The VIST framework exhibits good tracking accuracy and robustness, affordable material cost, light hardware and software weight, and ruggedness/durability that even permits washing. Quantitative and qualitative experiments validate these advantages and properties, clearly demonstrating its potential for real-world applications.

    Korean abstract (translated): Hand-based interfaces have attracted much attention in human-machine interaction for the intuitiveness, immersion and precision they can provide, and one of the most essential enabling technologies is robust and accurate tracking of hand motion. To this end, this dissertation first characterises, from the perspective of human perception, the detection threshold of hand tracking error. Since this threshold can serve as an important design criterion for new hand tracking technologies, it is quantified through participant experiments, including how it changes when a fingertip cutaneous haptic device is worn. Building on this, and because haptic feedback has been widely studied across human-machine interaction, a hand tracking module that can be used together with fingertip haptic devices is developed first. Fingertip haptic devices produce magnetic disturbances that corrupt the magnetometers commonly used in wearable tracking; this is resolved through the anatomical characteristics of the human hand and an appropriate combination of inertial, magnetometer and soft sensors. Extending this, the dissertation proposes a new hand tracking technology usable not only with haptic devices but with any worn device, environment, or object interaction. Existing hand tracking technologies are confined to restricted environments by occlusion (vision-based methods), magnetic disturbance (inertial/magnetometer-based methods), or contact with objects (soft-sensor-based methods). To overcome this, inertial and visual sensors with complementary characteristics are fused without the problematic magnetometers, and a number of indistinguishable markers are used to track the many-degree-of-freedom motion of the hand within a small space. For the marker correspondence search, a tightly-coupled sensor fusion method is proposed in place of the conventional loosely-coupled approach; it enables accurate magnetometer-free hand tracking and automatically calibrates the sensor-attachment errors and user hand geometry that have limited the accuracy and convenience of wearable sensors. The outstanding performance and robustness of the proposed visual-inertial skeleton tracking (VIST) are verified through various quantitative and qualitative experiments; by enabling, in diverse everyday environments, hand motion tracking that existing systems could not achieve, VIST demonstrates its promise for many fields of human-machine interaction.
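
    The predict/correct structure of such a tightly-coupled filter can be illustrated with a deliberately simplified one-degree-of-freedom example: the IMU rate drives high-frequency prediction, and lower-rate marker observations correct the drift that a biased gyro would otherwise accumulate. Everything below is a toy sketch, not the thesis's estimator:

```python
class TinyVIFilter:
    """Minimal 1-DoF illustration of visual-inertial fusion: gyro rates drive
    the prediction step, marker observations correct the drift. The real VIST
    estimator is a full EKF over hand pose, finger joints and glove-calibration
    parameters with anatomical constraints."""
    def __init__(self, angle=0.0, var=1.0):
        self.x, self.P = angle, var

    def predict(self, gyro_rate, dt, q=1e-3):
        self.x += gyro_rate * dt           # integrate the IMU rate
        self.P += q                        # process noise inflates uncertainty

    def correct(self, marker_angle, r=1e-2):
        k = self.P / (self.P + r)          # Kalman gain
        self.x += k * (marker_angle - self.x)
        self.P *= (1.0 - k)

f = TinyVIFilter()
true_rate, dt = 0.5, 0.01                  # rad/s, 100 Hz IMU
for step in range(1, 101):
    f.predict(true_rate + 0.05, dt)        # a biased gyro drifts on its own
    if step % 10 == 0:                     # 10 Hz marker updates rein it in
        f.correct(true_rate * step * dt)
```

    Pure integration of the biased gyro would end 10% off after one second; the periodic visual corrections keep the estimate close to the true angle, which is the essence of complementing IMUs with markers instead of magnetometers.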

    Optimised Calibration, Registration and Tracking for Image Enhanced Surgical Navigation in ENT Operations

    EThOS - Electronic Theses Online Service, United Kingdom

    Towards markerless orthopaedic navigation with intuitive Optical See-through Head-mounted displays

    The potential of image-guided orthopaedic navigation to improve surgical outcomes has been well recognised over the last two decades. Based on the tracked pose of the target bone, anatomical information and preoperative plans are updated and displayed to surgeons so that they can follow the guidance with higher accuracy, efficiency and reproducibility. Despite this success, current orthopaedic navigation systems have two main limitations. For target tracking, artificial markers have to be drilled into the bone and manually calibrated to it, which risks additional harm to patients and increases operating complexity. For guidance visualisation, surgeons have to shift their attention from the patient to an external 2D monitor, which is disruptive and can be mentally stressful. Motivated by these limitations, this thesis explores the development of an intuitive, compact and reliable navigation system for orthopaedic surgery. To this end, conventional marker-based tracking is replaced by a novel markerless tracking algorithm, and the 2D display is replaced by a 3D holographic optical see-through (OST) head-mounted display (HMD) precisely calibrated to the user's perspective. Our markerless tracking, facilitated by a commercial RGBD camera, is achieved through deep learning-based bone segmentation followed by real-time pose registration. For robust segmentation, a new network is designed and efficiently augmented with a synthetic dataset. Our segmentation network outperforms the state of the art in occlusion robustness, device-agnostic behaviour, and target generalisability. For reliable pose registration, a novel Bounded Iterative Closest Point (BICP) workflow is proposed. The improved markerless tracking achieves a clinically acceptable error of 0.95 deg and 2.17 mm in a phantom test.

    OST displays allow ubiquitous enrichment of the perceived real world with contextually blended virtual aids through semi-transparent glasses. They have been recognised as a suitable visual tool for surgical assistance, since they do not hinder the surgeon's natural eyesight and require no attention shift or perspective conversion. OST calibration is crucial to ensure locationally coherent surgical guidance. Current calibration methods are either prone to human error or hardly applicable to commercial devices. To this end, we propose an offline camera-based calibration method that is highly accurate yet easy to implement in commercial products, and an online alignment-based refinement that is user-centric and robust against user error. The proposed methods are shown to be superior to similar state-of-the-art (SOTA) methods in calibration convenience and display accuracy. Motivated by the ambition to develop the world's first markerless OST navigation system, we integrated the developed markerless tracking and calibration scheme into a complete navigation workflow designed for femur drilling tasks during knee replacement surgery. We verified the usability of the designed OST system with an experienced orthopaedic surgeon in a cadaver study. Our test validates the potential of the proposed markerless navigation system for surgical assistance, although further improvement is required for clinical acceptance.
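
    The idea behind a bounded ICP variant, rejecting correspondences beyond a distance bound so that gross outliers never contribute to the pose update, can be sketched in a translation-only toy example. The bound value, the translation-only restriction and all data below are illustrative assumptions; the thesis's BICP formulation is more general:

```python
import numpy as np

def icp_translation(src, dst, bound=5.0, iters=20):
    """Toy ICP restricted to translation: each iteration matches every source
    point to its nearest destination point, discards matches farther than
    `bound` (the 'bounded' idea), and updates the translation from the
    surviving pairs."""
    t = np.zeros(src.shape[1])
    for _ in range(iters):
        moved = src + t
        d = np.linalg.norm(moved[:, None, :] - dst[None, :, :], axis=2)
        nearest = d.argmin(axis=1)                       # closest dst per src point
        keep = d[np.arange(len(src)), nearest] <= bound  # reject far correspondences
        if not keep.any():
            break
        t += (dst[nearest[keep]] - moved[keep]).mean(axis=0)
    return t

dst = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
src = dst - np.array([0.3, 0.2])          # same shape, shifted
src = np.vstack([src, [[50.0, 50.0]]])    # one gross outlier, e.g. occluding clutter
t = icp_translation(src, dst)
```

    Without the bound, the outlier would drag the mean update far off; with it, the recovered translation matches the true shift, which mirrors why bounding correspondences matters when registering a segmented bone surface against cluttered depth data.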