
    Design and Evaluation of a Contact-Free Interface for Minimally Invasive Robotics Assisted Surgery

    Robotic-assisted minimally invasive surgery (RAMIS) is becoming increasingly common for many surgical procedures. These minimally invasive techniques offer the benefits of reduced patient recovery time, mortality, and scarring compared to traditional open surgery. Teleoperated procedures have the added advantages of increased visualization and enhanced accuracy for the surgeon through tremor filtering and scaling down of hand motions. However, limitations in these techniques still prevent the widespread growth of the technology. In RAMIS, the surgeon's movement is constrained by the operating console or master device, and the cost of robotic surgery is often too high to justify for many procedures. Sterility issues also arise, as the surgeon must be in contact with the master device, preventing a smooth transition between traditional and robotic modes of surgery. This thesis outlines the design and analysis of a novel method of interaction with the da Vinci Surgical Robot. Using the da Vinci Research Kit (DVRK), an open-source research platform for the da Vinci robot, an interface was developed for controlling the robotic arms with the Leap Motion Controller. This small device uses infrared LEDs and two cameras to detect the 3D positions of the hand and fingers. The hand data are mapped to the da Vinci surgical tools in real time, providing the surgeon with an intuitive method of controlling the instruments. An analysis of the tracking workspace is provided to address occlusion issues, and multiple sensors are fused to increase the range of trackable motion beyond that of a single sensor. Additional work involves replacing the current viewing screen with a virtual reality (VR) headset (Oculus Rift), to provide the surgeon with a stereoscopic 3D view of the surgical site without the need for a large monitor.
The headset also provides the user with a more intuitive and natural method of positioning the camera during surgery, using natural head motions. The large master console of the da Vinci system has been replaced with an inexpensive vision-based tracking system and a VR headset, allowing the surgeon to operate the da Vinci Surgical Robot with more natural movements. A preliminary evaluation of the system is provided, with recommendations for future work.
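The motion scaling and tremor filtering credited to teleoperation above can be illustrated with a short sketch. This is a hedged illustration only: the function name, the exponential-moving-average filter, and the parameter values are assumptions made for the example, not the thesis's actual Leap-Motion-to-DVRK pipeline.

```python
def map_hand_to_tool(hand_positions, scale=0.2, alpha=0.3):
    """Map a stream of hand positions to tool-tip positions.

    Illustrative only: applies an exponential moving average (a simple
    low-pass filter standing in for tremor filtering) and then scales
    the motion down so large hand movements become small tool movements.
    Positions are (x, y, z) tuples in consistent units.
    """
    tool_positions = []
    filtered = None
    for p in hand_positions:
        if filtered is None:
            filtered = p  # initialize the filter with the first sample
        else:
            # EMA low-pass filter: blend the new sample with the history
            filtered = tuple(alpha * n + (1 - alpha) * f
                             for n, f in zip(p, filtered))
        # Motion scaling: 1 unit of hand motion -> `scale` units of tool motion
        tool_positions.append(tuple(scale * c for c in filtered))
    return tool_positions
```

An EMA is the simplest possible low-pass filter; a real teleoperation stack would use a properly tuned filter and also handle orientation, clutching, and workspace limits.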

    Human to robot hand motion mapping methods: review and classification

    In this article, the variety of approaches proposed in the literature to address the problem of mapping human to robot hand motions is summarized and discussed. We particularly attempt to organize into macro-categories the great quantity of presented methods, which are often difficult to view from a general standpoint owing to their different fields of application, specific algorithms, terminology, and declared mapping goals. First, a brief historical overview is given, showing how the human-to-robot hand mapping problem emerged as both a conceptual and an analytical challenge that remains open today. Thereafter, the survey focuses on a classification of modern mapping methods into six categories: direct joint, direct Cartesian, task-oriented, dimensionality-reduction-based, pose-recognition-based, and hybrid mappings. For each category, the general view that links the related studies is provided, and representative references are highlighted. Finally, a concluding discussion is given, along with the authors' point of view on desirable future trends. This work was supported in part by the European Commission's Horizon 2020 Framework Programme through the REMODEL project under Grant 870133, and in part by the Spanish Government under Grant PID2020-114819GB-I00. Peer Reviewed. Postprint (published version).
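Of the six categories, direct joint mapping is the simplest to illustrate: each human joint angle is copied to the corresponding robot joint, subject to the robot's limits. The sketch below is illustrative only (the function name and clamping policy are assumptions, not a method from any surveyed paper), and it presumes a one-to-one joint correspondence, which real human and robot hands rarely have.

```python
def direct_joint_mapping(human_angles, joint_limits):
    """Direct joint mapping sketch: copy each human joint angle (radians)
    to the corresponding robot joint, clamped to that joint's (lo, hi)
    limits. Assumes the two kinematic chains share a joint ordering.
    """
    robot_angles = []
    for angle, (lo, hi) in zip(human_angles, joint_limits):
        # Clamp so the command never exceeds the robot's mechanical range
        robot_angles.append(min(max(angle, lo), hi))
    return robot_angles
```

Task-oriented or dimensionality-reduction-based mappings exist precisely because this one-to-one assumption breaks down for dissimilar hands.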

    Bibliometric analysis on Hand Gesture Controlled Robot

    This paper presents a survey and bibliometric analysis of hand-gesture-controlled robots using the Scopus database, analyzing the research by subject area, influential authors, countries, institutions, and funding agencies. A total of 293 documents published from 2016 until 6 March 2021 were extracted from the database. Bibliometric analysis is the statistical analysis of published research (articles, conference papers, and reviews), which helps in understanding the global impact of publications in a research domain. The visualization analysis was done with the open-source tools GPS Visualizer, Gephi, VOSviewer, and ScienceScape. Visualization aids a quick and clear understanding of the different perspectives mentioned above within a particular research domain.

    A Study on Robust and Accurate Hand Motion Tracking for Human-Machine Interaction

    Thesis (Ph.D.) -- Seoul National University Graduate School: College of Engineering, Department of Mechanical and Aerospace Engineering, 2021.8. Dongjun Lee. A hand-based interface is promising for realizing intuitive, natural, and accurate human-machine interaction (HMI), as the human hand is the main source of dexterity in our daily activities. To this end, the thesis begins with a human perception study on the detection threshold of visuo-proprioceptive conflict (i.e., allowable tracking error) with and without cutaneous haptic feedback, and suggests a tracking-error specification for realistic and fluid hand-based HMI. The thesis then proposes a novel wearable hand tracking module which, to be compatible with cutaneous haptic devices that emit magnetic noise, opportunistically employs heterogeneous sensors (an IMU/compass module and a soft sensor) reflecting the anatomical properties of the human hand, making it suitable for a specific application (finger-based interaction with fingertip haptic devices). This hand tracking module, however, loses tracking when interacting with, or even near, electrical machines or ferromagnetic materials. The thesis therefore presents its main contribution: a novel visual-inertial skeleton tracking (VIST) framework that provides accurate and robust hand (and finger) motion tracking even in many challenging real-world scenarios and environments for which state-of-the-art technologies are known to fail due to their respective fundamental limitations (e.g., severe occlusion for tracking purely with vision sensors; electromagnetic interference for tracking purely with IMUs (inertial measurement units) and compasses; and mechanical contact for tracking purely with soft sensors).
The proposed VIST framework comprises a sensor glove with multiple IMUs and passive visual markers, a head-mounted stereo camera, and a tightly-coupled filtering-based visual-inertial fusion algorithm that estimates the hand/finger motion and simultaneously auto-calibrates hand/glove-related kinematic parameters while taking the anatomical constraints of the hand into account. The VIST framework exhibits good tracking accuracy and robustness, affordable material cost, light hardware and software weight, and enough ruggedness/durability even to permit washing. Quantitative and qualitative experiments are also performed to validate these advantages and properties, clearly demonstrating the framework's potential for real-world applications.
Hand-motion-based interfaces have attracted much attention in human-machine interaction for the intuitiveness, immersion, and precision they can provide, and robust and accurate hand motion tracking is one of the most essential technologies for realizing them. To this end, this thesis first identifies, from the perspective of human perception, the detection threshold of hand tracking error. Because this threshold can serve as an important design criterion when developing new hand tracking technologies, it is quantified through user experiments, and the change in the threshold when fingertip cutaneous haptic devices are worn is also identified. Building on this, and because providing haptic feedback has been widely studied across human-machine interaction, a hand tracking module that can be used together with fingertip haptic devices is developed first. These haptic devices generate magnetic disturbances that corrupt the magnetometers commonly used in wearable tracking; this is resolved by exploiting the anatomical properties of the human hand and an appropriate combination of inertial, magnetometer, and soft sensors. Extending this, the thesis proposes a new hand tracking technology that can be used not only while wearing haptic devices but with any worn device, in any environment, and during interaction with any object. Existing hand tracking technologies are confined to restricted environments by occlusion (vision-based methods), magnetic disturbance (inertial/magnetometer-based methods), or mechanical contact (soft-sensor-based methods). To overcome this, inertial and visual sensors with complementary properties are fused without the troublesome magnetometer, and a large number of indistinguishable markers are used to track the many degrees of freedom of hand motion within a small space. For the marker correspondence search, a tightly-coupled rather than the conventional loosely-coupled sensor fusion is proposed; this enables accurate magnetometer-free hand tracking and also automatically and accurately calibrates the sensor attachment errors and user-specific hand geometry that have degraded the accuracy and convenience of wearable sensors. The performance and robustness of the proposed Visual-Inertial Skeleton Tracking (VIST) are verified through various quantitative and qualitative experiments; by enabling, in diverse everyday environments, hand tracking that existing systems could not achieve, VIST demonstrates its potential across many human-machine interaction fields.
Contents: 1 Introduction (Motivation; Related Work; Contribution). 2 Detection Threshold of Hand Tracking Error (Motivation; Experimental Environment: Hardware Setup, Virtual Environment Rendering, HMD Calibration; Identifying the Detection Threshold of Tracking Error: Experimental Setup, Procedure, Experimental Result; Enlarging the Detection Threshold of Tracking Error by Haptic Feedback: Experimental Setup, Procedure, Experimental Result; Discussion). 3 Wearable Finger Tracking Module for Haptic Interaction (Motivation; Development of Finger Tracking Module: Hardware Setup, Tracking Algorithm, Calibration Method; Evaluation for VR Haptic Interaction Task: Quantitative Evaluation of FTM, Implementation of Wearable Cutaneous Haptic Interface, Usability Evaluation for VR Peg-in-Hole Task; Discussion). 4 Visual-Inertial Skeleton Tracking for Human Hand (Motivation; Hardware Setup and Hand Models: Human Hand Model, Wearable Sensor Glove, Stereo Camera; Visual Information Extraction: Marker Detection in Raw Images, Cost Function for Point Matching, Left-Right Stereo Matching; IMU-Aided Correspondence Search; Filtering-based Visual-Inertial Sensor Fusion: EKF States for Hand Tracking and Auto-Calibration, Prediction with IMU Information, Correction with Visual Information, Correction with Anatomical Constraints; Quantitative Evaluation for Free Hand Motion; Quantitative and Comparative Evaluation for Challenging Hand Motion, including Performance Comparison with Existing Methods; Qualitative Evaluation for Real-World Scenarios: Visually Complex Background, Object Interaction, Wearing Fingertip Cutaneous Haptic Devices, Outdoor Environment; Discussion). 5 Conclusion. References; Abstract (in Korean); Acknowledgment.
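The filtering-based fusion at the heart of VIST follows the standard EKF predict/correct cycle: prediction with IMU information, correction with visual (marker) information. The scalar filter below is a drastically simplified illustration of that structure, assuming a 1-D position state; the actual VIST filter estimates full hand/finger kinematics plus calibration parameters and is far richer than this sketch.

```python
class ScalarKalman:
    """1-D position filter: predict with an inertial velocity estimate,
    correct with a visual marker measurement. Illustrative stand-in for
    the full EKF described in the thesis, not its actual implementation.
    """

    def __init__(self, x0, p0, q, r):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def predict(self, velocity, dt):
        # Propagate the state with inertial information; uncertainty grows
        self.x += velocity * dt
        self.p += self.q

    def correct(self, z):
        # Blend in a visual measurement z; uncertainty shrinks
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1 - k)
        return self.x
```

The "tightly-coupled" aspect of VIST means the raw marker observations (and the correspondence search itself) enter this correction step directly, rather than fusing two independently computed pose estimates.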

    Biosignalโ€based humanโ€“machine interfaces for assistance and rehabilitation : a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The found studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macro-categories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over the last years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance.
However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    Implementation of polytechnic Work-Based Learning with industry

    Work-Based Learning (Pembelajaran Berasaskan Kerja, PBK) is a learning method that combines theoretical and practical learning simultaneously in a real work setting, with the aim of producing graduates who possess employability value. Although this method has long been implemented in developed countries such as the United States and the United Kingdom, in Malaysia it was only introduced in 2007, initially involving only a few community colleges. In 2010, however, PBK implementation was discontinued at the community colleges and transferred to the polytechnics. Issues arising in the implementation of polytechnic PBK in industry include the PBK implementation concept, teaching and learning styles, assessment methods, the relationship between polytechnics and industry, uniformity of the implementation concept, issues and challenges in implementation, and differences in implementation methods between polytechnics and community colleges. The purpose of this study was therefore to explore, understand, and explain the implementation of polytechnic PBK with industry. The study used a qualitative case-study methodology. Data collection in the field was carried out over one year using interview, observation, and document-analysis techniques. A maximum-variation sampling strategy, a snowball sampling technique, and purposive sampling were used. Study participants were drawn from PBK management and coordinating lecturers, industry supervisors, and students involved in PBK. The findings show that the implementation of polytechnic PBK with industry has seen many improvements compared with the earlier community-college implementation, but several issues remain, namely a PBK curriculum that is not aligned with industry policy and weaknesses of industry supervisors in teaching and learning.

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators can function with complete autonomy, so some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness, and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capturing human intent and commands. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable, pervasive devices, they offer benefits over camera-based or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement).
As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors that degrade the signal over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography (MMG) sensors. The modular system permits numerous IMU configurations to derive body kinematics in real time, and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm that are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several pattern recognition methods were implemented to decode the mechanomyographic information accurately, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved for 5-gesture classification. It has previously been noted that MMG sensors are susceptible to motion-induced interference; this thesis also established that arm pose changes the measured signal. A new method of fusing IMU and MMG is introduced to provide a classification that is robust to both of these sources of interference. Additionally, an improvement to orientation estimation and a new orientation estimation algorithm are proposed.
These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb that naturally indicates intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent and the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others.
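The thesis decodes the mechanomyographic features with Linear Discriminant Analysis and Support Vector Machines. As a self-contained stand-in for those classifiers, the sketch below uses a nearest-centroid rule to show the same feature-vector-to-gesture pipeline; all names are illustrative assumptions, and a nearest-centroid classifier would not match the reported accuracies.

```python
def train_centroids(features, labels):
    """Compute a mean feature vector per gesture label.

    A minimal stand-in for the LDA/SVM training step: `features` is a
    list of equal-length numeric vectors (e.g. per-channel MMG energy),
    `labels` the corresponding gesture names.
    """
    sums, counts = {}, {}
    for f, y in zip(features, labels):
        acc = sums.setdefault(y, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(centroids, f):
    """Assign the gesture whose centroid is nearest (squared Euclidean)."""
    def dist(y):
        return sum((a - b) ** 2 for a, b in zip(centroids[y], f))
    return min(centroids, key=dist)
```

In a real pipeline the feature vectors would be windowed, filtered MMG (and IMU) statistics, and the decision boundary would come from a trained LDA or SVM rather than raw centroid distance.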