
    Prototipo control de vehículo robot por señales EMG

    EMG (electromyography) signals are essentially electrical pulses emitted by the nerves and muscles of the extremities of the human body (for example, the biceps of the arm) and are obtained by means of electrodes. These signals can be amplified and used in a variety of activities and applications. In the present investigation, EMG signals are acquired from the biceps of the arm through three surface electrodes placed specifically to capture the signals transmitted by the biceps muscles. A differential amplifier measures and amplifies the voltage difference between the electrodes placed on the muscle, taking into account that the signals range from a few µV up to less than 10 mV. In the subsequent stages the signal is conditioned for connection to a microcontroller, in this case an Arduino board, where the amplified signal is processed and transmitted wirelessly, with the help of an NRF24L01 module with a range of up to 1000 meters, to the control system on the robot vehicle. The received voltage varies with the flexion of the arm, so the robot vehicle accelerates or decelerates depending on the signal emitted by the arm amplifier system. Finally, the prototype is tuned and the fundamental mechanical-electronic characterizations for the different control movements are established.
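
    The paper includes no code; as a minimal sketch of the mapping from muscle activity to a speed command described above, assuming the amplified EMG arrives as a sampled array and that thresholds are tuned per user (all names and values below are illustrative):

```python
import numpy as np

def emg_envelope(samples, window=50):
    """Rectify the amplified EMG samples and smooth them with a
    moving average to obtain an activation envelope."""
    rectified = np.abs(samples - np.mean(samples))  # remove DC offset, rectify
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def speed_command(envelope_value, rest_level=0.05, max_level=0.8):
    """Map the envelope linearly onto a 0..255 speed byte for the
    vehicle; values at or below rest_level mean 'stop'."""
    if envelope_value <= rest_level:
        return 0
    norm = min((envelope_value - rest_level) / (max_level - rest_level), 1.0)
    return int(255 * norm)

# Example: rest followed by a simulated burst of biceps activity
signal = np.concatenate([np.random.normal(0, 0.02, 200),   # rest
                         np.random.normal(0, 0.4, 200)])   # contraction
env = emg_envelope(signal)
print(speed_command(env[-1]))  # nonzero speed during the contraction
```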

    Biosleeve Human-Machine Interface

    Systems and methods for sensing human muscle action and gestures in order to control machines or robotic devices are disclosed. One exemplary system employs a tight-fitting sleeve worn on a user's arm and including a plurality of electromyography (EMG) sensors and at least one inertial measurement unit (IMU). Power, signal processing, and communications electronics may be built into the sleeve, and control data may be transmitted wirelessly to the controlled machine or robotic device.
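
    The abstract gives no data formats; as a purely hypothetical sketch of the kind of control data such a sleeve might transmit wirelessly, the snippet below packs an EMG-derived gesture class together with an IMU orientation quaternion into a single frame (the layout is an assumption, not the patent's):

```python
import struct

def pack_control_frame(gesture_id: int, quat: tuple) -> bytes:
    """Pack one control frame: a 1-byte gesture class from the EMG
    decoder plus a 4-float IMU orientation quaternion (w, x, y, z).
    The little-endian layout is purely illustrative."""
    return struct.pack("<B4f", gesture_id, *quat)

frame = pack_control_frame(3, (1.0, 0.0, 0.0, 0.0))  # gesture 3, identity pose
assert len(frame) == 17  # 1 + 4*4 bytes per frame sent over the radio link
```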

    Fused mechanomyography and inertial measurement for human-robot interface

    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator, as the supervisor, with a machine, as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion-tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capture of human intent/command. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm which are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved in 5-gesture classification.
It has previously been noted that MMG sensors are susceptible to motion-induced interference. This thesis also established that arm pose changes the measured signal. It therefore introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement to orientation estimation and a new orientation estimation algorithm are proposed. These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture control systems into prosthetic devices; however, mechanomyography sensors are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent and as the desire for a simple, universal interface increases. Such systems have the potential to significantly impact the quality of life of prosthetic users and others.
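
    For illustration, a minimal sketch of the LDA/SVM classification stage described above, using scikit-learn on simple time-domain features of windowed six-channel MMG data; the features, window size, and placeholder data are assumptions, not the thesis's exact pipeline:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def mmg_features(window):
    """Per-channel time-domain features for one window of MMG data
    (shape: samples x 6 channels): mean absolute value and variance."""
    return np.concatenate([np.mean(np.abs(window), axis=0),
                           np.var(window, axis=0)])

# X: one feature row per gesture window, y: gesture labels (0..11).
# The recordings and labels here are synthetic placeholders, so the
# printed scores are chance level, unlike the accuracies reported above.
rng = np.random.default_rng(0)
windows = rng.normal(size=(600, 100, 6))
X = np.array([mmg_features(w) for w in windows])
y = rng.integers(0, 12, size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for clf in (LinearDiscriminantAnalysis(), SVC(kernel="rbf")):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, clf.score(X_te, y_te))
```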

    Body swarm interface (BOSI): controlling robotic swarms using human bio-signals

    Traditionally, robots are controlled using devices like joysticks, keyboards, mice, and similar human-computer interface (HCI) devices. Although this approach is effective and practical in some cases, it is restricted to healthy individuals without disabilities, and it requires the user to master the device before use. It also becomes complicated and non-intuitive when multiple robots must be controlled simultaneously with these traditional devices, as in the case of Human-Swarm Interfaces (HSI). This work presents a novel concept of using human bio-signals to control swarms of robots. This concept has two major advantages: first, it gives amputees and people with certain disabilities the ability to control robotic swarms, which has previously not been possible; second, it gives the user a more intuitive interface for controlling swarms of robots by using gestures, thoughts, and eye movement. We measure different bio-signals from the human body, including electroencephalography (EEG), electromyography (EMG), and electrooculography (EOG), using off-the-shelf products. After minimal signal processing, we decode the intended control action using machine learning techniques such as Hidden Markov Models (HMM) and K-Nearest Neighbors (K-NN). We employ formation controllers based on distance and displacement to control the shape and motion of the robotic swarm. Thought and gesture classifications are compared against ground truth, and the resulting pipelines are evaluated with both simulations and hardware experiments with swarms of ground robots and aerial vehicles.
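
    As a hedged illustration of the displacement-based formation control mentioned above, a minimal single-integrator sketch (the gains, communication topology, and target formation are invented for the example):

```python
import numpy as np

def displacement_formation_step(positions, desired_offsets, neighbors,
                                gain=1.0, dt=0.1):
    """One control step of displacement-based formation control: each
    robot moves to reduce the error between its actual and desired
    relative position with respect to each of its neighbours."""
    velocities = np.zeros_like(positions)
    for i, nbrs in neighbors.items():
        for j in nbrs:
            actual = positions[j] - positions[i]
            desired = desired_offsets[j] - desired_offsets[i]
            velocities[i] += gain * (actual - desired)
    return positions + dt * velocities

# Three robots asked to hold a line with 1 m spacing
pos = np.array([[0.0, 0.0], [2.0, 1.0], [0.5, -1.0]])
offsets = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
nbrs = {0: [1], 1: [0, 2], 2: [1]}
for _ in range(200):
    pos = displacement_formation_step(pos, offsets, nbrs)
print(pos[1] - pos[0], pos[2] - pos[1])  # both approach [1, 0]
```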

    Biosignal‐based human–machine interfaces for assistance and rehabilitation: a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades regarding biosignal‐based HMIs for assistance and rehabilitation, in order to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    EMG ve jiroskop verileri ile endüstriyel robot kolunun gerçek zamanlı kontrolü

    Development efforts on wearable devices continue to grow, and commercial products keep reaching the market. A significant portion of this work focuses on wearable sensors that can detect movements of the human body. In this study, a Human-Robot Interface (HRI) was developed that senses human arm movements with the Myo Armband produced by Thalmic Labs and uses them to control an industrial robot arm. The Myo Armband is a bracelet-shaped device whose on-board EMG (electromyography) and gyroscope sensors help detect the movement of the arm. By establishing a wireless connection between the Myo Armband and the computer system, raw EMG and gyroscope data can be sent to the computer in real time. The IRB120 industrial robot produced by ABB is used for the practical tests. Besides its own controller, the IRB120 can also be controlled over Ethernet communication using the software development kit provided by ABB. With the developed software, the data obtained from the EMG and gyroscope sensors are converted into motion and position information and sent in real time over Ethernet, enabling the industrial robot to follow the operator's movements. As a result of this work, the new HMI system makes it possible to control industrial robot arms cheaply and easily.
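
    The thesis relies on ABB's software development kit; purely as an illustration of the streaming idea, the sketch below integrates gyroscope rates into small Cartesian offsets and sends them over a plain TCP socket. The message layout, endpoint, and scaling are hypothetical and not ABB's actual protocol:

```python
import socket
import struct

def gyro_to_offset(rates_dps, dt, scale=0.001):
    """Integrate angular rates (deg/s) over dt into small Cartesian
    offsets (m) for jogging the robot; scale is a tuning factor."""
    return tuple(r * dt * scale for r in rates_dps)

def stream_offsets(host, port, gyro_frames, dt=0.02):
    """Send one little-endian 3-float offset per incoming gyro frame."""
    with socket.create_connection((host, port)) as sock:
        for rates in gyro_frames:
            dx, dy, dz = gyro_to_offset(rates, dt)
            sock.sendall(struct.pack("<3f", dx, dy, dz))

# e.g. stream_offsets("192.168.125.1", 5000, myo_gyro_stream())
# where myo_gyro_stream() yields (x, y, z) rates from the armband
```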

    Advances in Human-Robot Interaction

    Rapid advances in the field of robotics have made it possible to use robots not just in industrial automation but also in entertainment, rehabilitation, and home service. Since robots will likely affect many aspects of human existence, fundamental questions of human-robot interaction must be formulated and, if at all possible, resolved. Some of these questions are addressed in this collection of papers by leading HRI researchers.

    Développement d’algorithmes et d’outils logiciels pour l’assistance technique et le suivi en réadaptation

    This Master's thesis presents two development projects on algorithms and software tools that provide practical solutions to situations commonly encountered in rehabilitation. The first project is the development of a sequence-matching algorithm that can be integrated into the control interfaces most commonly used in practice. The implementation of this algorithm provides a flexible solution that can be adapted to any assistive-technology user. Controlling such devices is challenging because their dimensionality is high (i.e., many degrees of freedom, modes, or commands) while they are operated through interfaces based on low-dimensionality sensors, so the number of distinct physical commands the user can perform is low. The proposed algorithm therefore recognizes short temporal signals that can be organized into sequences; the many possible combinations increase the effective dimensionality of the interface. Two applications of the algorithm have been developed and tested: the first is a sip-and-puff control interface for a robotic assistive arm, and the second is a hand-gesture interface for controlling a computer's mouse and keyboard. The second project presented in this document addresses the collection and analysis of data in rehabilitation. Whether in a clinical setting, in the laboratory, or at home, there are many situations in which one wishes to gather data. The proposed solution is a connected-applications ecosystem that includes a web server together with web, mobile, and embedded applications. This custom-made software offers a unique, inexpensive, lightweight, and fast workflow to collect, visualize, and retrieve data. This document describes a first version by detailing the architecture, the technologies used, and the reasons for those choices, while guiding future iterations.
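
    As a hedged illustration of the sequence-matching idea, the sketch below maps sequences of short input tokens (e.g. sip/puff events) onto commands, multiplying the number of commands available from a two-signal interface; the token names and command table are invented for the example:

```python
# Map sequences of low-dimensional input tokens to high-level commands.
# With 2 distinct tokens and sequences up to length 3, the user gets
# up to 2 + 4 + 8 = 14 possible commands instead of 2.
COMMANDS = {
    ("sip",): "gripper_open",
    ("puff",): "gripper_close",
    ("sip", "puff"): "arm_up",
    ("puff", "sip"): "arm_down",
    ("sip", "sip", "puff"): "switch_mode",
}

def match_sequence(tokens, table=COMMANDS):
    """Return the command for a completed token sequence, None if the
    sequence is still a prefix of a longer command (keep waiting or
    resolve on timeout), or 'unknown' if nothing can match anymore."""
    key = tuple(tokens)
    is_prefix_of_longer = any(len(k) > len(key) and k[:len(key)] == key
                              for k in table)
    if key in table and not is_prefix_of_longer:
        return table[key]
    if is_prefix_of_longer or any(k[:len(key)] == key for k in table):
        return None  # ambiguous so far: wait for more tokens
    return "unknown"

print(match_sequence(["sip"]))          # None: could still become 'arm_up'
print(match_sequence(["puff", "sip"]))  # 'arm_down'
```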