6 research outputs found

    Human Arm Movement Detection Using Low-Cost Sensors for Controlling Robotic Arm

    This paper presents the development of a wearable device that detects human arm movement for remotely controlling a robotic arm. The proposed system employs simple, low-cost sensors consisting of seven potentiometers and one flex sensor. A compact Arduino Nano microcontroller serves as the processing unit, converting the analog sensor signals into digital values that are sent to the robotic arm over Bluetooth. The experimental results show good linearity between the human arm movement and the robotic arm movement. The average linearity error over all sensors, which represents the deviation of the sensor output from the ideal response, is 2.20%. Given that the sensors are simple and low cost, this error can be considered acceptable for a real implementation.
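    A minimal sketch of the kind of mapping and error metric described above (host-side Python, not the paper's firmware; the ADC resolution, angle range, and sample readings are assumptions): a potentiometer's 10-bit ADC reading is mapped linearly to a joint angle, and the linearity error is computed as the mean deviation from the ideal response, expressed as a percentage of full scale.

```python
# Illustrative sketch only: map raw potentiometer readings to joint angles and
# report a percent linearity error (deviation from the ideal straight line).

def adc_to_angle(adc_value, adc_max=1023, angle_range=180.0):
    """Map a raw ADC reading (0..adc_max) to a joint angle in degrees."""
    return adc_value / adc_max * angle_range

def linearity_error_percent(measured_angles, reference_angles, full_scale=180.0):
    """Mean absolute deviation from the ideal response, as % of full scale."""
    deviations = [abs(m - r) for m, r in zip(measured_angles, reference_angles)]
    return 100.0 * sum(deviations) / (len(deviations) * full_scale)

if __name__ == "__main__":
    reference = [0, 45, 90, 135, 180]                # angles set on a reference jig (hypothetical)
    readings = [3, 260, 515, 770, 1020]              # example raw ADC readings (hypothetical)
    measured = [adc_to_angle(r) for r in readings]
    print(f"linearity error: {linearity_error_percent(measured, reference):.2f}%")
```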

    Building a low-cost motion capture suit for animation

    Capstone Project submitted to the Department of Engineering, Ashesi University, in partial fulfillment of the requirements for the award of the Bachelor of Science degree in Computer Engineering, May 2021. Motion capture has become a major player in the film and animation industry: mimicking the natural and subtle movements of humans and animals has made animation far more believable and engaging. This paper presents a low-cost design for a motion capture unit based on the ESP32 and the BNO055 9-DOF inertial sensor. A calibration method based on a standing posture is used. The sensor data are retrieved via the serial port of the Arduino IDE and sent to Blender, a 3D program, to display motion in real time. If a believable performance can be achieved with low-cost inertial sensors, then the overall cost of existing motion capture technology can be reduced.
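    A minimal host-side sketch of the data path described above, assuming the ESP32 streams one quaternion per line as comma-separated "w,x,y,z" text over serial at 115200 baud; the serial port name, armature, and bone names are hypothetical, and the pyserial package is required. Inside Blender, the same sample could be applied to a pose bone as indicated in the comments.

```python
import serial  # pyserial

def read_quaternion(port):
    """Parse one quaternion sample (w, x, y, z) from a comma-separated serial line."""
    line = port.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return None                        # timeout or empty line
    w, x, y, z = (float(v) for v in line.split(","))
    return w, x, y, z

port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0)  # port name is an assumption
for _ in range(100):                       # read a short burst of samples
    quat = read_quaternion(port)
    if quat is None:
        continue
    print(quat)
    # Inside Blender, the same sample could drive a pose bone, e.g.:
    #   import bpy
    #   bone = bpy.data.objects["Armature"].pose.bones["forearm"]
    #   bone.rotation_mode = "QUATERNION"
    #   bone.rotation_quaternion = quat
port.close()
```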

    Human robot interaction network design with wearable wireless MIMU sensors for upper extremity exoskeleton robot

    Within the scope of this research, a human-robot interaction network was designed using wearable wireless MIMU sensors (accelerometer, gyroscope, magnetometer) to control a two-degree-of-freedom upper-extremity exoskeleton robot system that conforms to the human body and supports human movements. Angular acceleration, gyroscope, and magnetometer data were obtained from two MIMU sensors attached to the subject's upper and lower arm, these sensor data were fused with an AHRS (Attitude and Heading Reference Systems) algorithm, and the quaternion orientation matrices describing the upper-extremity movement (upper arm, forearm) were computed. Using the quaternion data, kinematic analysis then yielded the Euler orientation angles (about the x, y, and z axes) of the shoulder and elbow joints. With the developed interaction network, real-time motion control of a two-degree-of-freedom prototype upper-extremity exoskeleton robot arm, designed and manufactured with laboratory facilities, was realized. As a result, while the user moves his or her arm, the exoskeleton robot performs the same movement synchronously.
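    A small illustrative sketch (not the authors' implementation) of the last computation step: given AHRS quaternions for the upper arm and forearm, the elbow orientation is the relative rotation between the two, which can then be expressed as Euler angles about the x, y, and z axes. SciPy's Rotation class is used here, and the "xyz" angle sequence is an assumption.

```python
from scipy.spatial.transform import Rotation as R

def joint_euler_angles(q_upper_arm, q_forearm):
    """Relative rotation of the forearm w.r.t. the upper arm, as Euler angles (deg).

    Quaternions are given in (x, y, z, w) order, the SciPy convention.
    """
    r_upper = R.from_quat(q_upper_arm)
    r_fore = R.from_quat(q_forearm)
    r_joint = r_upper.inv() * r_fore       # forearm orientation in the upper-arm frame
    return r_joint.as_euler("xyz", degrees=True)

# Example: forearm rotated 30 degrees about the upper arm's x axis
q_upper = R.from_euler("xyz", [0, 0, 0], degrees=True).as_quat()
q_fore = R.from_euler("xyz", [30, 0, 0], degrees=True).as_quat()
print(joint_euler_angles(q_upper, q_fore))    # ~[30, 0, 0]
```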

    Interactive remote robotic arm control with hand motions

    Geographically separated people are now connected by smart devices and networks that enable remote human interaction. However, current online interactions are still confined to a virtual space. Extending purely virtual interactions to the physical world requires multidisciplinary research efforts, including sensing, robot control, networking, and kinematics mapping. This paper introduces a remote motion-controlled robotic arm framework that integrates these techniques, allowing a user to control a far-end robotic arm simply with hand motions. The robotic arm follows the user's hand to perform tasks and sends its live state back to the user to close the control loop. Furthermore, we explore using inexpensive robotic arms and off-the-shelf motion capture devices to facilitate widespread use of the platform in people's daily lives. We implement a test bed that connects two US states for the remote-control study. We investigate the different latency components that affect the user's remote-control experience, conduct a comparative study between remote control and local control, and evaluate the platform with both free-form in-air hand gestures and hand movements following reference curves. We also investigate the possibility of using VR (Virtual Reality) headsets to enhance first-person visual presence and control, allowing smoother robot teleoperation. Finally, a user study is conducted to assess user satisfaction with the different setups while completing a set of tasks, with the aim of achieving an intuitive and easy-to-use platform.
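    As a rough illustration of how one latency component might be measured, the sketch below times the network round trip of a small command over TCP, assuming the far-end controller simply echoes each command back; the host name, port, and echo behaviour are hypothetical and not part of the paper's test bed.

```python
import socket
import time

HOST, PORT = "robot.example.org", 9000        # hypothetical far-end controller endpoint

def round_trip_latency_ms(sock, payload=b"ping", trials=50):
    """Send a small command `trials` times and return the mean round trip in milliseconds."""
    total = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        sock.sendall(payload)
        sock.recv(len(payload))               # wait for the echoed command
        total += time.perf_counter() - start
    return 1000.0 * total / trials

with socket.create_connection((HOST, PORT), timeout=5.0) as sock:
    print(f"mean round-trip latency: {round_trip_latency_ms(sock):.1f} ms")
```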

    Survey of Motion Tracking Methods Based on Inertial Sensors: A Focus on Upper Limb Human Motion

    Motion tracking based on commercial inertial measurement units (IMUs) has been widely studied in recent years, as it is a cost-effective enabling technology for applications in which motion tracking based on optical technologies is unsuitable. This measurement method has a high impact on human performance assessment and human-robot interaction. IMU motion tracking systems are self-contained and wearable, allowing long-lasting tracking of the user's motion in situated environments. After a survey of IMU-based human tracking, five techniques for motion reconstruction were selected and compared in reconstructing a human arm motion. The IMU-based estimates were compared against the Vicon marker-based motion tracking system, taken as ground truth. Results show that all but one of the selected models perform similarly, with an average position estimation error of about 35 mm.
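    A minimal sketch of the accuracy metric used in such comparisons, assuming the IMU-reconstructed and Vicon trajectories are already time-aligned and expressed in millimetres: the average position estimation error is taken here as the mean Euclidean distance between corresponding samples.

```python
import numpy as np

def mean_position_error(estimated, ground_truth):
    """Mean Euclidean distance between corresponding 3-D samples (N x 3 arrays)."""
    estimated = np.asarray(estimated, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    return np.linalg.norm(estimated - ground_truth, axis=1).mean()

# Toy example: a constant 35 mm offset along x yields a 35 mm average error.
truth = np.zeros((100, 3))
estimate = truth + np.array([35.0, 0.0, 0.0])
print(mean_position_error(estimate, truth))   # 35.0
```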

    Une méthode de mesure du mouvement humain pour la programmation par démonstration (A human motion measurement method for programming by demonstration)

    Programming by demonstration (PbD) is an intuitive approach to impart a task to a robot from one or several demonstrations by a human teacher. Acquiring the demonstrations involves solving the correspondence problem when the teacher and the learner differ in sensing and actuation. Kinesthetic guidance is widely used to perform demonstrations: the robot is manipulated by the teacher and the demonstrations are recorded by the robot's encoders. In this way, the correspondence problem is trivial, but the teacher's dexterity is impaired, which may affect the PbD process. Methods that are more practical for the teacher usually require identifying mappings to solve the correspondence problem. The demonstration acquisition method is therefore a compromise between the difficulty of identifying these mappings, the level of accuracy of the recorded elements, and the user-friendliness and convenience for the teacher. This thesis proposes an inertial human motion tracking method based on inertial measurement units (IMUs) for PbD of pick-and-place tasks. Compared with kinesthetic guidance, IMUs are convenient and easy to use but can have limited accuracy; their potential for PbD applications is investigated. To estimate the trajectory of the teacher's hand, three IMUs are placed on the arm segments (upper arm, forearm and hand) to estimate their orientations. A specific method is proposed to partially compensate the well-known drift of the sensor orientation estimate around the gravity direction by exploiting the particular configuration of the demonstration. This method, called heading reset, is based on the assumption that the sensor passes through its original heading, with stationary phases, several times during the demonstration. The heading reset is implemented in an integration and vector observation algorithm, and several experiments illustrate its advantages. A comprehensive inertial human hand motion tracking (IHMT) method for PbD is then developed. It includes an initialization procedure to estimate the orientation of each sensor with respect to the corresponding arm segment and the initial orientation of the sensors with respect to the teacher-attached frame. The procedure involves a rotation and a static posture of the extended arm, which makes the measurement system robust to the positioning of the sensors on the segments. A procedure for estimating the position of the human teacher relative to the robot and a calibration procedure for the parameters of the method are also proposed. The error of the resulting human hand trajectory is measured experimentally and found to lie between 28.5 mm and 61.8 mm. The mappings required to solve the correspondence problem are identified. Unfortunately, this level of accuracy of the IHMT method is not sufficient for a PbD process. To reach the necessary accuracy, a method is proposed to correct the hand trajectory obtained by IHMT using vision data, since a vision system is complementary to inertial sensors. For simplicity and robustness, the vision system tracks only the objects, not the teacher. The correction is based on so-called Positions Of Interest (POIs) and involves three steps: identifying the POIs in the inertial and vision data, pairing the hand POIs with the object POIs that correspond to the same action in the task, and finally correcting the hand trajectory based on the pairs of POIs.
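    A schematic sketch of the heading-reset idea as described above (not the thesis implementation): whenever the sensor is detected as stationary and its estimated heading is close to the original one, the residual offset is attributed to yaw drift and removed from subsequent estimates. The gyro and yaw thresholds below are hypothetical.

```python
import numpy as np

GYRO_STATIONARY_THRESH = 0.05   # rad/s: angular rate below this means "sensor is not moving"
YAW_RESET_WINDOW = 0.35         # rad: "close enough to the original heading"

def heading_reset(yaw_estimates, gyro_norms, initial_yaw=0.0):
    """Return yaw estimates with drift removed at detected reset opportunities."""
    corrected = []
    drift = 0.0
    for yaw, gyro in zip(yaw_estimates, gyro_norms):
        if gyro < GYRO_STATIONARY_THRESH and abs(yaw - drift - initial_yaw) < YAW_RESET_WINDOW:
            # Stationary near the original heading: attribute the remaining
            # offset to gyroscope drift and remove it from now on.
            drift = yaw - initial_yaw
        corrected.append(yaw - drift)
    return np.array(corrected)
```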
    The complete demonstration acquisition method is experimentally evaluated in a full PbD process. This experiment shows the advantages of the proposed method over kinesthetic guidance in the context of this work.