171 research outputs found

    Towards IMU-based Full-body Motion Estimation of Rough Terrain Mobile Manipulators

    Get PDF
    For navigation and pose estimation, strap-down Micro-Electro-Mechanical System (MEMS) Inertial Measurement Units (IMUs) are widely used in all types of mobile devices and applications, from mobile phones to cars and heavy-duty Mobile Working Machines (MWMs). This thesis summarizes work focused on the utilization of IMUs for state estimation of MWMs. Inertial sensor-based technology offers an alternative to traditional solutions, since it can significantly decrease system cost and improve robustness. To cover the research topic of whole-body estimation with IMUs, five publications focus on the development of novel algorithms that use sensor fusion or rotary-IMU theory to estimate or calculate the states of an MWM. The test platforms are also described in detail. First, we used low-cost IMUs installed on the surface of a hydraulic arm to estimate the joint states. These robotic arms are installed on a floating base, and the joints of the arms rotate in a two-dimensional (2D) plane. The novel algorithm uses an Extended Kalman Filter (EKF) to fuse the output of the gyroscopes and the accelerometers, with gravity as the reference. Second, a rotary gyroscope is mounted on the grasper of a crane, and rotary-gyroscope theory is applied to decrease the drift of the angular-velocity measurement. Third, low-cost IMUs are attached to the wheels and the bogie test bed, and the realization of IMU-based wheel odometry is investigated. Additionally, the rotary gyroscope provides roll and yaw attitude information for the test bed. Finally, an industrial-grade IMU is fused with the output of wheel odometry to estimate the position and attitude of the base of an MWM moving on slippery ground. One of the main aims of this research is to estimate the states of an MWM using only IMU sensors, and the results indicate that this approach is promising.
However, the observability of an IMU in the yaw direction of the navigation frame is limited, so it is difficult to estimate the yaw angle of the rotation plane of the robotic arm using IMUs alone; to ensure a reliable long-term yaw angle and position of the vehicle base, external information may also be needed. When applying rotary-IMU theory, minimizing the power supply of the rotation device is still a challenge. This research demonstrates that IMUs can be low-cost and reliable replacements for traditional sensors in joint-angle measurement and in wheel-rotation-angle measurement for vehicles, among other applications. An IMU can also provide a robust state estimate for a vehicle base in a challenging environment. These achievements will benefit future developments of MWMs in remote control and autonomous operations.
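The abstract does not give the EKF equations, but the core idea of fusing a gyroscope (accurate short-term, drifting) with an accelerometer's gravity reference (noisy but drift-free) for a planar joint can be illustrated with a simpler complementary filter. This is a minimal sketch, not the thesis's algorithm; the function name, gains, and axis conventions are illustrative assumptions.

```python
import math

def fuse_tilt(gyro_rates, accel_samples, dt=0.01, alpha=0.98):
    """Estimate a 2D link angle (rad) by blending gyro integration with the
    accelerometer-derived gravity direction (complementary-filter sketch).
    gyro_rates: angular rates (rad/s); accel_samples: (ax, ay) pairs (m/s^2)."""
    # Initialize from the gravity direction (assumes the link starts at rest).
    angle = math.atan2(accel_samples[0][0], accel_samples[0][1])
    for w, (ax, ay) in zip(gyro_rates, accel_samples):
        gyro_angle = angle + w * dt            # propagate with the angular rate
        accel_angle = math.atan2(ax, ay)       # drift-free gravity reference
        # High-pass the gyro, low-pass the accelerometer.
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
    return angle
```

An EKF as used in the thesis would additionally weight the two sources by their modeled noise covariances instead of a fixed blending gain.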

    Design and Validation of a Portable Wireless Data Acquisition System for Measuring Human Joint Angles in Medical Applications

    Get PDF
    A prototype sensor system to capture and measure human joint movements in medical applications was developed, together with custom-made hardware and software and an algorithm that uses measurements from two IMU sensors to estimate the angle of one human joint. Validation results showed a 0.67° maximum error in static conditions, a 1.56° maximum RMSE for dynamic measurements, and a 2.5° average error during fast-movement tests. The prototype has been successfully used by medical teams.
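The two-IMU joint-angle idea can be sketched for the static case: each IMU's accelerometer measures the gravity-derived tilt of its segment, and the joint angle is the difference between the two tilts. This is an illustrative simplification (the paper's algorithm also handles dynamic motion); the function name and the assumption that sensor axes are aligned with the segments are mine.

```python
import math

def joint_angle(accel_upper, accel_lower):
    """Joint flexion angle (deg) from two segment-mounted accelerometers,
    valid when both segments are static so only gravity is sensed."""
    tilt_upper = math.atan2(accel_upper[0], accel_upper[1])
    tilt_lower = math.atan2(accel_lower[0], accel_lower[1])
    return math.degrees(tilt_lower - tilt_upper)
```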

    Fused mechanomyography and inertial measurement for human-robot interface

    Get PDF
    Human-Machine Interfaces (HMI) are the technology through which we interact with the ever-increasing quantity of smart devices surrounding us. The fundamental goal of an HMI is to facilitate robot control by uniting a human operator as the supervisor with a machine as the task executor. Sensors, actuators, and onboard intelligence have not reached the point where robotic manipulators may function with complete autonomy, and therefore some form of HMI is still necessary in unstructured environments. These may include environments where direct human action is undesirable or infeasible, and situations where a robot must assist and/or interface with people. Contemporary literature has introduced concepts such as body-worn mechanical devices, instrumented gloves, inertial or electromagnetic motion-tracking sensors on the arms, head, or legs, electroencephalographic (EEG) brain activity sensors, electromyographic (EMG) muscular activity sensors, and camera-based (vision) interfaces to recognize hand gestures and/or track arm motions for assessment of operator intent and generation of robotic control signals. While these developments offer a wealth of future potential, their utility has been largely restricted to laboratory demonstrations in controlled environments due to issues such as a lack of portability and robustness, and an inability to extract operator intent for both arm and hand motion. Wearable physiological sensors hold particular promise for capture of human intent/command. EMG-based gesture recognition systems in particular have received significant attention in recent literature. As wearable pervasive devices, they offer benefits over camera or physical input systems in that they neither inhibit the user physically nor constrain the user to a location where the sensors are deployed. Despite these benefits, EMG alone has yet to demonstrate the capacity to recognize both gross movement (e.g. arm motion) and finer grasping (e.g. hand movement). 
As such, many researchers have proposed fusing muscle activity (EMG) and motion tracking (e.g. inertial measurement) to combine arm motion and grasp intent as HMI input for manipulator control. However, such work has arguably reached a plateau, since EMG suffers from interference from environmental factors which cause signal degradation over time, demands an electrical connection with the skin, and has not demonstrated the capacity to function outside controlled environments for long periods of time. This thesis proposes a new form of gesture-based interface utilising a novel combination of inertial measurement units (IMUs) and mechanomyography sensors (MMGs). The modular system permits numerous configurations of IMUs to derive body kinematics in real time and uses this to convert arm movements into control signals. Additionally, bands containing six mechanomyography sensors were used to observe muscular contractions in the forearm which are generated by specific hand motions. This combination of continuous and discrete control signals allows a large variety of smart devices to be controlled. Several methods of pattern recognition were implemented to provide accurate decoding of the mechanomyographic information, including Linear Discriminant Analysis and Support Vector Machines. Based on these techniques, accuracies of 94.5% and 94.6% respectively were achieved for 12-gesture classification. In real-time tests, an accuracy of 95.6% was achieved in 5-gesture classification. It has previously been noted that MMG sensors are susceptible to motion-induced interference. This thesis also established that arm pose changes the measured signal, and it introduces a new method of fusing IMU and MMG data to provide a classification that is robust to both of these sources of interference. Additionally, an improvement in orientation estimation and a new orientation-estimation algorithm are proposed. 
These improvements to the robustness of the system provide the first solution able to reliably track both motion and muscle activity for extended periods of time for HMI outside a clinical environment. Applications in robot teleoperation in both real-world and virtual environments were explored. With multiple degrees of freedom, robot teleoperation provides an ideal test platform for HMI devices, since it requires a combination of continuous and discrete control signals. The field of prosthetics also represents a unique challenge for HMI applications. In an ideal situation, the sensor suite should be capable of detecting the muscular activity in the residual limb which is naturally indicative of intent to perform a specific hand pose, and of triggering this pose in the prosthetic device. Dynamic environmental conditions within a socket, such as skin impedance, have delayed the translation of gesture-control systems into prosthetic devices; mechanomyography sensors, however, are unaffected by such issues. There is huge potential for a system like this to be utilised as a controller as ubiquitous computing systems become more prevalent and as the desire for a simple, universal interface increases. Such systems have the potential to impact significantly on the quality of life of prosthetic users and others.
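The gesture-classification step described above (LDA/SVM over MMG features) can be illustrated with a toy nearest-centroid classifier: learn a mean feature vector per gesture class, then assign new samples to the closest centroid. This is a deliberately simplified stand-in for the LDA/SVM pipeline in the thesis; all names and the feature layout are assumptions.

```python
def train_centroids(samples, labels):
    """Compute a per-class mean feature vector (toy stand-in for the
    LDA/SVM gesture classifiers reported in the thesis)."""
    sums, counts = {}, {}
    for x, label in zip(samples, labels):
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(centroids, x):
    """Assign x to the class whose centroid is nearest (squared Euclidean)."""
    def dist2(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], x))
    return min(centroids, key=dist2)
```

LDA and SVMs additionally learn discriminative projections or margins rather than relying on raw distances, which is what drives the reported ~94-95% accuracies.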

    Human-Inspired Balancing and Recovery Stepping for Humanoid Robots

    Get PDF
    Robustly maintaining balance on two legs is an important challenge for humanoid robots. The work presented in this book represents a contribution to this area. It investigates efficient methods for the decision-making from internal sensors about whether and where to step, several improvements to efficient whole-body postural balancing methods, and proposes and evaluates a novel method for efficient recovery step generation, leveraging human examples and simulation-based reinforcement learning

    TeLeMan: Teleoperation for Legged Robot Loco-Manipulation using Wearable IMU-based Motion Capture

    Get PDF
    Human life is invaluable. When dangerous or life-threatening tasks need to be completed, robotic platforms could be ideal in replacing human operators. One such task, which we focus on in this work, is Explosive Ordnance Disposal. Robot telepresence has the potential to provide safety solutions, given that mobile robots have shown robust capabilities when operating in several environments. However, autonomy may be challenging and risky at this stage, compared to human operation. Teleoperation could be a compromise between full robot autonomy and human presence. In this paper, we present a relatively cheap solution for telepresence and robot teleoperation, to assist with Explosive Ordnance Disposal, using a legged manipulator (i.e., a quadruped robot equipped with a manipulator and RGB-D sensing). We propose a novel system integration for the non-trivial problem of quadruped-manipulator whole-body control. Our system is based on a wearable IMU-based motion capture system used for teleoperation and a VR headset for visual telepresence. We experimentally validate our method in the real world, on loco-manipulation tasks that require whole-body robot control and visual telepresence.

    Human motion estimation and controller learning

    Get PDF
    Humans are capable of complex manipulation and locomotion tasks. They are able to achieve energy-efficient gait, reject disturbances, handle changing loads, and adapt to environmental constraints. Using inspiration from the human body, robotics researchers aim to develop systems with similar capabilities. Research suggests that humans minimize a task-specific cost function when performing movements. In order to learn this cost function from demonstrations and incorporate it into a controller, it is first imperative to accurately estimate the expert motion. The captured motions can then be analyzed to extract the objective function the expert was minimizing. We propose a framework for human motion estimation from wearable sensors. Human body joints are modeled by matrix Lie groups, using the special orthogonal groups SO(2) and SO(3) for joint pose and the special Euclidean group SE(3) for base-link pose representation. To estimate the human joint pose, velocity, and acceleration, we provide the equations for employing the Extended Kalman Filter on Lie groups, thus explicitly accounting for the non-Euclidean geometry of the state space. Incorporating interaction constraints with respect to the environment or within the participant allows us to track global body position without an absolute reference and ensure a viable pose estimate. The algorithms are extensively validated in both simulation and real-world experiments. Next, to learn underlying expert control strategies from the expert demonstrations, we present a novel fast approximate multi-variate Gaussian Process regression. The method estimates the underlying cost function without making assumptions on its structure. The computational efficiency of the approach allows for real-time forward-horizon prediction. Using a linear model predictive control framework, we then reproduce the demonstrated movements on a robot. 
The learned cost function captures the variability in expert motion as well as the correlations between states, leading to a controller that both produces motions and reacts to disturbances in a human-like manner. The model predictive control formulation allows the controller to satisfy task- and joint-space constraints, avoiding obstacles and self-collisions, as well as torque constraints, ensuring operational feasibility. The approach is validated on the Franka Emika robot using real human motion exemplars.
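The key ingredient of a Kalman filter on a Lie group is that state updates are applied through the group's exponential map rather than by vector addition. A minimal sketch for SO(2), the group the abstract uses for planar joints, is below; the function names and the right-multiplication update convention are illustrative choices, not taken from the paper.

```python
import math

def so2_exp(theta):
    """Exponential map from the Lie algebra so(2) (a scalar angle) to a
    2x2 rotation matrix in SO(2)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def so2_log(R):
    """Logarithm map from SO(2) back to the algebra (the angle)."""
    return math.atan2(R[1][0], R[0][0])

def boxplus(R, delta):
    """Group-aware state update R + delta := R * exp(delta), the operation
    an EKF on a Lie group uses in place of Euclidean addition."""
    E = so2_exp(delta)
    return [[sum(R[i][k] * E[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

This guarantees every updated state is still a valid rotation, which is the "non-Euclidean geometry" benefit the abstract refers to; for SO(3) the angle becomes a 3-vector and the exponential map uses Rodrigues' formula.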

    Improving Dynamics Estimations and Low Level Torque Control Through Inertial Sensing

    Get PDF
    In 1996, professors J. Edward Colgate and Michael Peshkin invented cobots: robotic equipment safe enough for interacting with human workers. Twenty years later, collaborative robots are in high demand in the packaging industry and have already been massively adopted by companies facing issues in meeting customer demand. Meanwhile, cobots are still making their way into environments where value-added tasks require more complex interactions between robots and human operators. Other applications, such as a rescue mission in a disaster scenario, require robots to deal with highly dynamic environments and uneven terrains. All these applications require robust, fine, and fast control of the interaction forces, especially in the case of locomotion on uneven terrain in an environment where unexpected events can occur. Such interaction forces can only be modulated through the control of internal joint torques in under-actuated systems, as is typically the case for mobile robots. For that purpose, efficient low-level joint torque control is one of the critical requirements, and it motivated the research presented here. This thesis presents a thorough model analysis of a typical low-level joint actuation sub-system, powered by a brushless DC motor and suitable for torque control. It then proposes procedural improvements in the identification of model parameters, particularly challenging in the case of coupled joints, in view of improving their control. Along with these procedures, it proposes novel methods for the calibration of inertial sensors, as well as the use of such sensors in the estimation of joint torques.

    Ergowear: development of a smart garment for postural monitoring and biofeedback

    Get PDF
    Master's dissertation in Biomedical Engineering (specialization in Medical Electronics). Nowadays, Work-Related Musculoskeletal Disorders (WRMSDs) are considered the "most prevalent work-related problem" in the European Union (EU), leading to an estimated cost of about 240 billion EUR. In more severe cases, these disorders can cause life-long impairments to workers' health, reducing their quality of life. In fact, WRMSDs are the main cause of workers' early retirement. It has been reported that the upper-body segments of the worker are the most susceptible to the development of WRMSDs. To mitigate the prevalence of WRMSDs, ergonomists mostly apply observational assessment methods, which are highly dependent on the analyst's expertise and have low objectivity and repeatability. Therefore, efforts have been made to develop instrumented ergonomic assessment tools to compensate for these limitations. Moreover, with the rise of the Industry 5.0 concept, the human worker is once again the main focus in industry, along with the Collaborative Robot (cobot). However, to achieve a truly collaborative relation between the worker and the cobot, the latter needs to know the worker's intentions. To surmount this obstacle, Motion Capture (MoCap) systems can be integrated into this framework, providing motion data to the cobot. 
This dissertation aims at the improvement of a stand-alone, upper-body, inertial MoCap system that serves not only to monitor the worker's posture, but also to assess the user's ergonomics and provide posture awareness through biofeedback motors. Furthermore, it was designed with integration into a human-robot collaborative framework in mind. To achieve this, a user-centred design methodology was applied, starting with analysis of the State of the Art (SOA), assessment of the limitations of the previous system, definition of the system's requirements, and development of the garment, hardware architecture, and software architecture of the system. Lastly, the system was validated to ascertain its conformity with the specified requirements. The developed system is composed of 9 Inertial Measurement Units (IMUs), placed on the lower and upper back, head, upper arms, forearms, and hands. An actuation system for postural biofeedback was also integrated, comprising 6 vibrotactile motors located in the lower back and in close proximity to the neck, elbows, and wrists. The system is powered by a powerbank, and all acquired data is sent to a main station via WiFi (User Datagram Protocol, UDP), making the system stand-alone. The system integrates an Extended Complementary Filter (ECF) and a static Sensor-to-Segment (STS) calibration sequence to increase joint-angle estimation accuracy. Furthermore, the system samples angular data at 240 Hz, while the previous system sampled at a maximum of 100 Hz, improving the resolution of data acquisition. The system was validated in terms of hardware and usability. The hardware tests addressed the characterization of the system's autonomy, sampling frequency, mechanical robustness, and wireless communication performance in different contexts, and ascertained compliance with the previously defined technical requirements, which was confirmed. 
Moreover, the specifications of the new version were compared with those of the previous one, where significant direct improvements were observed (such as higher sampling frequency, lower packet loss, and lower current consumption, among others), and with a commercial reference system (XSens Link). Usability tests were carried out with 9 participants who performed several uni-joint and complex motions. After testing, users answered a questionnaire based on the System Usability Scale (SUS). The system was well accepted by the participants regarding aesthetics and overall comfort, proving a high level of wearability.
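The static sensor-to-segment calibration mentioned above can be sketched in its simplest form: while the user holds a known neutral pose, the segment angle is zero by definition, so whatever tilt the sensor reports is its mounting offset, which is then subtracted from subsequent measurements. This is an assumed simplification of the dissertation's STS sequence, reduced to one planar angle; function names are mine.

```python
import math

def calibrate_offset(static_accel):
    """During a known neutral pose the segment angle is zero, so the
    gravity-derived tilt measured by the IMU is the mounting offset (rad)."""
    ax, ay = static_accel
    return math.atan2(ax, ay)

def segment_angle(accel, offset):
    """Segment angle (rad) after removing the calibrated mounting offset."""
    ax, ay = accel
    return math.atan2(ax, ay) - offset
```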

    Safe local aerial manipulation for the installation of devices on power lines: Aerial-core first year results and designs

    Get PDF
    Article number 6220. The power grid is an essential infrastructure in any country, comprising thousands of kilometers of power lines that require periodic inspection and maintenance, carried out nowadays by human operators in risky conditions. To increase safety and reduce time and cost with respect to conventional solutions involving manned helicopters and heavy vehicles, the AERIAL-CORE project proposes the development of aerial robots capable of performing aerial manipulation operations to assist human operators in power-line inspection and maintenance, allowing the installation of devices, such as bird flight diverters or electrical spacers, and the fast delivery and retrieval of tools. This manuscript describes the goals and functionalities to be developed for safe local aerial manipulation, presenting the preliminary designs and experimental results obtained in the first year of the project. Funding: European Union (UE) H2020 871479; Ministerio de Ciencia, Innovación y Universidades de España FPI 201