490 research outputs found

    Towards a context-based Bayesian recognition of transitions in locomotion activities

    Recognition of human activity and the state of an assembly task using vision and inertial sensor fusion methods

    Reliable human-machine interfaces are key to accomplishing the goals of Industry 4.0. This work proposes the late fusion of a visual recognition classifier and a human action recognition (HAR) classifier. Vision is used to recognise the number of screws assembled into a mock part, while HAR from body-worn Inertial Measurement Units (IMUs) classifies the actions performed to assemble the part. Convolutional Neural Network (CNN) methods are used in both modes of classification before various late fusion methods are analysed for prediction of a final state estimate. The fusion methods investigated are mean, weighted average, Support Vector Machine (SVM), Bayesian, Artificial Neural Network (ANN) and Long Short-Term Memory (LSTM). The results show the LSTM fusion method to perform best, with an accuracy of 93% compared to 81% for IMU and 77% for visual sensing. Development of sensor fusion methods such as these is key to reliable Human-Machine Interaction (HMI).
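    A minimal sketch of late fusion under this scheme, assuming only that each modality's CNN emits a class-probability vector over the same set of assembly states; the weights, class count and example values below are illustrative, not taken from the paper:

```python
# Minimal late-fusion sketch (hypothetical weights and values, not the
# authors' implementation): each modality classifier emits a softmax
# probability vector over the same set of assembly-state classes.
import numpy as np

def weighted_average_fusion(p_vision, p_imu, w_vision=0.5, w_imu=0.5):
    """Fuse two class-probability vectors by weighted averaging."""
    fused = w_vision * np.asarray(p_vision) + w_imu * np.asarray(p_imu)
    return fused / fused.sum()  # renormalise to a valid distribution

# Example: three hypothetical assembly states (0, 1 or 2 screws fitted).
p_vision = [0.6, 0.3, 0.1]   # visual CNN output
p_imu    = [0.2, 0.5, 0.3]   # IMU-based HAR CNN output
fused = weighted_average_fusion(p_vision, p_imu, w_vision=0.4, w_imu=0.6)
print("fused state estimate:", int(np.argmax(fused)))
```

    The trainable fusers reported in the abstract (SVM, Bayesian, ANN, LSTM) would replace these fixed weights with a model learned from the two classifiers' outputs.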

    Wearable fingertip with touch, sliding and vibration feedback for immersive virtual reality

    Wearable haptic technology plays a key role in enhancing the feeling of immersion in virtual reality, telepresence, telehealth and entertainment systems. This work presents a wearable fingertip capable of providing touch, sliding and vibration feedback while the user interacts with virtual objects. This multimodal feedback is applied to the human fingertip using an array of servo motors, a coin vibration motor and 3D-printed components. The wearable fingertip uses a 3D-printed cylinder that moves up and down to provide touch feedback, and rotates left and right to deliver sliding feedback. The direction of movement and speed of rotation of the cylinder are controlled by the exploration movements performed by the user's hand and finger. Vibration feedback is generated using a coin vibration motor, with the frequency controlled by the type of virtual material explored by the user. The Leap Motion module is employed to track the human hand and fingers to control the feedback delivered by the wearable device. This work is validated with experiments on the exploration of virtual objects in Unity. The experiments show that this wearable haptic device offers an alternative platform with the potential to enhance the feeling and experience of immersion in virtual reality environments, object exploration and telerobotics.
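    A hedged sketch of how the feedback mapping described above could look in code; the material names, frequencies and servo range are invented for illustration and are not values from the paper:

```python
# Hypothetical mapping sketch (material names, frequencies and servo
# ranges are illustrative assumptions): the virtual material touched
# selects the coin-motor vibration frequency, and fingertip penetration
# into a virtual surface selects the servo-driven cylinder position.

MATERIAL_FREQ_HZ = {"rubber": 40, "wood": 90, "metal": 160}

def vibration_command(material: str) -> int:
    """Return the vibration frequency (Hz) for the touched material."""
    return MATERIAL_FREQ_HZ.get(material, 0)  # 0 Hz = motor off

def touch_servo_angle(penetration_mm: float, max_mm: float = 5.0) -> float:
    """Map fingertip penetration depth (from the hand tracker) onto a
    0-90 degree servo angle that raises the cylinder against the skin."""
    depth = min(max(penetration_mm, 0.0), max_mm)
    return 90.0 * depth / max_mm

print(vibration_command("metal"), "Hz,", touch_servo_angle(2.5), "deg")
```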

    Multimodal barometric and inertial measurement unit based tactile sensor for robot control

    In this article, we present a low-cost multimodal tactile sensor capable of providing accelerometer, gyroscope and pressure data using a seven-axis chip as the sensing element. This approach reduces the complexity of the tactile sensor design and the collection of multimodal data. The tactile device is composed of a top layer (a printed circuit board (PCB) and the sensing element), a middle layer (soft rubber material) and a bottom layer (plastic base), forming a sandwich structure. This structure allows the measurement of multimodal data when force is applied to different parts of the top layer of the sensor. The multimodal tactile sensor is validated with analyses and experiments both offline and in real time. First, the spatial impulse response and sensitivity of the sensor are analyzed with accelerometer, gyroscope and pressure data systematically collected from the sensor. Second, the estimation of contact location from a range of sensor positions and force values is evaluated using accelerometer and gyroscope data together with a convolutional neural network (CNN) method. Third, the estimated contact location is used to control the position of a robot arm. The results show that the proposed multimodal tactile sensor has potential for robotic applications such as tactile perception for robot control, human-robot interaction and object exploration.
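    A minimal PyTorch sketch of a CNN contact-location estimator of the kind described; the window length, channel count, layer sizes and number of discrete contact positions are assumptions for illustration, not the paper's exact network:

```python
# Sketch of a CNN that classifies a short window of accelerometer +
# gyroscope samples (6 channels) into one of N discrete contact
# positions on the sensor's top layer. Architecture is hypothetical.
import torch
import torch.nn as nn

class ContactLocationCNN(nn.Module):
    def __init__(self, n_channels=6, n_positions=9, window=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                 # window -> window / 2
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                 # window / 2 -> window / 4
        )
        self.classifier = nn.Linear(64 * (window // 4), n_positions)

    def forward(self, x):                    # x: (batch, channels, window)
        z = self.features(x)
        return self.classifier(z.flatten(1))  # logits over positions

model = ContactLocationCNN()
dummy = torch.randn(1, 6, 64)                # one window of IMU samples
print(model(dummy).shape)                    # torch.Size([1, 9])
```

    The predicted position class could then be mapped to a target pose for the robot arm, as in the closed-loop control experiment the abstract describes.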

    Predicted information gain and convolutional neural network for prediction of gait periods using a wearable sensors network

    Prediction of gait events in walking activities with a Bayesian perception system
