7,583 research outputs found

    Multimodal barometric and inertial measurement unit based tactile sensor for robot control

    In this article, we present a low-cost multimodal tactile sensor capable of providing accelerometer, gyroscope, and pressure data using a seven-axis chip as the sensing element. This approach reduces the complexity of the tactile sensor design and of multimodal data collection. The tactile device is composed of a top layer (a printed circuit board (PCB) and the sensing element), a middle layer (soft rubber material), and a bottom layer (plastic base), forming a sandwich structure. This arrangement allows multimodal data to be measured when force is applied to different parts of the top layer of the sensor. The multimodal tactile sensor is validated with analyses and experiments in both offline and real-time modes. First, the spatial impulse response and sensitivity of the sensor are analyzed using accelerometer, gyroscope, and pressure data systematically collected from the sensor. Second, the estimation of contact location over a range of sensor positions and force values is evaluated using accelerometer and gyroscope data together with a convolutional neural network (CNN). Third, the estimated contact location is used to control the position of a robot arm. The results show that the proposed multimodal tactile sensor has potential for robotic applications such as tactile perception for robot control, human-robot interaction, and object exploration.
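    As a rough illustration of the contact-location step described in this abstract, the sketch below pairs windows of accelerometer and gyroscope data with a small 1D CNN classifier. The window length, number of contact locations and layer sizes are illustrative assumptions; the abstract does not specify the authors' architecture.

    ```python
    # Minimal sketch of a contact-location classifier of the kind described above.
    # Window length (100 samples), number of contact locations (9) and layer sizes
    # are illustrative assumptions, not the authors' implementation.
    import torch
    import torch.nn as nn

    class ContactLocationCNN(nn.Module):
        def __init__(self, n_channels=6, n_locations=9):
            # n_channels: 3-axis accelerometer + 3-axis gyroscope streams
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(n_channels, 16, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
            )
            self.classifier = nn.Linear(32, n_locations)

        def forward(self, x):
            # x: (batch, 6, window_length) window of IMU samples
            return self.classifier(self.features(x).squeeze(-1))

    # Example: classify one 100-sample window of accelerometer + gyroscope data.
    model = ContactLocationCNN()
    window = torch.randn(1, 6, 100)          # placeholder sensor window
    location = model(window).argmax(dim=1)   # predicted contact-location class
    ```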

    Wearable fingertip with touch, sliding and vibration feedback for immersive virtual reality

    Wearable haptic technology plays a key role in enhancing the feeling of immersion in virtual reality, telepresence, telehealth and entertainment systems. This work presents a wearable fingertip device capable of providing touch, sliding and vibration feedback while the user interacts with virtual objects. This multimodal feedback is applied to the human fingertip using an array of servo motors, a coin vibration motor and 3D printed components. The wearable fingertip uses a 3D printed cylinder that moves up and down to provide touch feedback and rotates left and right to deliver sliding feedback. The direction of movement and the speed of rotation of the cylinder are controlled by the exploration movements performed by the user's hand and finger. Vibration feedback is generated using a coin vibration motor, with the frequency controlled by the type of virtual material explored by the user. A Leap Motion module is employed to track the human hand and fingers and thereby control the feedback delivered by the wearable device. This work is validated with experiments on the exploration of virtual objects in Unity. The experiments show that this wearable haptic device offers an alternative platform with the potential to enhance the feeling and experience of immersion in virtual reality environments, object exploration and telerobotics.
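    A minimal sketch of the feedback mapping described in this abstract is given below: the virtual material selects the coin-motor vibration frequency, and the tracked fingertip motion sets the cylinder's rotation direction and speed. The material table, frequency values and command format are illustrative assumptions, not the authors' parameters.

    ```python
    # Hedged sketch of the haptic feedback mapping described above. All material
    # names, frequencies and command fields are placeholder assumptions.

    # Assumed vibration frequencies (Hz) per virtual material.
    MATERIAL_FREQ_HZ = {"smooth_plastic": 80, "wood": 120, "rough_stone": 180}

    def haptic_commands(material: str, finger_velocity_x: float, in_contact: bool):
        """Map the virtual-object state to wearable-fingertip actuator commands."""
        # Vibration feedback: frequency depends on the material, only while touching.
        vibration_hz = MATERIAL_FREQ_HZ.get(material, 0) if in_contact else 0
        # Sliding feedback: rotate the cylinder with the lateral finger motion.
        direction = "left" if finger_velocity_x < 0 else "right"
        rotation_speed = min(abs(finger_velocity_x), 1.0)   # normalised 0..1
        # Touch feedback: push the cylinder up only while contact is detected.
        return {"vibration_hz": vibration_hz,
                "rotation": (direction, rotation_speed),
                "cylinder_up": in_contact}

    # Example: fingertip sliding left over a virtual wooden surface.
    print(haptic_commands("wood", finger_velocity_x=-0.4, in_contact=True))
    ```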

    Towards an intuitive human-robot interaction based on hand gesture recognition and proximity sensors

    In this paper, we present a multimodal sensor interface capable of recognizing hand gestures for human-robot interaction. The proposed system is composed of an array of proximity and gesture sensors mounted on a 3D printed bracelet. The gesture sensors are employed to collect data from four hand gesture movements (up, down, left and right) performed by the human at a predefined distance from the sensorised bracelet. The hand gesture movements are classified using Artificial Neural Networks. The proposed approach is validated with systematic experiments in offline and real-time modes. First, in offline mode, recognition of the four hand gesture movements achieved a mean accuracy of 97.86%. Second, the trained model was used for classification in real-time and achieved a mean recognition accuracy of 97.7%. The hand gesture recognised in real-time mode was used to control the movement of a Universal Robot (UR3) arm in the CoppeliaSim simulation environment. Overall, the experimental results show that multimodal sensors, together with computational intelligence methods, have the potential to enable intuitive and safe human-robot interaction.
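    The gesture-classification step described in this abstract could look roughly like the sketch below, which trains a small fully connected network on readings from the bracelet's sensor array and maps each prediction to one of the four gestures. The number of sensor channels, network size and placeholder data are assumptions; the abstract only states that Artificial Neural Networks were used.

    ```python
    # Hedged sketch of gesture classification with a small neural network.
    # Sensor-channel count, network size and the random placeholder data are
    # assumptions standing in for the bracelet recordings described above.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    GESTURES = ["up", "down", "left", "right"]

    # Placeholder training data: rows are readings from the bracelet's sensor array,
    # labels are the four gesture classes.
    rng = np.random.default_rng(0)
    X_train = rng.random((200, 8))                 # e.g. 8 proximity/gesture channels
    y_train = rng.integers(0, 4, size=200)

    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    # Real-time loop (sketch): classify each new reading and map it to a UR3 motion.
    reading = rng.random((1, 8))
    gesture = GESTURES[int(clf.predict(reading)[0])]
    print("Detected gesture:", gesture)            # e.g. drive the simulated UR3 arm
    ```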

    Multimodal sensor-based human-robot collaboration in assembly tasks

    This work presents a framework for Human-Robot Collaboration (HRC) in assembly tasks that uses multimodal sensors, perception and control methods. First, vision sensing is employed for user identification to determine the collaborative task to be performed. Second, assembly actions and hand gestures are recognised using wearable inertial measurement units (IMUs) and convolutional neural networks (CNNs) to identify when robot collaboration is needed and to bring the next object to the user for assembly. If collaboration is not required, the robot performs a solo task. Third, the robot arm uses time-domain features from tactile sensors to detect when an object has been touched and grasped for handover actions in the assembly process. These multimodal sensors and computational modules are integrated in a layered control architecture for HRC assembly tasks. The proposed framework is validated in real-time using a Universal Robot arm (UR3) collaborating with humans to assemble two types of objects, 1) a box and 2) a small chair, and working on a solo task of moving a stack of Lego blocks when collaboration with the user is not needed. The experiments show that the robot is capable of sensing and perceiving the state of the surrounding environment using multimodal sensors and computational methods, and of acting and collaborating with humans to complete assembly tasks successfully.
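    The tactile step of this framework could be sketched as below: compute time-domain features over a window of tactile readings and flag touch and grasp events from them. The feature set and thresholds are illustrative assumptions; the abstract does not list the features used.

    ```python
    # Hedged sketch of touch/grasp detection from time-domain tactile features.
    # The feature set and thresholds are placeholder assumptions.
    import numpy as np

    def time_domain_features(window: np.ndarray) -> dict:
        """Simple time-domain descriptors of one tactile-signal window."""
        return {
            "mean": float(np.mean(window)),
            "std": float(np.std(window)),
            "rms": float(np.sqrt(np.mean(window ** 2))),
            "peak": float(np.max(np.abs(window))),
        }

    def detect_contact(window: np.ndarray, touch_thr=0.2, grasp_thr=0.6) -> str:
        """Classify a window as no contact, touch, or grasp from its RMS level."""
        rms = time_domain_features(window)["rms"]
        if rms >= grasp_thr:
            return "grasp"      # object held firmly: trigger the handover action
        if rms >= touch_thr:
            return "touch"
        return "no_contact"

    # Example with a placeholder window of normalised tactile readings -> "grasp".
    print(detect_contact(np.array([0.6, 0.7, 0.8, 0.75, 0.7])))
    ```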

    Stiffness and strength of stabilized organic soils—Part I/II: Experimental database and statistical description for machine learning modelling

    This paper presents the experimental database and its statistical analysis (Part I), which serve as the basis for the parametric analysis and machine learning modelling (Part II) of a comprehensive study on the strength and stiffness of organic soils stabilized via the wet soil mixing method. The experimental database includes unconfined compression tests performed under laboratory-controlled conditions to investigate the impact of soil type, the soil's organic content, the soil's initial natural water content, binder type, binder quantity, grout-to-soil ratio, water-to-binder ratio, curing time, temperature, curing relative humidity and carbon dioxide content on the stiffness and strength of the stabilized organic specimens. A descriptive statistical analysis complements the description of the experimental database, along with a qualitative study of the stabilization hydration process via scanning electron microscopy images. The results confirmed previous findings that Portland cement alone and a mix of Portland cement with ground granulated blast furnace slag are suitable binders for soil stabilization, whereas mixes including lime and magnesium oxide cements produced minimal stabilization. Specimen size affected stiffness, but not strength, for mixes of peat and Portland cement. The experimental database, along with all produced data analyses, is available at the Texas Data Repository as indicated in the Data Availability Statement, to allow for data reproducibility and to promote the use of competing artificial intelligence and machine learning modelling techniques such as the ones presented in Part II of this paper.
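    The descriptive statistical analysis mentioned in this abstract could be reproduced along the lines of the sketch below, which summarises strength per binder type with pandas. The column names and placeholder records are assumptions about the database schema; the actual data are in the Texas Data Repository entry cited in the paper.

    ```python
    # Hedged sketch of a per-binder descriptive summary of unconfined compressive
    # strength. Column names and values are placeholder assumptions, not the
    # published database.
    import pandas as pd

    # Placeholder records standing in for the unconfined-compression test database.
    df = pd.DataFrame({
        "binder": ["PC", "PC", "PC+GGBS", "PC+GGBS", "lime", "MgO"],
        "curing_days": [7, 28, 7, 28, 28, 28],
        "ucs_kpa": [310.0, 520.0, 290.0, 610.0, 95.0, 80.0],
    })

    # Per-binder descriptive statistics of strength, as a basis for Part II models.
    summary = df.groupby("binder")["ucs_kpa"].describe()[["count", "mean", "std", "min", "max"]]
    print(summary)
    ```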
    • …