
    A Tactile-based Fabric Learning and Classification Architecture

    This paper proposes an architecture for tactile-based fabric learning and classification. The architecture is built from SVM-based learning units, called fabric classification cores, each specifically trained to discriminate between two fabrics. Each core operates on a specific subset of the available features, selected according to their discriminative value as measured by the p-value. During fabric recognition, each core casts a vote, and the architecture aggregates the votes into an overall classification result. We tested seventeen different fabrics, and the results showed that classification errors are negligible.
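
    The abstract outlines a one-vs-one voting scheme. A minimal sketch in Python follows, assuming scikit-learn and using its ANOVA-based f_classif p-values as a stand-in for the paper's feature-selection criterion; class and function names are illustrative, not taken from the paper.

        # Minimal sketch of pairwise "fabric classification cores" with
        # majority voting; f_classif stands in for the paper's
        # p-value-based feature selection.
        import numpy as np
        from sklearn.feature_selection import f_classif
        from sklearn.svm import SVC

        class FabricClassificationCore:
            """One SVM trained to discriminate a single pair of fabrics."""
            def __init__(self, fabric_a, fabric_b, p_threshold=0.01):
                self.pair = (fabric_a, fabric_b)
                self.p_threshold = p_threshold
                self.svm = SVC(kernel="rbf")

            def fit(self, X, y):
                mask = np.isin(y, self.pair)          # keep only this pair
                Xp, yp = X[mask], y[mask]
                _, pvals = f_classif(Xp, yp)          # per-feature p-values
                self.features = np.flatnonzero(pvals < self.p_threshold)
                if self.features.size == 0:           # fall back to all features
                    self.features = np.arange(X.shape[1])
                self.svm.fit(Xp[:, self.features], yp)
                return self

            def vote(self, x):
                return self.svm.predict(x[None, self.features])[0]

        def classify(cores, x, labels):
            """Collect one vote per core and return the majority label."""
            votes = [core.vote(x) for core in cores]
            return max(labels, key=votes.count)

        # Usage: one core per fabric pair (17 fabrics -> 136 cores), e.g.
        #   from itertools import combinations
        #   cores = [FabricClassificationCore(a, b).fit(X_train, y_train)
        #            for a, b in combinations(np.unique(y_train), 2)]
        #   prediction = classify(cores, x_new, list(np.unique(y_train)))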

    On the manipulation of articulated objects in human-robot cooperation scenarios

    Articulated and flexible objects constitute a challenge for robot manipulation tasks, yet they are present in many real-world settings, including home and industrial environments. Existing approaches to manipulating such objects employ ad hoc strategies to sequence and perform actions, depending on the objects' physical or geometrical features and on a priori target configurations, whereas principled strategies for sequencing basic manipulation actions on these objects have not been fully explored in the literature. In this paper, we propose a novel action planning and execution framework for the manipulation of articulated objects, which (i) employs action planning to sequence a set of actions leading to a target articulated object configuration, and (ii) allows humans to collaboratively carry out the plan with the robot, interrupting its execution if needed. The framework adopts a formally defined representation of articulated objects, and a direct link exists between how the robot perceives articulated objects, how they are formally represented in the planning and execution framework, and the complexity of the planning process. We report results on planning performance, together with examples of a Baxter dual-arm manipulator operating on articulated objects with humans.
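
    As a rough illustration of sequencing basic actions toward a target configuration, the sketch below runs a breadth-first search over a quantized joint space of a chain-like articulated object. The representation (joint angles in 90-degree steps) and the action set are assumptions made here for illustration; the paper's formal representation and planner are more general. The search's exponential growth with the number of joints also hints at why the representation's granularity drives planning complexity.

        # Illustrative sketch: plan a sequence of single-joint rotations
        # that brings an articulated object (a chain of links) from its
        # perceived configuration to a target one.
        from collections import deque

        STEP = 90  # quantized joint increment in degrees (assumption)

        def plan(start, goal):
            """start/goal: tuples of joint angles, one per articulation."""
            frontier = deque([(start, [])])
            visited = {start}
            while frontier:
                config, actions = frontier.popleft()
                if config == goal:
                    return actions               # sequence of basic actions
                for i in range(len(config)):
                    for delta in (+STEP, -STEP):
                        nxt = list(config)
                        nxt[i] = (nxt[i] + delta) % 360
                        nxt = tuple(nxt)
                        if nxt not in visited:
                            visited.add(nxt)
                            frontier.append((nxt, actions + [(i, delta)]))
            return None

        # e.g. plan((0, 0, 0), (90, 0, 270)) -> [(0, 90), (2, -90)],
        # i.e. rotate joint 0 by +90 degrees, then joint 2 by -90 degrees.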

    Cognition-enabled robotic wiping: Representation, planning, execution, and interpretation

    Advanced cognitive capabilities enable humans to solve even complex tasks by representing and processing internal models of manipulation actions and their effects. Consequently, humans can plan the effect of their motions before execution and validate the performance afterwards. In this work, we derive an analogous approach for robotic wiping actions, which are fundamental to some of the most frequent household chores, including vacuuming the floor, sweeping dust, and cleaning windows. We describe wiping actions and their effects based on a qualitative particle distribution model. This representation enables a robot to plan goal-oriented wiping motions for the prototypical wiping actions of absorbing, collecting, and skimming. The particle representation is used to simulate the task outcome before execution and to infer the real performance afterwards based on haptic perception. In this way, the robot can estimate the task performance and schedule additional motions if necessary. We evaluate our methods in simulated scenarios, as well as in real experiments with the humanoid service robot Rollin’ Justin.
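
    A minimal sketch of what such a particle distribution model might look like, assuming a 2-D grid of particle counts; the action names follow the abstract, everything else (grid model, greedy planner) is illustrative rather than the paper's method.

        import numpy as np

        def wipe(grid, row, action="absorb"):
            """Simulate one left-to-right stroke along a grid row."""
            g = grid.copy()
            if action == "absorb":            # e.g. a sponge: particles vanish
                g[row, :] = 0
            elif action == "collect":         # e.g. a broom: pile up at the edge
                g[row, -1] += g[row, :-1].sum()
                g[row, :-1] = 0
            return g

        def plan_strokes(grid):
            """Greedy absorb plan: wipe the dirtiest row until the grid is
            clean, simulating each stroke's effect before committing to it."""
            strokes, g = [], grid.copy()
            while g.sum() > 0:
                row = int(g.sum(axis=1).argmax())
                strokes.append(row)
                g = wipe(g, row, "absorb")
            return strokes

        # After real execution, haptic perception would update the grid,
        # and the robot re-plans if the estimated performance falls short.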

    Tactile sensing: steps to artificial somatosensory maps

    In this paper, a framework for representing tactile information in robots is discussed. Control models exploiting tactile sensing are fundamental in social human-robot interaction tasks. The difficulties in rendering the sense of touch in robots arise at different levels: both representation and computational issues must be considered. A layered system, inspired by tactile sensing in humans, is proposed for building artificial somatosensory maps in robots. Experiments in simulation are used to validate the approach.
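
    The layered idea can be illustrated with a small sketch that thresholds raw taxel readings (a first layer) and accumulates the surviving contacts into a body-centered 2-D map (a second layer). Taxel coordinates, map size, and threshold are assumptions for illustration.

        import numpy as np

        def somatosensory_map(readings, taxel_uv, shape=(32, 32), threshold=0.05):
            """Layer 1: threshold raw taxel values; layer 2: accumulate
            the surviving contacts into a body-centered 2-D map."""
            smap = np.zeros(shape)
            for value, (u, v) in zip(readings, taxel_uv):
                if value > threshold:              # suppress sensor noise
                    i = int(u * (shape[0] - 1))    # u, v: surface coords in [0, 1]
                    j = int(v * (shape[1] - 1))
                    smap[i, j] += value
            return smap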

    Human hand recognition from robotic skin measurements in human-robot physical interactions

    This paper presents the design and development of a novel tactile sensor system for clothes manipulation and perception using industrial grippers. The proposed system consists of a multi-modal tactile sensor and embedded re-programmable interface electronics. The sensing element features a matrix of capacitive pressure sensors, a microphone for acoustic measurements, and a proximity and ambient light sensor. The sensor is fully embedded and can be easily integrated, both mechanically and electrically, with industrial grippers. The tactile sensing design has been given the same priority as additional requirements such as the mechanical interface, the cable harness, and robustness against continuous, repetitive operation, to name but a few. The performance of the different sensing modalities has been experimentally assessed using a test rig for tactile sensors, and the system has been successfully integrated and tested on a real robotic gripper. Experiments have been performed to show the sensor's capabilities for implementing tactile-based industrial gripper control.
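
    A hypothetical container for one frame of such a multi-modal sensor, with a toy decision rule showing how pressure and proximity could drive gripper control; field names, units, and thresholds are assumptions, not the paper's interface.

        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class TactileFrame:
            pressure: np.ndarray   # e.g. 8x8 capacitive taxel matrix, in kPa
            audio: np.ndarray      # short acoustic snippet from the microphone
            proximity: float       # normalized proximity reading
            ambient_light: float   # normalized ambient-light reading

        def grasp_event(frame, p_contact=1.0, prox_near=0.8):
            """Toy gripper logic: close when something is near, stop on contact."""
            if frame.pressure.max() > p_contact:
                return "contact"        # pressure spike -> object grasped
            if frame.proximity > prox_near:
                return "approach"       # object close -> keep closing
            return "idle"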

    Enabling natural human-robot physical interaction using a robotic skin feedback and a prioritized tasks robot control architecture

    This paper describes a procedure for integrating tactile sensors into a real robot in order to create a platform suitable for human-robot physical interaction experiments. Furthermore, a framework for human-robot physical interaction based on tactile feedback and prioritized task control is presented. The framework has been successfully tested by defining and executing three physical interaction tasks. A further experiment has been performed, simulating a human intervention during task execution.
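
    Prioritized task control is commonly realized with null-space projection; the sketch below shows the standard two-level law in a simplified form, with the secondary task imagined as a tactile-feedback reaction. Whether the paper uses exactly this formulation is an assumption.

        # Standard two-level prioritized control sketch (simplified form):
        #   q_dot = J1+ x1_dot + (I - J1+ J1) J2+ x2_dot
        # The secondary task runs in the null space of the primary one, so
        # e.g. a tactile-driven reaction never disturbs the main task.
        import numpy as np

        def prioritized_velocities(J1, x1_dot, J2, x2_dot):
            J1_pinv = np.linalg.pinv(J1)
            N1 = np.eye(J1.shape[1]) - J1_pinv @ J1    # null-space projector
            # Simplified secondary term; exact-priority schemes use pinv(J2 @ N1).
            return J1_pinv @ x1_dot + N1 @ np.linalg.pinv(J2) @ x2_dot

        # x2_dot could, for instance, push a contacted link away in
        # proportion to the contact force estimated from the robotic skin.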

    Contact-based robot control through tactile maps
