
    Towards comprehensive capture of human grasping and manipulation skills

    Grasping plays a central role in our daily life. To interact with the objects around them, people use a wide variety of hand configurations combined with forces ranging from the small ones involved in manipulating a pen for writing, to larger forces such as those involved in drinking from a full cup of water, and larger ones still, as when wielding a hammer. In this paper we present a setup to capture human hand configuration and motion, as well as the forces applied by the hand on objects while performing a task. Hand configuration is obtained through a data glove, while interaction forces are measured through an array of tactile sensors. Current approaches in the state of the art are limited in that they only measure interaction forces on the fingers or the palm, ignoring the important role that the sides of the fingers play in achieving a grasp or manipulation task. We propose a new setup for a “sensorized” data glove that addresses these limitations and through which a more complete picture of human hand behaviour in grasping and manipulation can be obtained. This setup was successfully tested on five subjects performing a variety of different tasks.
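The capture setup described above, a data glove for hand configuration paired with tactile arrays that also cover the sides of the fingers, can be sketched as a simple synchronized sampling loop. This is a minimal illustration: the joint count, taxel counts, and device interfaces are assumptions, not the authors' actual hardware API.

```python
import time
from dataclasses import dataclass

N_JOINTS = 22          # typical data-glove joint count (assumed)
PAD_TAXELS = 5 * 12    # taxels on the five finger pads (assumed)
SIDE_TAXELS = 5 * 6    # taxels on the finger sides, the novel coverage (assumed)
PALM_TAXELS = 24       # taxels on the palm (assumed)

@dataclass
class GraspSample:
    t: float            # capture timestamp (s)
    joints: list        # joint angles from the glove, degrees
    pad_forces: list    # one force reading per finger-pad taxel
    side_forces: list   # one force reading per finger-side taxel
    palm_forces: list   # one force reading per palm taxel

def read_sample(glove, tactile) -> GraspSample:
    """Fuse one synchronized reading from the glove and the tactile arrays."""
    return GraspSample(
        t=time.time(),
        joints=glove(),
        pad_forces=tactile("pads"),
        side_forces=tactile("sides"),
        palm_forces=tactile("palm"),
    )

# Stub devices standing in for real drivers, so the sketch is runnable.
fake_glove = lambda: [0.0] * N_JOINTS
fake_tactile = lambda region: [0.0] * {"pads": PAD_TAXELS,
                                       "sides": SIDE_TAXELS,
                                       "palm": PALM_TAXELS}[region]

sample = read_sample(fake_glove, fake_tactile)
```

A real recording session would call `read_sample` at a fixed rate and append the samples to a per-task log for later analysis.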

    Extracting data from human manipulation of objects towards improving autonomous robotic grasping

    Humans excel in manipulation tasks, a basic skill for our survival and a key feature of our man-made world of artefacts and devices. In this work, we study how humans manipulate simple everyday objects, and construct a probabilistic representation model of tasks and objects that is useful for autonomous grasping and manipulation by robotic hands. Human demonstrations of predefined object manipulation tasks are recorded from both the human-hand and the object point of view. The multimodal data acquisition system records human gaze, 6D pose of the hand and fingers, finger flexure, tactile forces distributed over the inside of the hand, colour images and stereo depth maps, as well as the 6D pose and tactile forces of instrumented objects. From the acquired data, relevant features are extracted concerning motion patterns, tactile forces, and hand-object states. This enables modelling a class of tasks from sets of repeated demonstrations of the same task, so that a generalised probabilistic representation can be derived for task planning in artificial systems. An object-centred probabilistic volumetric model is proposed to fuse the multimodal data and to map contact regions, gaze, and tactile forces during stable grasps. This model is refined by segmenting the volume into components approximated by superquadrics, and by overlaying the contact points used, taking the task context into account. Results show that the extracted features are sufficient to distinguish the key patterns that characterise each stage of the manipulation tasks, ranging from simple object displacement, where the same grasp is employed throughout the manipulation (homogeneous manipulation), to more complex interactions such as object reorientation, fine positioning, and sequential in-hand rotation (dexterous manipulation).
The framework presented retains the relevant data from human demonstrations, concerning both manipulation and object characteristics, for use by future grasp planners in artificial systems performing autonomous grasping.
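The object-centred probabilistic volumetric model can be illustrated in minimal form as a voxel grid in the object frame that accumulates contact points over repeated demonstrations; normalised counts then act as a contact-probability map over the object's surface. The grid size, resolution, and synthetic contact data below are assumptions, not the paper's actual parameters.

```python
import numpy as np

RES = 0.01                      # voxel edge length: 1 cm (assumed)
GRID = np.zeros((20, 20, 20))   # contact counts per voxel, object frame

def add_contacts(grid, points, origin=np.array([-0.1, -0.1, -0.1])):
    """Accumulate 3D contact points (metres, object frame) into the voxel grid."""
    idx = np.floor((points - origin) / RES).astype(int)
    for i, j, k in idx:
        if (0 <= i < grid.shape[0] and
                0 <= j < grid.shape[1] and
                0 <= k < grid.shape[2]):
            grid[i, j, k] += 1

# Two synthetic demonstrations touching roughly the same region of the object.
rng = np.random.default_rng(0)
for _ in range(2):
    add_contacts(GRID, rng.normal([0.0, 0.0, 0.0], 0.005, size=(50, 3)))

# Normalising the counts gives a contact-probability map for stable grasps.
contact_prob = GRID / GRID.sum()
```

The paper's refinement step, fitting superquadrics to segmented volume components and overlaying task-conditioned contacts, would operate on top of such a map.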

    A framework for digitisation of manual manufacturing task knowledge using gaming interface technology

    Intense market competition and the global skill-supply crunch are hurting the manufacturing industry, which is heavily dependent on skilled labour. To remain competitive, companies must look for innovative ways to acquire manufacturing skills from their experts and transfer them to novices and, eventually, to machines. Both manufacturing industry and research lack systematic processes for the cost-effective capture and transfer of human skills. The aim of this research is therefore to develop a framework for the digitisation of manual manufacturing task knowledge, a major constituent of which is human skill. The proposed digitisation framework is based on the theory of human-workpiece interactions that is developed in this research. The unique aspect of the framework is the use of consumer-grade gaming interface technology to capture and record manual manufacturing tasks in digital form, enabling the extraction, decoding and transfer of the manufacturing knowledge constituents associated with the task. The framework is implemented, tested and refined using five case studies: one toy assembly task, two real-life-like assembly tasks, one simulated assembly task and one real-life composite layup task. It is successfully validated on the outcomes of the case studies and a benchmarking exercise conducted to evaluate its performance.
This research contributes to knowledge in five main areas: (1) the theory of human-workpiece interactions to decipher human behaviour in manual manufacturing tasks; (2) a cohesive and holistic framework to digitise manual manufacturing task knowledge, especially tacit knowledge such as human action and reaction skills; (3) the use of low-cost gaming interface technology to capture human actions and the effect of those actions on workpieces during a manufacturing task; (4) a new way of using hidden Markov modelling to produce digital skill models that represent the human ability to perform complex tasks; and (5) the extraction and decoding of manufacturing knowledge constituents from the digital skill models.
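The hidden Markov modelling idea behind the "digital skill models" can be sketched as follows: hidden states stand for task stages (e.g. reach, align, fasten), observations are quantised hand-workpiece interaction features, and the forward algorithm scores how well a new demonstration matches a learned model. All probabilities below are illustrative, not values learned from the case studies.

```python
import numpy as np

# Illustrative 3-stage skill model: stages progress left to right.
A = np.array([[0.8, 0.2, 0.0],      # stage transition probabilities
              [0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0]])
B = np.array([[0.9, 0.1],           # P(observation symbol | stage)
              [0.2, 0.8],
              [0.7, 0.3]])
pi = np.array([1.0, 0.0, 0.0])      # tasks start in the first stage

def log_likelihood(obs):
    """Scaled forward algorithm: log P(obs | model)."""
    alpha = pi * B[:, obs[0]]       # joint prob. of first obs and each stage
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()             # scaling factor avoids underflow
        ll += np.log(c)
        alpha = alpha / c
    return ll

# A demonstration that follows the expected stage pattern scores higher
# than one that does not.
good = log_likelihood([0, 0, 1, 1, 0])
bad = log_likelihood([1, 1, 0, 0, 1])
```

In the framework, one such model would be trained per task from repeated expert demonstrations, and the likelihood used to assess how closely a novice's attempt matches the expert's skill.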

    Diseño y proyecto de un dispositivo ortoprotésico de miembro superior

    At a time when technology is expanding and solutions are being sought to major problems, there is a gap in the market regarding upper-limb orthoprosthetic devices, better known as upper-limb prostheses or arm or hand prostheses. Although several options exist, all the functional ones have been found to be very expensive, so they are inaccessible to a large proportion of the users of this type of prosthesis. Based on a study of the current market, the available technologies and the target user, a myoelectric hand prosthesis is developed. It works through electrical impulses, emitted by the user's muscles, which activate different motors to produce the movement of the fingers. At the end of the project, without the electrical part having been developed, the result is the design of a myoelectric hand prosthesis with a conceptually designed socket. The hand designed allows independent movement of the thumb, index and middle fingers, as well as joint movement of the ring and little fingers. In addition, it offers a low production cost and allows aesthetic customisation, the replacement of individual parts without discarding the whole product, and future programming of the finger movements.
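The myoelectric control principle described, muscle impulses activating the motors of mapped finger groups, can be sketched as a threshold test on a rectified, smoothed EMG envelope per channel. The channel-to-finger mapping, the window length, and the threshold value are assumptions for illustration only.

```python
FINGER_GROUPS = ["thumb", "index", "middle", "ring+little"]
THRESHOLD = 0.3   # normalised EMG envelope level that triggers a motor (assumed)

def envelope(samples, window=5):
    """Moving average of rectified EMG samples: a simple signal envelope."""
    rect = [abs(s) for s in samples]
    tail = rect[-window:]
    return sum(tail) / len(tail)

def active_fingers(channels):
    """Return the finger groups whose EMG channel crosses the threshold."""
    return [finger for finger, signal in zip(FINGER_GROUPS, channels)
            if envelope(signal) > THRESHOLD]

# Example: strong muscle activity on the index channel only.
quiet = [0.02, -0.01, 0.03, -0.02, 0.01]
strong = [0.5, -0.6, 0.55, -0.4, 0.5]
moving = active_fingers([quiet, strong, quiet, quiet])   # ["index"]
```

A real controller would run this test continuously on streamed EMG samples and drive the corresponding motor while the channel stays above threshold.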