
    Biologically-Inspired 3D Grasp Synthesis Based on Visual Exploration

    Object grasping is a typical human ability that is widely studied from both a biological and an engineering point of view. This paper presents an approach to grasp synthesis inspired by the human neurophysiology of action-oriented vision. Our grasp synthesis method is built upon an architecture which, taking into account the differences between robotic and biological systems, proposes an adaptation of brain models to the peculiarities of robotic setups. The architecture's modularity allows for scalability and integration of complex robotic tasks. The grasp synthesis is designed to be integrated with the extraction of a 3D object description, so that the visual analysis of the object is actively driven by the needs of the grasp synthesis: visual reconstruction is performed incrementally and selectively on the regions of the object that are considered most interesting for grasping.
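    The key mechanism here is that reconstruction effort is allocated by grasp-related interest rather than spent uniformly over the object. Below is a minimal sketch of that control flow; it is not the paper's implementation, and the regions, scores, and threshold are invented placeholders.

```python
# Minimal sketch (not the paper's implementation) of grasp-driven visual
# exploration: reconstruction effort is spent only on the object regions that
# currently look most promising for grasping. All scores are synthetic.
import random

random.seed(0)

# Hypothetical object regions with a coarse "graspability" prior in [0, 1].
regions = {f"region_{i}": random.random() for i in range(8)}
reconstructed = {}          # region -> refined description (here: a score)
GRASP_THRESHOLD = 0.85      # stop once one region is judged good enough

while regions:
    # Select the region the grasp module currently finds most interesting.
    best = max(regions, key=regions.get)
    prior = regions.pop(best)

    # "Reconstruct" only that region: refine the estimate with new visual data
    # (simulated here as the prior plus a small correction).
    refined = min(1.0, prior + random.uniform(-0.1, 0.2))
    reconstructed[best] = refined
    print(f"explored {best}: prior={prior:.2f} refined={refined:.2f}")

    if refined >= GRASP_THRESHOLD:
        print(f"grasp synthesized on {best}; remaining regions never reconstructed")
        break
```

    The point of the sketch is the stopping behavior: regions that never become interesting for grasping are never reconstructed at all.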

    On Neuromechanical Approaches for the Study of Biological Grasp and Manipulation

    Biological and robotic grasp and manipulation are undeniably similar at the level of mechanical task performance. However, their underlying fundamental biological vs. engineering mechanisms are, by definition, dramatically different and can even be antithetical. Even our approach to each is diametrically opposite: inductive science for the study of biological systems vs. engineering synthesis for the design and construction of robotic systems. The past 20 years have seen several conceptual advances in both fields and the quest to unify them. Chief among them is the reluctant recognition that their underlying fundamental mechanisms may actually share limited common ground, while exhibiting many fundamental differences. This recognition is particularly liberating because it allows us to resolve and move beyond multiple paradoxes and contradictions that arose from the initial reasonable assumption of a large common ground. Here, we begin by introducing the perspective of neuromechanics, which emphasizes that real-world behavior emerges from the intimate interactions among the physical structure of the system, the mechanical requirements of a task, the feasible neural control actions to produce it, and the ability of the neuromuscular system to adapt through interactions with the environment. This allows us to articulate a succinct overview of a few salient conceptual paradoxes and contradictions regarding under-determined vs. over-determined mechanics, under- vs. over-actuated control, prescribed vs. emergent function, learning vs. implementation vs. adaptation, prescriptive vs. descriptive synergies, and optimal vs. habitual performance. We conclude by presenting open questions and suggesting directions for future research. We hope this frank assessment of the state-of-the-art will encourage and guide these communities to continue to interact and make progress in these important areas
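    The under- vs. over-actuated control paradox mentioned above can be made concrete with a small numerical example. The sketch below is an assumed illustration, not material from the paper: the moment-arm matrix and force values are invented, and it simply shows that a tendon-driven finger with more tendons than joints admits many non-negative muscle-force patterns that realize the same joint torques.

```python
# Assumed numerical illustration of over-actuation: a planar 2-joint finger
# driven by 4 tendons. Joint torques are tau = R @ f with f >= 0 (muscles can
# only pull), so many distinct muscle-force vectors produce the same torques.
import numpy as np

R = np.array([            # invented moment-arm matrix: 2 joints x 4 tendons
    [1.0, -1.0, 0.5, -0.5],
    [0.0,  0.0, 1.0, -1.0],
])

tau_goal = np.array([0.2, 0.1])   # desired joint torques (arbitrary units)

# Two distinct non-negative muscle-force patterns that both realize tau_goal.
f_a = np.array([0.45, 0.30, 0.30, 0.20])
f_b = np.array([0.70, 0.55, 0.60, 0.50])

for name, f in [("f_a", f_a), ("f_b", f_b)]:
    tau = R @ f
    print(name, "-> tau =", tau, "matches goal:", np.allclose(tau, tau_goal))
```

    This redundancy is what makes questions such as prescriptive vs. descriptive synergies and optimal vs. habitual performance non-trivial: the task alone does not select a unique muscle-force pattern.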

    Puffy: A Step-by-step Guide to Craft Bio-inspired Artifacts with Interactive Materiality

    A rising number of HCI scholars have begun to use materiality as a starting point for exploring a design's potential and restrictions. Despite this theoretical flourishing, practical design processes and instruction for beginner practitioners are still scarce. We leveraged the pictorial format to illustrate our crafting process for Puffy, a bio-inspired artifact that features a cilia-mimetic surface expressing anthropomorphic qualities through shape changes. Our approach consists of three key activities (analysis, synthesis, and detailing) interlaced recursively throughout the journey. Using this approach, we analyzed different input sources, synthesized peers' critiques and self-reflection, and detailed the designed experience with iterative prototypes. Building on a reflective analysis of our approach, we concluded with a set of practical implications and design recommendations to help other practitioners initiate their own investigations in interactive materiality. Comment: 17th International Conference on Tangible, Embedded, and Embodied Interaction

    Bio-Inspired Motion Strategies for a Bimanual Manipulation Task

    Steffen JF, Elbrechter C, Haschke R, Ritter H. Bio-Inspired Motion Strategies for a Bimanual Manipulation Task. In: International Conference on Humanoid Robots (Humanoids). 2010.

    Data-Driven Shape Analysis and Processing

    Data-driven methods play an increasingly important role in discovering geometric, structural, and semantic relationships between 3D shapes in collections, and in applying this analysis to support intelligent modeling, editing, and visualization of geometric data. In contrast to traditional approaches, a key feature of data-driven approaches is that they aggregate information from a collection of shapes to improve the analysis and processing of individual shapes. In addition, they are able to learn models that reason about properties and relationships of shapes without relying on hard-coded rules or explicitly programmed instructions. We provide an overview of the main concepts and components of these techniques and discuss their application to shape classification, segmentation, matching, reconstruction, modeling and exploration, as well as scene analysis and synthesis, by reviewing the literature and relating existing works with both qualitative and numerical comparisons. We conclude our report with ideas that can inspire future research in data-driven shape analysis and processing. Comment: 10 pages, 19 figures
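    The aggregation idea, analyzing a new shape against an entire collection rather than in isolation, can be sketched with a toy retrieval example. Everything below (the D2-style descriptor, the synthetic sphere and box shapes, the nearest-neighbor rule) is an invented illustration, not a method from the survey.

```python
# Toy sketch of data-driven shape classification: describe every shape in a
# labeled collection by a D2-style histogram of pairwise point distances, and
# classify a new shape by its nearest descriptor in the collection.
import numpy as np

rng = np.random.default_rng(0)

def d2_descriptor(points, bins=16, n_pairs=2000):
    """Normalized histogram of distances between random point pairs."""
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 2.0), density=True)
    return hist

def sample_sphere(n=500, r=0.5):
    v = rng.normal(size=(n, 3))
    return r * v / np.linalg.norm(v, axis=1, keepdims=True)

def sample_box(n=500, half=0.5):
    return rng.uniform(-half, half, size=(n, 3))

# Labeled collection: the "data" that data-driven methods aggregate over.
collection = [(sample_sphere(), "sphere") for _ in range(5)] + \
             [(sample_box(), "box") for _ in range(5)]
descriptors = [(d2_descriptor(p), label) for p, label in collection]

# Classify an unseen shape by nearest neighbor over the whole collection.
query = d2_descriptor(sample_sphere())
pred = min(descriptors, key=lambda dl: np.linalg.norm(dl[0] - query))[1]
print("predicted class:", pred)   # should print "sphere"
```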

    Neurally Plausible Model of Robot Reaching Inspired by Infant Motor Babbling

    In this dissertation, we present an abstract model of infant reaching that is neurally-plausible. This model is grounded in embodied artificial intelligence, which emphasizes the importance of the sensorimotor interaction of an agent and the world. It includes both learning sensorimotor correlations through motor babbling and also arm motion planning using spreading activation. We introduce a mechanism called bundle formation as a way to generalize motions during the motor babbling stage. We then offer a neural model for the abstract model, which is composed of three layers of neural maps with parallel structures representing the same sensorimotor space. The motor babbling period shapes the structure of the three neural maps as well as the connections within and between them; these connections encode trajectory bundles in the neural maps. We then investigate an implementation of the neural model using a reaching task on a humanoid robot. Through a set of experiments, we were able to find the best way to implement different components of this model such as motor babbling, neural representation of sensorimotor space, dimension reduction, path planning, and path execution. After the proper implementation had been found, we conducted another set of experiments to analyze the model and evaluate the planned motions. We evaluated unseen reaching motions using jerk, end effector error, and overshooting. In these experiments, we studied the effect of different dimensionalities of the reduced sensorimotor space, different bundle widths, and different bundle structures on the quality of arm motions. We hypothesized a larger bundle width would allow the model to generalize better. The results confirmed that the larger bundles lead to a smaller error of end-effector position for testing targets. An experiment with the resolution of neural maps showed that a neural map with a coarse resolution produces less smooth motions compared to a neural map with a fine resolution. We also compared the unseen reaching motions under different dimensionalities of the reduced sensorimotor space. The results showed that a smaller dimension leads to less smooth and accurate movements
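    A rough, assumed simplification of the babbling-then-reaching pipeline described above is sketched below. It is not the dissertation's neural-map model: the planar two-link arm, joint ranges, target, and nearest-neighbor lookup are stand-ins for the neural maps, trajectory bundles, and spreading activation.

```python
# Assumed simplification of motor babbling: issue random motor commands,
# record (motor, sensory) pairs, then reach by retrieving the babbled posture
# whose remembered hand position is closest to the target.
import numpy as np

rng = np.random.default_rng(1)
L1, L2 = 0.3, 0.25                      # link lengths (m), invented values

def forward_kinematics(q):
    """Hand position of a planar 2-link arm for joint angles q = (q1, q2)."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

# --- Motor babbling: random motor commands plus the observed sensory outcome.
postures = rng.uniform([0.0, 0.0], [np.pi, np.pi], size=(2000, 2))
hand_positions = np.array([forward_kinematics(q) for q in postures])

# --- Reaching: pick the babbled posture whose outcome best matches the
# target (a crude stand-in for map lookup and spreading activation).
target = np.array([0.35, 0.30])
best = np.argmin(np.linalg.norm(hand_positions - target, axis=1))
q_reach = postures[best]

print("target:", target)
print("selected posture:", q_reach)
print("reached hand position:", forward_kinematics(q_reach))
```

    In the model described above, the babbled experience shapes neural maps and trajectory bundles rather than a raw lookup table, but the underlying sensorimotor-correlation idea is the same.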