
    Sensorimotor coordination and metastability in a situated HKB model

    Oscillatory phenomena are ubiquitous in nature and have become particularly relevant for the study of brain and behaviour. One of the simplest, yet explanatorily powerful, models of oscillatory Coordination Dynamics is the Haken–Kelso–Bunz (HKB) model. The metastable regime described by the HKB equation has been hypothesised to be the signature of brain oscillatory dynamics underlying sensorimotor coordination. Despite evidence supporting such a hypothesis, to our knowledge, there are still very few models (if any) where the HKB equation generates spatially situated behaviour and, at the same time, has its dynamics modulated by the behaviour it generates (by means of the sensory feedback resulting from body movement). This work presents a computational model where the HKB equation controls an agent performing a simple gradient climbing task and shows (i) how different metastable dynamical patterns in the HKB equation are generated and sustained by the continuous interaction between the agent and its environment; and (ii) how the emergence of functional metastable patterns in the HKB equation – i.e. patterns that generate gradient climbing behaviour – depends not only on the structure of the agent's sensory input but also on the coordinated coupling of the agent's motor–sensory dynamics. This work contributes to Kelso's theoretical framework and also to the understanding of neural oscillations and sensorimotor coordination.
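    For readers unfamiliar with the model, the HKB relative-phase dynamics can be sketched in a few lines. The parameter values below are illustrative only and are not taken from the paper: with the frequency detuning just above the critical value at which the fixed points vanish, the system is metastable, drifting through phase space while dwelling near the ghosts of the former attractors.

```python
import numpy as np

def hkb_phase(phi, delta_omega, a, b):
    """Rate of change of the relative phase in the HKB model:
    d(phi)/dt = delta_omega - a*sin(phi) - 2*b*sin(2*phi)."""
    return delta_omega - a * np.sin(phi) - 2 * b * np.sin(2 * phi)

def simulate(phi0=0.1, delta_omega=1.3, a=1.0, b=0.25, dt=0.01, steps=5000):
    """Euler integration. With a=1, b=0.25 the fixed points vanish for
    detunings above ~1.3, so delta_omega=1.3 puts the system in the
    metastable regime: phi drifts but dwells near the former attractor."""
    phi = np.empty(steps)
    phi[0] = phi0
    for t in range(1, steps):
        phi[t] = phi[t - 1] + dt * hkb_phase(phi[t - 1], delta_omega, a, b)
    return phi
```

    In the paper's setting, sensory feedback would additionally modulate the parameters on each step, closing the agent–environment loop.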

    Beyond Gazing, Pointing, and Reaching: A Survey of Developmental Robotics

    Developmental robotics is an emerging field at the intersection of developmental psychology and robotics that has recently attracted considerable attention. This paper surveys a variety of research projects dealing with or inspired by developmental issues and outlines possible future directions.

    The implications of embodiment for behavior and cognition: animal and robotic case studies

    In this paper, we will argue that if we want to understand the function of the brain (or the control in the case of robots), we must understand how the brain is embedded into the physical system, and how the organism interacts with the real world. While embodiment has often been used in its trivial meaning, i.e. 'intelligence requires a body', the concept has deeper and more important implications, concerned with the relation between physical and information (neural, control) processes. A number of case studies are presented to illustrate the concept. These involve animals and robots and are concentrated around locomotion, grasping, and visual perception. A theoretical scheme that can be used to embed the diverse case studies will be presented. Finally, we will establish a link between the low-level sensory-motor processes and cognition. We will present an embodied view on categorization, and propose the concepts of 'body schema' and 'forward models' as a natural extension of the embodied approach toward first representations. (Book chapter in W. Tschacher & C. Bergomi, eds., 'The Implications of Embodiment: Cognition and Communication', Exeter: Imprint Academic, pp. 31-5.)

    Neural development and sensorimotor control

    What is the relationship between development of the nervous system and the emergence of voluntary motor behavior? This is the central question of the nature-nurture discussion that has intrigued child psychologists and pediatric neurologists for decades. This paper attempts to revisit this issue. Recent empirical evidence on how infants acquire multi-joint coordination and how children learn to adapt to novel force environments will be discussed with reference to the underlying development of the nervous system. The claim will be made that the developing human nervous system by no means constitutes an ideal controller. However, its redundancy, its ability to integrate multi-modal sensory information and motor commands, and its capacity for time-critical neural plasticity are features that may prove useful for the design of adaptive robots.

    Action-based effects on music perception

    The classical, disembodied approach to music cognition conceptualizes action and perception as separate, peripheral processes. In contrast, embodied accounts of music cognition emphasize the central role of the close coupling of action and perception. It is a commonly established fact that perception spurs action tendencies. We present a theoretical framework that captures the ways in which the human motor system and its actions can reciprocally influence the perception of music. The cornerstone of this framework is the common coding theory, postulating a representational overlap in the brain between the planning, the execution, and the perception of movement. The integration of action and perception in so-called internal models is explained as a result of associative learning processes. Characteristic of internal models is that they allow intended or perceived sensory states to be transferred into corresponding motor commands (inverse modeling), and vice versa, to predict the sensory outcomes of planned actions (forward modeling). Embodied accounts typically refer to inverse modeling to explain action effects on music perception (Leman, 2007). We extend this account by pinpointing forward modeling as an alternative mechanism by which action can modulate perception. We provide an extensive overview of recent empirical evidence in support of this idea. Additionally, we demonstrate that motor dysfunctions can cause perceptual disabilities, supporting the main idea of the paper that the human motor system plays a functional role in auditory perception. The finding that music perception is shaped by the human motor system and its actions suggests that the musical mind is highly embodied. However, we advocate for a more radical approach to embodied (music) cognition in the sense that it needs to be considered as a dynamical process, in which aspects of action, perception, introspection, and social interaction are of crucial importance.
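    The forward/inverse-model distinction can be made concrete with a toy linear example. Everything below is an assumption for illustration (the linear plant, the delta rule, and the dimensions are not the paper's model): a forward model learns by association to predict the sensory outcome of a motor command, and inverting the learned map turns a desired sensory state back into a motor command.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical plant: motor command u -> sensory outcome s (deterministic).
true_map = np.array([[0.8, -0.3],
                     [0.2,  0.9]])
def plant(u):
    return true_map @ u

# Forward model: learns to predict the sensory outcome of a motor command
# via associative (delta-rule) learning on randomly explored commands.
W = np.zeros((2, 2))
lr = 0.1
for _ in range(2000):
    u = rng.normal(size=2)
    s = plant(u)                       # actual sensory outcome
    s_hat = W @ u                      # forward-model prediction
    W += lr * np.outer(s - s_hat, u)   # reduce the prediction error

# Inverse model (here: pseudoinverse of the learned forward map) maps a
# desired sensory state to the motor command that should produce it.
s_desired = np.array([1.0, 0.5])
u_cmd = np.linalg.pinv(W) @ s_desired
```

    After learning, executing `u_cmd` on the plant reproduces the desired sensory state, which is the sense in which the two models are inverses of one another.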

    Quantifying the Evolutionary Self Structuring of Embodied Cognitive Networks

    We outline a possible theoretical framework for the quantitative modeling of networked embodied cognitive systems. We notice that: 1) information self structuring through sensory-motor coordination does not deterministically occur in a generic multivariable vector space R^n, but in SE(3), the group of possible rigid-body motions of a body in space; 2) it happens in a stochastic, open-ended environment. These observations may simplify, at the price of a certain abstraction, the modeling and the design of self organization processes based on the maximization of some informational measures, such as mutual information. Furthermore, by providing closed form or computationally lighter algorithms, it may significantly reduce the computational burden of their implementation. We propose a modeling framework which aims to give new tools for the design of networks of new artificial self organizing, embodied and intelligent agents and the reverse engineering of natural ones. At this point, it remains largely a theoretical conjecture, and whether this model will be useful in practice has still to be experimentally verified.
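    As a concrete handle on "maximization of some informational measures, such as mutual information", a minimal plug-in estimator of the mutual information between two scalar sensorimotor variables might look as follows. The histogram estimator and bin count are illustrative choices, not part of the proposed framework:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples,
    computed from a 2-D histogram of (x, y)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                         # joint distribution
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y
    nz = pxy > 0                             # avoid log(0) on empty cells
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())
```

    A self-organization loop in the spirit of the abstract would then adjust controller parameters so as to increase this quantity between sensor and motor streams.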

    Muscle synergies in neuroscience and robotics: from input-space to task-space perspectives

    In this paper we review the work on muscle synergies that has been carried out in neuroscience and control engineering. In particular, we refer to the hypothesis that the central nervous system (CNS) generates desired muscle contractions by combining a small number of predefined modules, called muscle synergies. We provide an overview of the methods that have been employed to test the validity of this scheme, and we show how the concept of muscle synergy has been generalized for the control of artificial agents. The comparison between these two lines of research, in particular their different goals and approaches, is instrumental to explain the computational implications of the hypothesized modular organization. Moreover, it clarifies the importance of assessing the functional role of muscle synergies: although these basic modules are defined at the level of muscle activations (input-space), they should result in the effective accomplishment of the desired task. This requirement is not always explicitly considered in experimental neuroscience, as muscle synergies are often estimated solely by analyzing recorded muscle activities. We suggest that synergy extraction methods should explicitly take into account task execution variables, thus moving from a perspective purely based on input-space to one grounded on task-space as well.
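    The input-space pipeline the authors refer to, estimating synergies from recorded muscle activity alone, is commonly implemented with non-negative matrix factorization (NMF). A minimal sketch using Lee-Seung multiplicative updates; the matrix shapes and iteration count are arbitrary illustrative choices:

```python
import numpy as np

def extract_synergies(M, n_syn, iters=1000, seed=0):
    """Factor a non-negative muscle-activity matrix M (muscles x time)
    as M ~ W @ H, where W (muscles x synergies) holds the synergy
    vectors and H (synergies x time) their activation coefficients.
    Uses Lee-Seung multiplicative updates for the Frobenius loss."""
    rng = np.random.default_rng(seed)
    m, t = M.shape
    W = rng.random((m, n_syn)) + 1e-3
    H = rng.random((n_syn, t)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ M) / (W.T @ W @ H + 1e-9)
        W *= (M @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H
```

    The paper's point is precisely that a low reconstruction error in this input space does not by itself establish that the extracted modules are functional: task-space variables should enter the extraction as well.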