65 research outputs found

    The implications of embodiment for behavior and cognition: animal and robotic case studies

    In this paper we argue that if we want to understand the function of the brain (or of the controller, in the case of robots), we must understand how the brain is embedded in the physical system and how the organism interacts with the real world. While embodiment has often been used in its trivial sense, i.e. 'intelligence requires a body', the concept has deeper and more important implications, concerned with the relation between physical and information (neural, control) processes. A number of case studies involving animals and robots, centred on locomotion, grasping, and visual perception, are presented to illustrate the concept, together with a theoretical scheme in which the diverse case studies can be embedded. Finally, we establish a link between low-level sensory-motor processes and cognition: we present an embodied view of categorization and propose the concepts of 'body schema' and 'forward models' as a natural extension of the embodied approach toward first representations. Comment: Book chapter in W. Tschacher & C. Bergomi, ed., 'The Implications of Embodiment: Cognition and Communication', Exeter: Imprint Academic, pp. 31-5

    Self-Organization in a Parametrically Coupled Logistic Map Network: A Model for Information Processing in the Visual Cortex

    In this paper we describe a new model that seeks to emulate the way the visual cortex processes information and interacts with subcortical areas to produce higher-level brain functions. We developed a macroscopic approach that incorporates salient attributes of the cortex by combining tools from nonlinear dynamics and information theory with the known organizational and anatomical features of the cortex. Justifications for this approach and demonstrations of its effectiveness are presented. We also demonstrate the model's capability to produce efficient sparse representations and to provide cortical computational maps.
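A minimal sketch of the parametric-coupling idea mentioned above: a ring of logistic map units in which each unit's nonlinearity parameter is modulated by the activity of its neighbours. The lattice size, parameter range, and coupling rule here are invented for illustration and are not the paper's model.

```python
N = 16                      # number of map units (illustrative)
EPS = 0.3                   # coupling strength (illustrative)
A_MIN, A_MAX = 3.5, 4.0     # range of the logistic parameter

def step(x):
    """One synchronous update of the parametrically coupled ring."""
    new = []
    for i in range(N):
        left, right = x[(i - 1) % N], x[(i + 1) % N]
        # Mix a unit's own state with neighbour activity, then map the
        # mixture onto a logistic parameter a_i in [A_MIN, A_MAX].
        drive = (1 - EPS) * x[i] + EPS * 0.5 * (left + right)
        a = A_MIN + (A_MAX - A_MIN) * drive
        new.append(a * x[i] * (1 - x[i]))
    return new

x = [0.1 + 0.8 * i / N for i in range(N)]   # initial states in (0, 1)
for _ in range(200):
    x = step(x)
```

Because the effective parameter stays below 4, each update maps the open interval (0, 1) into itself, so the network's activity remains bounded while the per-unit dynamics stay in the chaotic regime.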

    Dynamics Model Abstraction Scheme Using Radial Basis Functions

    This paper presents a control model for object manipulation. Properties of objects and environmental conditions influence motor control and learning, and the system dynamics depend on an unobserved external context, for example the work load of a robot manipulator. The dynamics of a robot arm change as it manipulates objects with different physical properties, for example mass, shape, or mass distribution. We address active sensing strategies for acquiring object dynamics models with a radial basis function (RBF) neural network. Experiments are carried out with a real robot arm, and trajectory data are gathered during trials manipulating different objects. Biped robots do not have high-force joint servos, and the control system can hardly compensate for all of the inertia variation of the adjacent joints and the disturbance torques that arise in dynamic gait control. In order to achieve smoother control and more reliable sensorimotor complexes, we evaluate and compare a sparse velocity-driven versus a dense position-driven control scheme.
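As a toy illustration of the RBF approach, the sketch below fits a one-dimensional input-output mapping with a Gaussian radial basis function network whose linear output weights are trained by LMS updates. The target function, centres, and training schedule are invented for this sketch; the paper's robot data and models are not reproduced.

```python
import math

CENTERS = [0.0, 0.25, 0.5, 0.75, 1.0]   # RBF centres (illustrative)
WIDTH = 0.2                              # shared Gaussian width

def features(x):
    """Gaussian RBF activations for a scalar input."""
    return [math.exp(-((x - c) / WIDTH) ** 2) for c in CENTERS]

def predict(w, x):
    return sum(wi * fi for wi, fi in zip(w, features(x)))

# toy "dynamics" samples: a torque-like response over one joint cycle
data = [(i / 20.0, math.sin(2 * math.pi * i / 20.0)) for i in range(21)]

def mse(w):
    return sum((predict(w, x) - y) ** 2 for x, y in data) / len(data)

w = [0.0] * len(CENTERS)
err_before = mse(w)
for _ in range(500):                     # LMS updates on the linear weights
    for x, y in data:
        phi = features(x)
        e = predict(w, x) - y
        w = [wi - 0.05 * e * fi for wi, fi in zip(w, phi)]
err_after = mse(w)
```

Because the model is linear in its output weights, the LMS iteration converges to the least-squares fit; only the centres and widths introduce nonlinearity.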

    A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems

    In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
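One small step in any biology-to-hardware translation of the kind described above is quantising a continuous model parameter onto a bounded integer configuration register, then translating back for comparison with a reference software simulation. The sketch below illustrates only that generic idea; the register width, parameter range, and function names are invented and do not describe the actual device or the PyNN interface.

```python
HW_MIN, HW_MAX = 0, 1023          # a hypothetical 10-bit configuration register
TAU_MIN, TAU_MAX = 1.0, 100.0     # hypothetical supported membrane time constants (ms)

def to_hardware(tau_ms):
    """Clip a time constant to the supported range and quantise it."""
    tau = min(max(tau_ms, TAU_MIN), TAU_MAX)
    frac = (tau - TAU_MIN) / (TAU_MAX - TAU_MIN)
    return round(HW_MIN + frac * (HW_MAX - HW_MIN))

def to_biology(register):
    """Map a register value back to the biological parameter domain."""
    frac = (register - HW_MIN) / (HW_MAX - HW_MIN)
    return TAU_MIN + frac * (TAU_MAX - TAU_MIN)

tau = 20.0
reg = to_hardware(tau)
tau_back = to_biology(reg)
# the round-trip distortion is bounded by half a quantisation step
step = (TAU_MAX - TAU_MIN) / (HW_MAX - HW_MIN)
```

Bounding this round-trip error per parameter is one ingredient of the kind of automated comparison between hardware configurations and reference simulations that the workflow describes.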

    Can the design of Space alter the stress response?


    Event-based Vision: A Survey

    Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of the brightness changes. Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (on the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz) resulting in reduced motion blur. Hence, event cameras have a large potential for robotics and computer vision in scenarios that are challenging for traditional cameras, such as low latency, high speed, and high dynamic range. However, novel methods are required to process the unconventional output of these sensors in order to unlock their potential. This paper provides a comprehensive overview of the emerging field of event-based vision, with a focus on the applications and the algorithms developed to unlock the outstanding properties of event cameras. We present event cameras, covering their working principle, the sensors that are available, and the tasks they have been used for, from low-level vision (feature detection and tracking, optic flow, etc.) to high-level vision (reconstruction, segmentation, recognition). We also discuss the techniques developed to process events, including learning-based techniques, as well as specialized processors for these novel sensors, such as spiking neural networks. Additionally, we highlight the challenges that remain to be tackled and the opportunities that lie ahead in the search for a more efficient, bio-inspired way for machines to perceive and interact with the world.
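The event-generation principle described above can be sketched in a few lines: a pixel emits an event whenever its log-brightness has changed by more than a contrast threshold since the last event, with the sign encoding polarity. The threshold value and the test signal below are invented for illustration.

```python
import math

C = 0.2  # contrast threshold in log-intensity units (illustrative)

def events_for_pixel(samples, times, c=C):
    """Emit (timestamp, polarity) events for one pixel's brightness samples."""
    events = []
    ref = math.log(samples[0])            # reference log-brightness
    for t, s in zip(times[1:], samples[1:]):
        logs = math.log(s)
        # emit one event per threshold crossing (possibly several per sample)
        while abs(logs - ref) >= c:
            pol = 1 if logs > ref else -1
            ref += pol * c                # advance the reference one step
            events.append((t, pol))
    return events

# brightness ramping up and back down -> ON events followed by OFF events
times = [i * 1e-3 for i in range(11)]                              # 1 ms samples
bright = [math.exp(0.5 * math.sin(math.pi * i / 10)) for i in range(11)]
evs = events_for_pixel(bright, times)
```

Note that the output is sparse: a constant-brightness pixel produces no events at all, which is the property that gives these sensors their low data rate and reduced redundancy.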

    Neuromorphic silicon neuron circuits

    23 pages, 21 figures, 2 tables.

    Hardware implementations of spiking neurons can be extremely useful for a large variety of applications, ranging from high-speed modeling of large-scale neural systems, to real-time behaving systems, to bidirectional brain–machine interfaces. The specific circuit solutions used to implement silicon neurons depend on the application requirements. In this paper we describe the most common building blocks and techniques used to implement these circuits and present an overview of a wide range of neuromorphic silicon neurons, which implement different computational models, ranging from biophysically realistic, conductance-based Hodgkin–Huxley models to two-dimensional generalized adaptive integrate-and-fire models. We compare the design methodologies used for each silicon neuron described and demonstrate their features with experimental results measured from a wide range of fabricated VLSI chips.

    This work was supported by the EU ERC grant 257219 (neuroP), the EU ICT FP7 grants 231467 (eMorph), 216777 (NABAB), 231168 (SCANDLE), and 15879 (FACETS), by the Swiss National Science Foundation grant 119973 (SoundRec), by the UK EPSRC grant no. EP/C010841/1, by the Spanish grants (with support from the European Regional Development Fund) TEC2006-11730-C03-01 (SAMANTA2) and TEC2009-10639-C04-01 (VULCANO), by the Andalusian grant no. P06TIC01417 (Brain System), and by the Australian Research Council grants no. DP0343654 and no. DP0881219.

    Peer Reviewed
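As a software-only sketch (not a circuit) of the adaptive integrate-and-fire model family mentioned above: a leaky membrane integrates an input current, fires when it crosses a threshold, and a spike-triggered adaptation current slows subsequent spiking. All parameter values are illustrative and are not taken from any of the surveyed chips.

```python
def simulate_aif(i_in, dt=0.1, t_end=200.0,
                 tau_m=20.0, v_rest=-70.0, v_thresh=-50.0, v_reset=-65.0,
                 tau_w=100.0, b=2.0):
    """Leaky integrate-and-fire with a spike-triggered adaptation current w.

    Units: mV for voltages, ms for times; i_in is in mV-equivalent drive.
    Forward-Euler integration; returns the list of spike times.
    """
    v, w = v_rest, 0.0
    spikes = []
    t = 0.0
    while t < t_end:
        dv = (-(v - v_rest) + i_in - w) / tau_m   # leaky integration
        dw = -w / tau_w                            # adaptation decay
        v += dv * dt
        w += dw * dt
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
            w += b          # adaptation kick: each spike delays the next
        t += dt
    return spikes

spikes = simulate_aif(i_in=30.0)
# adaptation stretches the inter-spike intervals over time
isis = [b - a for a, b in zip(spikes, spikes[1:])]
```

The growing inter-spike intervals reproduce spike-frequency adaptation, the qualitative behaviour that distinguishes this model class from a plain integrate-and-fire neuron.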

    Architecture, Space and Information in Constructions Built by Humans and Social Insects: a Conceptual Review

    The similarities between the structures built by social insects and those built by humans have led to a convergence of interests between biologists and architects. This new, de facto interdisciplinary community of scholars needs a common terminology and a theoretical framework in which to ground its work. In this conceptually oriented review paper, we examine the terms “information”, “space” and “architecture” to provide definitions that span biology and architecture. We propose a framework to support interdisciplinary exchange, with the view that it will aid cross-fertilisation between the disciplines working on collective behaviour and on the analysis of the structures and edifices constructed by non-humans, and will help this area of study contribute to the field of architecture. We then use these definitions to discuss the informational content of constructions built by organisms and the influence these constructions have on behaviour, and vice versa. We review how spatial constraints inform and influence the interaction between an organism and its environment, and examine the reciprocal effects of space and information on construction and on the behaviour of humans and social insects.