7,249 research outputs found

    Incremental embodied chaotic exploration of self-organized motor behaviors with proprioceptor adaptation

    Get PDF
    This paper presents a general and fully dynamic embodied artificial neural system, which incrementally explores and learns motor behaviors through an integrated combination of chaotic search and reflex learning. The former uses adaptive bifurcation to exploit the intrinsic chaotic dynamics arising from neuro-body-environment interactions, while the latter is based on proprioceptor adaptation. The overall iterative search process formed from this combination is shown to have a close relationship to evolutionary methods. The architecture developed here allows real-time goal-directed exploration and learning of the possible motor patterns (e.g., for locomotion) of embodied systems of arbitrary morphology. Examples of its successful application to a simple biomechanical model, a simulated swimming robot, and a simulated quadruped robot are given. The tractability of the biomechanical systems allows detailed analysis of the overall dynamics of the search process. This analysis sheds light on the strong parallels with evolutionary search.

    SOVEREIGN: An Autonomous Neural System for Incrementally Learning Planned Action Sequences to Navigate Towards a Rewarded Goal

    Full text link
    How do reactive and planned behaviors interact in real time? How are sequences of such behaviors released at appropriate times during autonomous navigation to realize valued goals? Controllers for both animals and mobile robots, or animats, need reactive mechanisms for exploration, and learned plans to reach goal objects once an environment becomes familiar. The SOVEREIGN (Self-Organizing, Vision, Expectation, Recognition, Emotion, Intelligent, Goal-oriented Navigation) animat model embodies these capabilities, and is tested in a 3D virtual reality environment. SOVEREIGN includes several interacting subsystems which model complementary properties of cortical What and Where processing streams and which clarify similarities between mechanisms for navigation and arm movement control. As the animat explores an environment, visual inputs are processed by networks that are sensitive to visual form and motion in the What and Where streams, respectively. Position-invariant and size-invariant recognition categories are learned by real-time incremental learning in the What stream. Estimates of target position relative to the animat are computed in the Where stream, and can activate approach movements toward the target. Motion cues from animat locomotion can elicit head-orienting movements to bring a new target into view. Approach and orienting movements are alternately performed during animat navigation. Cumulative estimates of each movement are derived from interacting proprioceptive and visual cues. Movement sequences are stored within a motor working memory. Sequences of visual categories are stored in a sensory working memory. These working memories trigger learning of sensory and motor sequence categories, or plans, which together control planned movements. Predictively effective chunk combinations are selectively enhanced via reinforcement learning when the animat is rewarded.
Selected planning chunks effect a gradual transition from variable reactive exploratory movements to efficient goal-oriented planned movement sequences. Volitional signals gate interactions between model subsystems and the release of overt behaviors. The model can control different motor sequences under different motivational states and learns more efficient sequences to rewarded goals as exploration proceeds. Riverside Research Institute; Defense Advanced Research Projects Agency (N00014-92-J-4015); Air Force Office of Scientific Research (F49620-92-J-0225); National Science Foundation (IRI 90-24877, SBE-0345378); Office of Naval Research (N00014-92-J-1309, N00014-91-J-4100, N00014-01-1-0624, N00014-01-1-0624); Pacific Sierra Research (PSR 91-6075-2)

    Spatial encoding in primate hippocampus during free navigation.

    Get PDF
    The hippocampus comprises two neural signals, place cells and θ oscillations, that contribute to facets of spatial navigation. Although their complementary relationship has been well established in rodents, their respective contributions in the primate brain during free navigation remain unclear. Here, we recorded neural activity in the hippocampus of freely moving marmosets as they naturally explored a spatial environment to investigate this issue more explicitly. We report place cells in marmoset hippocampus during free navigation that exhibit remarkable parallels to analogous neurons in other mammalian species. Although θ oscillations were prevalent in the marmoset hippocampus, the patterns of activity were notably different than in other taxa. This local field potential oscillation occurred in short bouts (approximately 0.4 s), rather than continuously, and was neither significantly modulated by locomotion nor consistently coupled to place-cell activity. These findings suggest that the relationship between place-cell activity and θ oscillations in primate hippocampus during free navigation differs substantially from that in rodents, and they paint an intriguing comparative picture regarding the neural basis of spatial navigation across mammals.

    Enkinaesthetic polyphony: the underpinning for first-order languaging

    Get PDF
    We contest two claims: (1) that language, understood as the processing of abstract symbolic forms, is an instrument of cognition and rational thought, and (2) that conventional notions of turn-taking, exchange structure, and move analysis are satisfactory as a basis for theorizing communication between living, feeling agents. We offer an enkinaesthetic theory describing the reciprocal affective neuro-muscular dynamical flows and tensions of co-agential dialogical sense-making relations. This “enkinaesthetic dialogue” is characterised by a preconceptual, experientially recursive temporal dynamics forming the deep extended melodies of relationships in time. An understanding of how those relationships work, when we understand and are ourselves understood, and when communication falters and conflict arises, will depend on a grasp of our enkinaesthetic intersubjectivity.

    A Neural Model of Visually Guided Steering, Obstacle Avoidance, and Route Selection

    Full text link
    A neural model is developed to explain how humans can approach a goal object on foot while steering around obstacles to avoid collisions in a cluttered environment. The model uses optic flow from a 3D virtual reality environment to determine the position of objects based on motion discontinuities, and computes heading direction, or the direction of self-motion, from global optic flow. The cortical representation of heading interacts with the representations of a goal and obstacles such that the goal acts as an attractor of heading, while obstacles act as repellers. In addition, the model maintains fixation on the goal object by generating smooth pursuit eye movements. Eye rotations can distort the optic flow field, complicating heading perception, and the model uses extraretinal signals to correct for this distortion and accurately represent heading. The model explains how motion processing mechanisms in cortical areas MT, MST, and posterior parietal cortex can be used to guide steering. The model quantitatively simulates human psychophysical data about visually-guided steering, obstacle avoidance, and route selection. Air Force Office of Scientific Research (F4960-01-1-0397); National Geospatial-Intelligence Agency (NMA201-01-1-2016); National Science Foundation (SBE-0354378); Office of Naval Research (N00014-01-1-0624)
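    The attractor/repeller account of heading can be illustrated as a one-dimensional dynamical system in which the heading angle is pulled toward the goal direction and pushed away from each obstacle direction. The sketch below is a minimal illustration of that idea, not the paper's actual neural model; the gains `k_g`, `k_o` and the angular decay constant `sigma` are assumed values chosen for demonstration only.

```python
import math

def heading_rate(phi, goal_dir, obstacle_dirs, k_g=1.0, k_o=1.5, sigma=0.5):
    """Rate of change of heading phi (radians).

    The goal direction acts as an attractor that pulls phi toward it;
    each obstacle acts as a repeller whose push fades with angular
    distance. All gains here are illustrative, not fitted values.
    """
    dphi = -k_g * (phi - goal_dir)  # attraction toward the goal
    for psi in obstacle_dirs:
        # repulsion away from each obstacle, decaying with angular offset
        dphi += k_o * (phi - psi) * math.exp(-abs(phi - psi) / sigma)
    return dphi

# Euler integration: heading starts at 0 rad, goal at +0.8 rad,
# one obstacle at +0.3 rad.
phi, dt = 0.0, 0.01
for _ in range(2000):
    phi += dt * heading_rate(phi, 0.8, [0.3])
```

    With no obstacles the heading settles exactly on the goal direction; adding an obstacle between the start and the goal shifts the equilibrium heading to the far side of the goal, which is the qualitative route-selection behavior such attractor/repeller dynamics are meant to capture.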

    Toward a self-organizing pre-symbolic neural model representing sensorimotor primitives

    Get PDF
    Copyright © 2014 Zhong, Cangelosi and Wermter. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. The acquisition of symbolic and linguistic representations of sensorimotor behavior is a cognitive process performed by an agent when it is executing and/or observing its own and others' actions. According to Piaget's theory of cognitive development, these representations develop during the sensorimotor stage and the pre-operational stage. We propose a model that relates the conceptualization of higher-level information from visual stimuli to the development of the ventral/dorsal visual streams. This model employs a neural network architecture incorporating a predictive sensory module based on an RNNPB (Recurrent Neural Network with Parametric Biases) and a horizontal product model. We exemplify this model through a robot passively observing an object to learn its features and movements. During the learning process of observing sensorimotor primitives, i.e., observing a set of trajectories of arm movements and the object features they are oriented toward, the pre-symbolic representation is self-organized in the parametric units. These representational units act as bifurcation parameters, guiding the robot to recognize and predict various learned sensorimotor primitives. The pre-symbolic representation also accounts for the learning of sensorimotor primitives in a latent learning context. Peer reviewed. Final published version.

    A topographic mechanism for arcing of dryland vegetation bands

    Full text link
    Banded patterns consisting of alternating bare soil and dense vegetation have been observed in water-limited ecosystems across the globe, often appearing along gently sloped terrain with the stripes aligned transverse to the elevation gradient. In many cases these vegetation bands are arced, with field observations suggesting a link between the orientation of arcing relative to the grade and the curvature of the underlying terrain. We modify the water transport in the Klausmeier model of water-biomass interactions, originally posed on a uniform hillslope, to qualitatively capture the influence of terrain curvature on the vegetation patterns. Numerical simulations of this modified model indicate that the vegetation bands change arcing direction from convex-downslope when growing on top of a ridge to convex-upslope when growing in a valley. This behavior is consistent with observations from remote sensing data that we present here. Model simulations further show that whether bands grow on ridges, valleys, or both depends on the precipitation level. A survey of three banded vegetation sites, each with a different aridity level, indicates qualitatively similar behavior. Comment: 26 pages, 13 figures, 2 tables
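    The Klausmeier model that the paper modifies couples water and biomass on a hillslope: water is supplied by rainfall, lost to evaporation and plant uptake, and advected downhill, while biomass grows with uptake, dies at a constant rate, and disperses diffusively. A minimal 1D sketch of the unmodified uniform-slope model (the paper's starting point, before the curvature modification) might look like the following; the parameter values and grid settings are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Nondimensional Klausmeier equations on a uniform 1D hillslope:
#   dw/dt = a - w - w*u**2 + v*dw/dx   (rainfall, evaporation, uptake, downhill flow)
#   du/dt = w*u**2 - m*u + d2u/dx2     (growth, mortality, seed dispersal)
def step(w, u, dx, dt, a=2.0, m=0.45, v=182.5):
    wx = (np.roll(w, -1) - w) / dx  # upwind difference for the advection term
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # centered second difference
    w_new = w + dt * (a - w - w * u**2 + v * wx)
    u_new = u + dt * (w * u**2 - m * u + uxx)
    return w_new, u_new

# Periodic domain, uniform initial water, slightly perturbed vegetation.
n, dx, dt = 256, 0.5, 1e-4
rng = np.random.default_rng(0)
w = np.full(n, 2.0)
u = 1.0 + 0.1 * rng.standard_normal(n)
for _ in range(5000):
    w, u = step(w, u, dx, dt)
```

    The paper's contribution enters through the water-transport term: replacing the uniform downhill advection with transport shaped by terrain curvature is what allows band arcing to flip between convex-downslope on ridges and convex-upslope in valleys.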

    Neural Dynamics of Autistic Behaviors: Cognitive, Emotional, and Timing Substrates

    Full text link
    What brain mechanisms underlie autism, and how do they give rise to autistic behavioral symptoms? This article describes a neural model, called the iSTART model, which proposes how cognitive, emotional, timing, and motor processes may interact to create and perpetuate autistic symptoms. These model processes were originally developed to explain data concerning how the brain controls normal behaviors. The iSTART model shows how autistic behavioral symptoms may arise from prescribed breakdowns in these brain processes. Air Force Office of Scientific Research (F49620-01-1-0397); Office of Naval Research (N00014-01-1-0624)