6 research outputs found

    Autonomous Sequence Generation for a Neural Dynamic Robot: Scene Perception, Serial Order, and Object-Oriented Movement

    Neurally inspired robotics already has a long history that includes reactive systems emulating reflexes, neural oscillators to generate movement patterns, and neural networks as trainable filters for high-dimensional sensory information. Neural inspiration has been less successful at the level of cognition. Decision-making, planning, building and using memories, for instance, are more often addressed in terms of computational algorithms than through neural process models. To move neural process models beyond reactive behavior toward cognition, the capacity to autonomously generate sequences of processing steps is critical. We review a potential solution to this problem that is based on strongly recurrent neural networks described as neural dynamic systems. Their stable states perform elementary motor or cognitive functions while coupled to sensory inputs. The state of the neural dynamics transitions to a new motor or cognitive function when a previously stable neural state becomes unstable. Only when a neural robotic system is capable of acting autonomously does it become useful to a human user. We demonstrate how a neural dynamic architecture that supports autonomous sequence generation can engage in such interaction. A human user presents colored objects to the robot in a particular order, thus defining a serial order of color concepts. The user then exposes the system to a visual scene that contains the colored objects in a new spatial arrangement. The robot autonomously builds a scene representation by sequentially bringing objects into the attentional foreground. Scene memory updates if the scene changes. The robot performs visual search and then reaches for the objects in the instructed serial order. In doing so, the robot generalizes across time and space, is capable of waiting when an element is missing, and updates its action plans online when the scene changes. The entire flow of behavior emerges from a time-continuous neural dynamics without any controlling or supervisory algorithm.
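    The sequencing mechanism the abstract describes — a stable neural state that performs a function until it is rendered unstable, at which point the dynamics transitions to the next state — can be illustrated with a minimal sketch. The two-node setup, the "condition of satisfaction" signal, and all parameter values below are illustrative assumptions, not the paper's architecture:

    ```python
    import math

    def sigmoid(u, beta=4.0):
        """Soft activation threshold mapping to (0, 1)."""
        return 1.0 / (1.0 + math.exp(-beta * u))

    # Hypothetical two-node sketch: node A (current step) holds itself active
    # via self-excitation and suppresses node B (next step). An external
    # "condition of satisfaction" signal destabilizes A's active state; B is
    # then released -- the transition emerges from the dynamics alone.
    tau, h = 10.0, -2.0           # time constant, negative resting level
    w_self, w_inhib = 6.0, 6.0    # self-excitation, A-to-B inhibition
    boost_b = 2.5                 # task input preactivating B below threshold
    u_a, u_b = 1.0, h             # A starts near its active attractor
    dt = 1.0
    trace = []
    for t in range(400):
        cos = 4.0 if t >= 150 else 0.0   # condition-of-satisfaction input
        du_a = (-u_a + h + w_self * sigmoid(u_a) - cos) / tau
        du_b = (-u_b + h + boost_b + w_self * sigmoid(u_b)
                - w_inhib * sigmoid(u_a)) / tau
        u_a += dt * du_a
        u_b += dt * du_b
        trace.append((sigmoid(u_a) > 0.5, sigmoid(u_b) > 0.5))

    print(trace[0], trace[-1])   # A active first, B active after the switch
    ```

    Before the destabilizing input arrives, node A sits in a self-sustained attractor and B is held down; once the input removes A's active fixed point, A decays, the inhibition lifts, and B's own self-excitation carries it into the active state — no supervisory step counter selects the transition.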

    A neural process model of intentionality implemented on an autonomous robot

    No full text
    Intentionality, the mind's capacity to be directed at objects, their properties, or events, underlies almost all cognitive processes of everyday life. This thesis develops a neural process model of intentional states based on Dynamic Field Theory, a mathematically formalized class of models at the level of neural populations. The grounding of intentional states is demonstrated in a dynamic field architecture that controls the behavior of a simulated robotic agent. The agent performs a multi-step task designed specifically to require several archetypal intentional states at different levels of abstraction: perception, memory, and belief, as well as action, action intentions, and desires. The presented field architecture autonomously generates sequences of intentional states that ultimately lead the agent to reach a desired goal.

    Autonomous sequence generation for a neural dynamic robot

    No full text
    Neurally inspired robotics already has a long history that includes reactive systems emulating reflexes, neural oscillators to generate movement patterns, and neural networks as trainable filters for high-dimensional sensory information. Neural inspiration has been less successful at the level of cognition. Decision-making, planning, building and using memories, for instance, are more often addressed in terms of computational algorithms than through neural process models. To move neural process models beyond reactive behavior toward cognition, the capacity to autonomously generate sequences of processing steps is critical. We review a potential solution to this problem that is based on strongly recurrent neural networks described as neural dynamic systems. Their stable states perform elementary motor or cognitive functions while coupled to sensory inputs. The state of the neural dynamics transitions to a new motor or cognitive function when a previously stable neural state becomes unstable. Only when a neural robotic system is capable of acting autonomously does it become useful to a human user. We demonstrate how a neural dynamic architecture that supports autonomous sequence generation can engage in such interaction. A human user presents colored objects to the robot in a particular order, thus defining a serial order of color concepts. The user then exposes the system to a visual scene that contains the colored objects in a new spatial arrangement. The robot autonomously builds a scene representation by sequentially bringing objects into the attentional foreground. Scene memory updates if the scene changes. The robot performs visual search and then reaches for the objects in the instructed serial order. In doing so, the robot generalizes across time and space, is capable of waiting when an element is missing, and updates its action plans online when the scene changes. The entire flow of behavior emerges from a time-continuous neural dynamics without any controlling or supervisory algorithm.