Bio-inspired Gait Imitation of Hexapod Robot Using Event-Based Vision Sensor and Spiking Neural Network
Learning how to walk is a sophisticated neurological task for most animals.
In order to walk, the brain must synthesize multiple cortices, neural circuits,
and diverse sensory inputs. Some animals, such as humans, imitate surrounding
individuals to speed up their learning. When humans watch their peers, the
visual data is processed through the visual cortex of the brain. This complex
imitation-based learning forms associations between visual data and muscle
actuation through Central Pattern Generation (CPG). Reproducing this imitation
phenomenon on low power, energy-constrained robots that are learning to walk
remains challenging and unexplored. We propose a bio-inspired feed-forward
approach based on neuromorphic computing and event-based vision to address the
gait imitation problem. The proposed method trains a "student" hexapod to walk
by watching an "expert" hexapod moving its legs. The student processes the flow
of Dynamic Vision Sensor (DVS) data with a one-layer Spiking Neural Network
(SNN). The SNN of the student successfully imitates the expert within a small
convergence time of ten iterations and exhibits energy efficiency at the
sub-microjoule level.

Comment: 7 pages, 9 figures, to be published in the proceedings of IEEE WCCI/IJCNN