Integration of emotion expression and visual tracking locomotion based on vestibulo-ocular reflex
Authors: K Endo, N Endo, K Hashimoto, F Iida, T Kojima, A Takanishi
Publication date: 13 December 2010
Publisher: IEEE
Abstract
Personal robots, which are anticipated to become widespread in the future, will be required to take an active part in joint work and community life with humans. Such robots must recognize a changing environment and respond with appropriate actions, as humans do. Visual tracking is a fundamental function in this respect, since it combines environmental sensing with a reflexive reaction to it. The authors developed a visual tracking motion algorithm using the upper body, integrated it with an online walking pattern generator to realize visual tracking during biped locomotion, and finally conducted an experimental evaluation combined with emotion expression. © 2010 IEEE
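The vestibulo-ocular reflex (VOR) named in the title counter-rotates the eyes against measured head motion so that gaze stays fixed on a target while the body moves, with a slower visual pursuit loop removing any residual bearing error. The paper's actual control law is not reproduced on this page; the following Python sketch illustrates only that general idea, and every function name, gain, and signal in it is hypothetical.

```python
import numpy as np

def vor_term(head_yaw_rate, gain=1.0):
    """Vestibulo-ocular reflex: drive the eye at the negative of the
    measured head angular velocity so gaze stays fixed in space."""
    return -gain * head_yaw_rate

def pursuit_term(retinal_error, kp=2.0):
    """Visual feedback: rotate the eye toward the target's offset
    from the image center (the retinal error)."""
    return kp * retinal_error

def step_eye(eye_angle, retinal_error, head_yaw_rate, dt=0.01):
    """One control cycle: reflexive stabilization plus tracking correction."""
    eye_velocity = vor_term(head_yaw_rate) + pursuit_term(retinal_error)
    return eye_angle + eye_velocity * dt

# Toy simulation: the head sways as the robot walks, while the target
# sits at a fixed world bearing of 0.2 rad.
dt = 0.01
t = np.arange(0.0, 2.0, dt)
head_angle = 0.05 * np.sin(2 * np.pi * 1.5 * t)   # head sway from the gait
head_rate = np.gradient(head_angle, dt)           # gyro-like measurement

eye = 0.0
for k in range(len(t)):
    retinal_error = 0.2 - (head_angle[k] + eye)   # target offset as seen by the camera
    eye = step_eye(eye, retinal_error, head_rate[k], dt)

print(f"final gaze error: {0.2 - (head_angle[-1] + eye):+.4f} rad")
```

With the VOR gain near 1, the gait-induced head sway is cancelled reflexively, so the visual pursuit loop only has to correct the slowly varying bearing error; this division of labor is the usual motivation for VOR-based gaze stabilization on humanoid robots.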
Provided by: CUED - Cambridge University Engineering Department
OAI identifier: oai:generic.eprints.org:992247...
Last updated: 15 July 2020