A nonparametric Bayesian approach toward robot learning by demonstration
In the past years, many authors have considered the application of machine learning methodologies to effect robot learning by demonstration. Gaussian mixture regression (GMR) is one of the most successful methodologies used for this purpose. A major limitation of GMR models concerns automatic selection of the proper number of model states, i.e., the number of model component densities. Existing methods, including likelihood- or entropy-based criteria, usually tend to yield noisy model size estimates while imposing heavy computational requirements. Recently, Dirichlet process (infinite) mixture models have emerged as a cornerstone of nonparametric Bayesian statistics and as promising candidates for clustering applications where the number of clusters is unknown a priori. Motivated by this, to resolve the aforementioned issues of GMR-based methods for robot learning by demonstration, in this paper we introduce a nonparametric Bayesian formulation of the GMR model: the Dirichlet process GMR model. We derive an efficient variational Bayesian inference algorithm for the proposed model, and we experimentally investigate its efficacy as a robot learning by demonstration methodology, considering a number of demanding robot learning by demonstration scenarios.
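The paper's variational inference algorithm is bespoke, but the overall idea can be sketched with scikit-learn's BayesianGaussianMixture, whose "dirichlet_process" weight prior is a truncated DP mixture that prunes unneeded components; the conditioning step below is standard GMR, not the authors' exact method. The sine-wave data and all parameter values are illustrative assumptions.

```python
# Sketch only: DP mixture fit + standard GMR conditioning, not the
# paper's algorithm. Toy data: y = sin(x) + noise.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, 500)
y = np.sin(x) + 0.05 * rng.standard_normal(500)
data = np.column_stack([x, y])          # joint (input, output) samples

# The DP prior lets the model shrink unused component weights toward
# zero, so the truncation level need not equal the final model size.
gmm = BayesianGaussianMixture(
    n_components=20,                    # truncation level, not final size
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(data)

def gmr_predict(gmm, x_query):
    """Condition the joint mixture on x to regress y (standard GMR)."""
    preds = []
    for xq in np.atleast_1d(x_query):
        h, means = [], []
        for pi_k, mu, cov in zip(gmm.weights_, gmm.means_, gmm.covariances_):
            sxx, sxy = cov[0, 0], cov[0, 1]
            # responsibility of this component for the query input
            h.append(pi_k * np.exp(-0.5 * (xq - mu[0]) ** 2 / sxx)
                     / np.sqrt(2 * np.pi * sxx))
            # conditional mean of y given x under this component
            means.append(mu[1] + sxy / sxx * (xq - mu[0]))
        h = np.asarray(h) / np.sum(h)
        preds.append(float(h @ np.asarray(means)))
    return np.asarray(preds)

xt = np.linspace(0.5, 5.5, 50)
err = np.max(np.abs(gmr_predict(gmm, xt) - np.sin(xt)))
```

In a learning-by-demonstration setting, the joint samples would be (time or state, action) pairs from demonstrations rather than this toy sine curve.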
Interaction primitives for human-robot cooperation tasks
To engage in cooperative activities with human partners, robots have to possess basic interactive abilities and skills. However, programming such interactive skills is a challenging task, as each interaction partner can have different timing or an alternative way of executing movements. In this paper, we propose to learn interaction skills by observing how two humans engage in a similar task. To this end, we introduce a new representation called Interaction Primitives. Interaction Primitives build on the framework of dynamic motor primitives (DMPs) by maintaining a distribution over the parameters of the DMP. With this distribution, we can learn the inherent correlations of cooperative activities, which allow us to infer the behavior of the partner and to participate in the cooperation. We provide algorithms for synchronizing and adapting the behavior of humans and robots during joint physical activities.
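The core inference step described above can be sketched as Gaussian conditioning: keep a joint Gaussian over the stacked DMP parameters of both partners, then condition on the observed human parameters to predict the robot's. The toy parameter dimensionality, coupling, and noise level below are illustrative assumptions, not values from the paper.

```python
# Sketch, assuming the joint parameter distribution is Gaussian.
# Toy demonstrations: robot params w_r = 2 * human params w_h + noise.
import numpy as np

rng = np.random.default_rng(1)
w_h = rng.standard_normal((200, 3))                  # human DMP params
w_r = 2.0 * w_h + 0.01 * rng.standard_normal((200, 3))  # robot DMP params
W = np.hstack([w_h, w_r])               # one row per joint demonstration

mu = W.mean(axis=0)
Sigma = np.cov(W, rowvar=False)

def condition(mu, Sigma, obs, d):
    """p(w_r | w_h = obs) for a joint Gaussian split at index d."""
    mu_h, mu_r = mu[:d], mu[d:]
    S_hh, S_rh = Sigma[:d, :d], Sigma[d:, :d]
    gain = S_rh @ np.linalg.inv(S_hh)
    mean_r = mu_r + gain @ (obs - mu_h)   # conditional mean
    cov_r = Sigma[d:, d:] - gain @ S_rh.T # conditional covariance
    return mean_r, cov_r

obs = np.array([0.5, -1.0, 0.2])        # observed human DMP parameters
mean_r, cov_r = condition(mu, Sigma, obs, d=3)
```

Because the toy coupling is linear, the conditional mean recovers roughly twice the observed parameters; in the actual framework the distribution is estimated from human-human demonstrations and the observed partner parameters come from partially observed trajectories.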
Symbol Emergence in Robotics: A Survey
Humans can learn the use of language through physical interaction with their environment and semiotic communication with other people. It is very important to obtain a computational understanding of how humans can form a symbol system and obtain semiotic skills through their autonomous mental development. Recently, many studies have been conducted on the construction of robotic systems and machine-learning methods that can learn the use of language through embodied multimodal interaction with their environment and other systems. To understand human social interactions and to develop a robot that can smoothly communicate with human users in the long term, it is crucially important to understand the dynamics of symbol systems. The embodied cognition and social interaction of participants gradually change a symbol system in a constructive manner. In this paper, we introduce a field of research called symbol emergence in robotics (SER). SER is a constructive approach towards an emergent symbol system, which is socially self-organized through both semiotic communication and physical interaction with autonomous cognitive developmental agents, i.e., humans and developmental robots. Specifically, we describe some state-of-the-art research topics concerning SER, e.g., multimodal categorization, word discovery, and double articulation analysis, which enable a robot to obtain words and their embodied meanings from raw sensory-motor information, including visual information, haptic information, auditory information, and acoustic speech signals, in a totally unsupervised manner. Finally, we suggest future directions of research in SER.
Comment: submitted to Advanced Robotics
Fast, Robust, and Versatile Event Detection through HMM Belief State Gradient Measures
Event detection is a critical feature in data-driven systems, as it assists with the identification of nominal and anomalous behavior. Event detection is increasingly relevant in robotics as robots operate with greater autonomy in increasingly unstructured environments. In this work, we present an accurate, robust, fast, and versatile measure for skill and anomaly identification. A theoretical proof establishes the link between the derivative of the log-likelihood of the HMM filtered belief state and the latest emission probabilities. The key insight is the inverse relationship in which gradient analysis is used for skill and anomaly identification. Our measure showed better performance across all metrics than related state-of-the-art works. The result is broadly applicable to domains that use HMMs for event detection.
Comment: 8 pages, 7 figures, double column, IEEE conference format
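The intuition behind the gradient measure can be sketched as follows: in HMM forward filtering, the per-step change in the cumulative log-likelihood equals log p(o_t | o_1:t-1), a belief-weighted mixture of the latest emission probabilities, so a sharp drop in this increment flags an anomalous observation. The toy two-state HMM below is an illustrative assumption, not the paper's model or exact measure.

```python
# Sketch: per-step log-likelihood increments from HMM forward filtering.
# A rare symbol produces a steep drop, which gradient analysis detects.
import numpy as np

A = np.array([[0.9, 0.1],
              [0.1, 0.9]])              # state transition matrix
B = np.array([[0.8, 0.15, 0.05],        # emission probs, state 0
              [0.1, 0.85, 0.05]])       # emission probs, state 1
pi = np.array([0.5, 0.5])               # initial state distribution

def loglik_increments(obs):
    """Forward filter; return log p(o_t | o_1:t-1) for each step."""
    incs, belief = [], pi
    for o in obs:
        pred = belief @ A if incs else belief   # predicted state dist.
        joint = pred * B[:, o]
        c = joint.sum()                         # = p(o_t | past obs)
        incs.append(np.log(c))
        belief = joint / c                      # filtered belief state
    return np.array(incs)

# Mostly nominal symbols (0/1) with one rare symbol (2) injected.
obs = [0, 0, 0, 1, 1, 2, 1, 1]
g = loglik_increments(obs)
anomaly_at = int(np.argmin(g))          # step with the steepest drop
```

Here the rare symbol at step 5 has low emission probability in every state, so the log-likelihood increment dips sharply at that step regardless of the current belief, which is the behavior the gradient measure exploits.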