Semiotics and Human-Robot Interaction
Keywords: Semi-autonomous robot, human-robot interaction, semiotics. Abstract: This paper describes a robot control architecture based on a human-robot interaction model derived directly from semiotics concepts. The architecture is composed of a set of objects defined after a semiotic sign model. Simulation experiments using unicycle robots illustrate the interactions within a team of robots equipped with skills similar to those used in human-robot interactions.
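The unicycle robots mentioned above follow the standard unicycle kinematic model, with forward speed and turn rate as controls. A minimal simulation step might look like the following sketch (the function and variable names are illustrative, not the paper's code):

```python
import math

def unicycle_step(x, y, theta, v, omega, dt):
    """Advance a unicycle-model robot by one time step of length dt.

    State: position (x, y) and heading theta (radians).
    Controls: linear speed v and angular speed omega.
    """
    x += v * math.cos(theta) * dt   # move along current heading
    y += v * math.sin(theta) * dt
    theta += omega * dt             # rotate in place
    return x, y, theta
```

Iterating this step for each robot in a team, with controls chosen by the interaction model, yields the kind of simulation the abstract describes.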
Symbol Emergence in Robotics: A Survey
Humans can learn the use of language through physical interaction with their
environment and semiotic communication with other people. It is very important
to obtain a computational understanding of how humans can form a symbol system
and obtain semiotic skills through their autonomous mental development.
Recently, many studies have been conducted on the construction of robotic
systems and machine-learning methods that can learn the use of language through
embodied multimodal interaction with their environment and other systems.
Understanding human social interactions, and developing a robot that can
smoothly communicate with human users over the long term, requires an
understanding of the dynamics of symbol systems. The
embodied cognition and social interaction of participants gradually change a
symbol system in a constructive manner. In this paper, we introduce a field of
research called symbol emergence in robotics (SER). SER is a constructive
approach towards an emergent symbol system. The emergent symbol system is
socially self-organized through both semiotic communications and physical
interactions with autonomous cognitive developmental agents, i.e., humans and
developmental robots. Specifically, we describe some state-of-the-art research
topics concerning SER, e.g., multimodal categorization, word discovery, and
double articulation analysis, which enable a robot to obtain words and their
embodied meanings from raw sensory-motor information, including visual
information, haptic information, auditory information, and acoustic speech
signals, in a totally unsupervised manner. Finally, we suggest future
directions of research in SER. (Comment: submitted to Advanced Robotics.)
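Multimodal categorization of the kind surveyed here is often realized as unsupervised clustering over per-object feature vectors that concatenate several modalities (e.g. visual, haptic, auditory features). A minimal k-means sketch of that idea (the feature layout, category count, and function names are illustrative assumptions, not any surveyed system's implementation):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster feature vectors (tuples of floats) into k categories,
    entirely unsupervised. Returns the k centroid vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centroid (squared Euclidean)
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            groups[j].append(p)
        # recompute each centroid as the mean of its group
        for j, g in enumerate(groups):
            if g:
                centroids[j] = tuple(sum(xs) / len(g) for xs in zip(*g))
    return centroids
```

The surveyed work typically uses richer probabilistic models (e.g. multimodal latent Dirichlet allocation) rather than plain k-means, but the unsupervised grouping of multimodal observations into categories is the shared core.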
Comics, robots, fashion and programming: outlining the concept of actDresses
This paper concerns the design of physical languages for controlling and programming robotic consumer products. For this purpose we explore basic theories of semiotics represented in the two separate fields of comics and
fashion, and how these could be used as resources in the development of new physical languages. Based on these theories, the design concept of actDresses is defined, and supplemented by three example scenarios of how the concept can be used for controlling, programming, and
predicting the behaviour of robotic systems.
Observing Environments
> Context • Society is faced with “wicked” problems of environmental sustainability, which are inherently multiperspectival, and there is a need for explicitly constructivist and perspectivist theories to address them.
> Problem • However, different constructivist theories construe the environment in different ways. The aim of this paper is to clarify the conceptions of environment in constructivist approaches, and thereby to assist the sciences of complex systems and complex environmental problems.
> Method • We describe the terms used for “the environment” in von Uexküll, Maturana & Varela, and Luhmann, and analyse how their conceptions of environment are connected to differences of perspective and observation.
> Results • We show the need to distinguish between inside and outside perspectives on the environment, and identify two very different and complementary logics of observation, the logic of distinction and the logic of representation, in the three constructivist theories.
> Implications • Luhmann’s theory of social systems can be a helpful perspective on the wicked environmental problems of society if we consider carefully the theory’s own blind spots: that it confines itself to systems of communication, and that it is based fully on the conception of observation as indication by means of distinction.
Disciplining the body? Reflections on the cross disciplinary import of ‘embodied meaning’ into interaction design
The aim of this paper is, above all, to critically examine and clarify some of the negative implications that the idea of ‘embodied meaning’ has for the emergent field of interaction design research.
The term ‘embodied meaning’ was originally brought into HCI research from phenomenology and cognitive semantics in order to better understand how users’ experience of new technological systems relies to an increasing extent on full-body interaction. Embodied approaches to technology design can thus be found in Winograd & Flores (1986), Dourish (2001), Lund (2003), Klemmer, Hartmann & Takayama (2006), Hornecker & Buur (2006), and Hurtienne & Israel (2007), among others.
However, fertile as this cross-disciplinary import may be, design research can generally be criticised for being ‘undisciplined’ because of its tendency merely to take over reductionist ideas of embodied meaning from neighbouring disciplines without questioning the inherent limitations it thereby subscribes to.
In this paper I focus on this reductionism and what it means for interaction design research. I start out by introducing the field of interaction design and two central research questions that it raises. This will serve as a prerequisite for understanding the overall intention of bringing the notion of ‘embodied meaning’ from cognitive semantics into design research. Narrowing my account down to the concepts of ‘image schemas’ and their ‘metaphorical extension’, I then explain in more detail what is reductionistic about the notion of embodied meaning. Having done so, I shed light on the consequences this reductionism might have for design research by examining a recently developed framework for intuitive user interaction along with two case examples. In so doing I sketch an alternative view of embodied meaning for interaction design research.
Keywords:
Interaction Design, Embodied Meaning, Tangible User Interaction, Design Theory, Cognitive Semiotics
Interaction and Experience in Enactive Intelligence and Humanoid Robotics
We give an overview of how sensorimotor experience can be operationalized for interaction scenarios in which humanoid robots acquire skills and linguistic behaviours by enacting a “form-of-life” in interaction games (following Wittgenstein) with humans. The enactive paradigm is introduced, which provides a powerful framework for the construction of complex adaptive systems based on interaction, habit, and experience. Enactive cognitive architectures (following insights of Varela, Thompson and Rosch) that we have developed support social learning and robot ontogeny by harnessing information-theoretic methods and raw uninterpreted sensorimotor experience to scaffold the acquisition of behaviours. The success criterion here is validation by the robot engaging in ongoing human-robot interaction with naive participants who, over the course of iterated interactions, shape the robot’s behavioural and linguistic development. Engagement in such interaction, exhibiting aspects of purposeful, habitual recurring structure, evidences the developed capability of the humanoid to enact language and interaction games as a successful participant.
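The information-theoretic methods mentioned here typically involve quantities such as mutual information computed over discretized sensorimotor streams, to detect structure between sensor and motor channels. A minimal plug-in estimator sketch (the discrete-symbol representation and function name are assumptions for illustration, not the authors' implementation):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate mutual information I(X;Y) in bits from paired
    discrete samples xs[i], ys[i] using empirical frequencies."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # log2( p(x,y) / (p(x) p(y)) ), expressed via raw counts
        mi += p_joint * math.log2(c * n / (px[x] * py[y]))
    return mi
```

Perfectly coupled channels yield high mutual information, independent ones yield roughly zero; such scores can guide which sensorimotor variables are worth binding into a learned behaviour.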