Embodied cognition through cultural interaction
In this short paper we describe a robotic setup for studying the self-organization of conceptualisation and language. What distinguishes this project from others is that we envision a robot with specific cognitive capacities, but without resorting to any pre-programmed representations or conceptualisations. The key to this is self-organization and enculturation. We report preliminary results on learning motor behaviours through imitation, and sketch how language plays a pivotal role in constructing world representations.
Disciplining the body? Reflections on the cross-disciplinary import of "embodied meaning" into interaction design
The aim of this paper is above all to critically examine and clarify some of the negative implications that the idea of "embodied meaning" has for the emergent field of interaction design research.
The term "embodied meaning" was originally brought into HCI research from phenomenology and cognitive semantics in order to better understand how users' experience of new technological systems relies to an increasing extent on full-body interaction. Embodied approaches to technology design can thus be found in Winograd & Flores (1986), Dourish (2001), Lund (2003), Klemmer, Hartmann & Takayama (2006), Hornecker & Buur (2006), and Hurtienne & Israel (2007), among others.
However, fertile as this cross-disciplinary import may be, design research can generally be criticised for being "undisciplined", because of its tendency merely to take over reductionist ideas of embodied meaning from those neighbouring disciplines without questioning the inherent limitations it thereby subscribes to.
In this paper I focus on this reductionism and what it means for interaction design research. I start out by introducing the field of interaction design and two central research questions that it raises. This serves as a prerequisite for understanding the overall intention of bringing the notion of "embodied meaning" from cognitive semantics into design research. Narrowing my account down to the concepts of "image schemas" and their "metaphorical extension", I then explain in more detail what is reductionistic about the notion of embodied meaning. Having done so, I shed light on the consequences this reductionism might have for design research by examining a recently developed framework for intuitive user interaction along with two case examples. In so doing I sketch an alternative view of embodied meaning for interaction design research.
Keywords: Interaction Design, Embodied Meaning, Tangible User Interaction, Design Theory, Cognitive Semiotics
Do Embodied Conversational Agents Know When to Smile?
We survey the role of humor in particular domains of human-to-human interaction with the aim of seeing whether it is useful for embodied conversational agents to integrate humor capabilities in their models of intelligence, emotions and interaction (verbal and nonverbal). Therefore we first look at the current state of the art of research in embodied conversational agents, affective computing, and verbal and nonverbal interaction. We adhere to the 'Computers Are Social Actors' paradigm to assume that human conversational partners of embodied conversational agents assign human properties to these agents, including humor appreciation.
Empathic agents to reduce user frustration: The effects of varying agent characteristics
There is now growing interest in the development of computer systems which respond to users' emotion and affect. We report three small-scale studies (with a total of 42 participants) which investigate the extent to which affective agents, using strategies derived from human-human interaction, can reduce user frustration within human-computer interaction. The results confirm the previous findings of Klein et al. (2002) that such interventions can be effective. We also obtained results suggesting that embodied agents can be more effective at reducing frustration than non-embodied agents, and that female embodied agents may be more effective than male embodied agents. These results are discussed in light of the existing research literature.
Embodied Musical Interaction
Music is a natural partner to human-computer interaction, offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument in expressive musical performance can provide valuable insight to human-computer interaction (HCI) researchers interested in applying these forms of deep interaction to other fields. Despite the longstanding connection between music and HCI, it is not an automatic one, and its history arguably points to as many differences as it does overlaps. Music research and HCI research both encompass broad issues, and utilize a wide range of methods. In this chapter I discuss how the concept of embodied interaction can be one way to think about music interaction. I propose how the three "paradigms" of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI. I use this conceptual framework to discuss three different musical projects: Haptic Wave, Form Follows Sound, and BioMuse.
The Forum Selection Defense
The central goal of this thesis is creating and testing technology to produce embodied interaction experiences. Embodied interaction is the sense that we inhabit a digital space with our minds operating on it as if it were our physical bodies, without conscious thought, but as naturally as reaching out with your fingers and touching the object in front of you. Traditional interaction techniques such as keyboard and mouse get in the way of achieving embodiment. In this thesis, we have created an embodied perspective of virtual three-dimensional objects floating in front of a user. Users can see the object from a first-person perspective without a heads-up display and can change the perspective of the object by shifting their point of view. The technology and affordances to make this possible in an unobtrusive, practical and efficient way is the subject of this thesis.
Using a depth sensor, Microsoft's Kinect [7], we track the user's position in front of a screen in real time, thus making it possible to change the perspectives seen by each of the user's eyes to fit their real point of view, in order to achieve a 3D embodied interaction outside the screen. We combined the first-person perspective into an embodied sculpting project that includes a wireless haptic glove to allow the user to feel when touching the model, and a small one-hand remote controller used to rotate the object around as the user desires when pressing its single button. We have achieved what we call Embodied Perspective, which involves an outside-screen stereoscopic visualization that reacts to body interaction as if the visualization was really where the user perceives it, thanks to the data from the depth sensor.
This method does not block the user's view of their own body, but fits and matches their brain's perception. When applied to virtual sculpting (embodied sculpting), it gives the user the ability to feel and understand their actions much better: where they are touching/sculpting and how they should move to reach where they want, since the movements are the same one would perform with their body in a real-world sculpting situation. A further study of the viability of this method, not only for single-person interaction but for group visualization of a single user perspective, is discussed and proposed.
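The core rendering step such a head-coupled system relies on is a standard one: given the tracked eye position, build an asymmetric (off-axis) view frustum so the screen behaves like a window onto the scene. The sketch below illustrates that idea only; the function name, the screen-centred coordinate convention, and all parameters are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def off_axis_frustum(eye, screen_w, screen_h, near, far):
    """Asymmetric OpenGL-style projection matrix for a screen centred at
    the origin in the z=0 plane, viewed from `eye` (metres, +z toward
    the viewer). Standard head-coupled perspective construction."""
    ex, ey, ez = eye
    # Project the screen edges onto the near plane as seen from the eye.
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez
    # Assemble the glFrustum-style matrix from those bounds.
    m = np.zeros((4, 4))
    m[0, 0] = 2 * near / (right - left)
    m[0, 2] = (right + left) / (right - left)
    m[1, 1] = 2 * near / (top - bottom)
    m[1, 2] = (top + bottom) / (top - bottom)
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2 * far * near / (far - near)
    m[3, 2] = -1.0
    return m
```

For the stereoscopic case the abstract mentions, one would call this twice per frame, offsetting the tracked head position by half the interocular distance along the viewer's horizontal axis for each eye; an eye centred on the screen yields a symmetric frustum, and any lateral head movement skews it accordingly.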
Grounding Dynamic Spatial Relations for Embodied (Robot) Interaction
This paper presents a computational model of the processing of dynamic spatial relations occurring in an embodied robotic interaction setup. A complete system is introduced that allows autonomous robots to produce and interpret dynamic spatial phrases (in English) given an environment of moving objects. The model unites two separate research strands: computational cognitive semantics and commonsense spatial representation and reasoning. The model for the first time demonstrates an integration of these different strands.
Comment: in Pham, D.-N. and Park, S.-B., editors, PRICAI 2014: Trends in Artificial Intelligence, volume 8862 of Lecture Notes in Computer Science, pages 958-971. Springer.
Conversational Agents, Humorous Act Construction, and Social Intelligence
Humans use humour to ease communication problems in human-human interaction, and in a similar way humour can be used to solve communication problems that arise with human-computer interaction. We discuss the role of embodied conversational agents in human-computer interaction and offer observations on the generation of humorous acts and on the appropriateness of having embodied conversational agents display them in order to smoothen, when necessary, their interactions with a human partner. The humorous acts we consider are generated spontaneously. They are the product of an appraisal of the conversational situation and the possibility to generate a humorous act from the elements that make up this conversational situation, in particular the interaction history of the conversational partners.
Windsurfing: an extreme form of material and embodied interaction?
This paper makes reference to the development of water-based board sports in the world of adventure or action games. With a specific focus on windsurfing, we use Parlebas' (1999) and Warnier's (2001) theoretical interests in the praxeology of physical learning, as well as Mauss' (1935) work on techniques of the body. We also consider the implications of Csikszentmihalyi's notion of flow (1975). We argue that windsurfing equipment should not merely be seen as protection but rather as status objects through which extreme lifestyles are embodied and embodying.