15 research outputs found

    Emotional design and human-robot interaction

    Recent years have seen emotions gain importance in the design field, giving rise to Emotional Design. In this sense, emotional design aims to elicit (e.g., pleasure) or prevent (e.g., displeasure) particular emotions during human-product interaction. That is, emotional design regulates the emotional interaction between the individual and the product (e.g., a robot). Robot design has been a growing area in which robots interact directly with humans and emotions are essential to the interaction. Therefore, this paper aims, through a non-systematic literature review, to explore the application of emotional design, particularly in Human-Robot Interaction. Robot design features (e.g., appearance, expressing emotions, and spatial distance) that affect emotional design are introduced. The chapter ends with a discussion and a conclusion.

    Imitating human motion using humanoid upper body models

    This thesis investigates human motion imitation using five different humanoid upper bodies (comprising the torso and upper limbs), with human dance motion as a case study. The humanoid models are based on five existing humanoids, namely ARMAR, HRP-2, SURALP, WABIAN-2, and WE-4RII, chosen for their different structures and ranges of joint motion.
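A common first step in imitating human motion on humanoids with differing structures is to retarget the captured joint angles onto each robot, clamping them to that robot's joint limits and dropping joints the robot lacks. The sketch below illustrates this idea only; it is not code from the thesis, and the joint names and limit values are invented placeholders.

```python
def retarget(human_angles, joint_limits):
    """Clamp human joint angles (radians) to a humanoid's joint limits.

    human_angles: dict mapping joint name -> captured angle
    joint_limits: dict mapping joint name -> (lower, upper) bounds
    Joints the humanoid does not have are dropped from the result.
    """
    return {
        joint: min(max(angle, joint_limits[joint][0]), joint_limits[joint][1])
        for joint, angle in human_angles.items()
        if joint in joint_limits
    }

# Illustrative (invented) limits for a shoulder/elbow chain.
limits = {"shoulder_pitch": (-3.14, 3.14), "elbow": (0.0, 2.5)}
pose = {"shoulder_pitch": 1.0, "elbow": 2.9, "wrist_yaw": 0.3}
clamped = retarget(pose, limits)
# The elbow angle is clamped to 2.5; "wrist_yaw" is dropped.
```

Each of the five humanoid models would supply its own `joint_limits` table, so the same captured dance motion maps differently onto each robot.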

    A New Miniaturised Multi-Axis Force/Torque Sensor Based on Optoelectronic Technology and a Simply-Supported Beam

    This paper presents a methodology for the development of a multi-axis force/torque sensor based on optoelectronic technology. The advantages of this sensing principle are low manufacturing cost, simple fabrication, and immunity to electrical noise. The force/torque sensor makes use of six optical sensors: each measures the displacement of a reflective surface that moves integrally with a simply-supported beam. The proposed design allows a variety of shapes to be used for the mechanical structure, making it easily adaptable to many robot applications. In this paper, we present a five-axis force/torque sensor based on this optoelectronic principle. To measure the force/torque components, two identical three-DoF force/torque sensor structures (each comprising three beams) are mounted on top of each other. Photo sensors and mirrors are fixed inside the structure to measure the six beam deflections. We describe the sensor structure, design, fabrication, and calibration, and verify our sensor development methodology.
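For a roughly linear sensor of this kind, calibration typically means estimating a matrix that maps the six raw optical readings to the measured force/torque components, fitted by least squares from known applied loads. The sketch below illustrates that generic procedure under an assumed linear model; it is not the paper's actual calibration code, and all shapes and data are invented for the example.

```python
import numpy as np

def calibrate(raw_readings, known_wrenches):
    """Estimate a calibration matrix C from N calibration samples.

    raw_readings:   (N, 6) array of optical displacement readings
    known_wrenches: (N, 5) array of reference forces/torques applied
    Returns C of shape (5, 6) such that wrench ~= C @ reading,
    assuming the sensor responds linearly.
    """
    # Solve raw_readings @ C.T ~= known_wrenches in the least-squares sense.
    C_T, *_ = np.linalg.lstsq(raw_readings, known_wrenches, rcond=None)
    return C_T.T

def measure(C, reading):
    """Convert one raw 6-channel reading into a 5-axis wrench estimate."""
    return C @ reading

# Synthetic check: invent a ground-truth linear sensor and recover it.
rng = np.random.default_rng(0)
C_true = rng.normal(size=(5, 6))
readings = rng.normal(size=(100, 6))
wrenches = readings @ C_true.T          # noiseless synthetic loads
C_est = calibrate(readings, wrenches)
```

With noiseless synthetic data the fitted matrix matches the ground truth; in practice, calibration loads are noisy and the fit is only approximate.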

    Mutual Recognition in Human-Robot Interaction: a Deflationary Account

    Mutually adaptive interaction involves the robot as a partner rather than a tool, and requires that the robot be susceptible to environmental cues and behavior patterns similar to those humans respond to. Recognition, or the acknowledgement of the other as a person, is fundamental to mutually adaptive interaction between humans. We discuss what embodied recognition involves and its behavioral manifestations, and describe the benefits of implementing it in HRI.

    Examining Cognitive Empathy Elements within AI Chatbots for Healthcare Systems

    Empathy is an essential part of communication in healthcare. It is a multidimensional concept whose two key dimensions, emotional and cognitive empathy, allow clinicians to understand a patient's situation, reasoning, and feelings clearly (Mercer and Reynolds, 2002). As artificial intelligence (AI) is increasingly used in healthcare for routine tasks, accurate diagnoses, and complex treatment plans, it is becoming more crucial to incorporate clinical empathy into patient-facing AI systems. Unless patients perceive that the AI understands their situation, communication between patient and AI may not be sustained effectively. AI may not genuinely exhibit emotional empathy at present, but it can exhibit cognitive empathy by communicating its understanding of patients' reasoning, perspectives, and points of view. In my dissertation, I examine this issue across three separate lab experiments and one interview study. First, I developed the AI Cognitive Empathy Scale (AICES) and tested all empathy components (emotional and cognitive) together against a control in a simulated patient-AI diagnosis scenario. In the second experiment, I tested the empathy components separately against a control in different simulated scenarios. From the interview study with first-time mothers, I identified six cognitive empathy elements, two of which were absent from the past literature. In the final lab experiment, I tested different cognitive empathy elements separately in simulated scenarios, based on the interview results, to examine which element is most effective. Finally, I developed a conceptual model of cognitive empathy for patient-AI interaction, connecting the past literature with the observations from my studies. Overall, cognitive empathy elements show promise for creating a shared understanding in patient-AI communication, which may lead to increased patient satisfaction and willingness to use AI systems for initial diagnosis.