
    Calming Effects of Touch in Human, Animal, and Robotic Interaction—Scientific State-of-the-Art and Technical Advances

    Small everyday gestures such as a tap on the shoulder can affect the way humans feel and act. Touch can have a calming effect and alter the way stress is handled, thereby promoting mental and physical health. Due to current technical advances and the growing role of intelligent robots in households and healthcare, recent research has also addressed the potential of robotic touch for stress reduction. In addition, touch by non-human agents such as animals or inanimate objects may have a calming effect. This conceptual article will review a selection of the most relevant studies reporting the physiological, hormonal, neural, and subjective effects of touch on stress, arousal, and negative affect. Robotic systems capable of non-social touch will be assessed together with control strategies and sensor technologies. Parallels and differences between human-to-human touch and human-to-non-human touch will be discussed. We propose that, under appropriate conditions, touch can act as a (social) signal for safety, even when the interaction partner is an animal or a machine. We will also outline potential directions for future research and clinical relevance. This review can thereby provide a foundation for further investigations into the beneficial contribution of touch by different agents to regulating negative affect and arousal in humans.

    The power of affective touch within social robotics

    There have been many leaps and bounds within social robotics, especially in human-robot interaction and how to make it a more meaningful relationship. This is traditionally accomplished through communication via vision and sound. It has been shown that humans naturally seek interaction through touch, yet its implications for emotions are unknown both in human-human interaction and in social human-robot interaction. This thesis surveys the social robotics community and the research undertaken to date, revealing a significant gap in the use of touch as a form of communication. The meaning behind touch is investigated, along with its implications for emotions. A simple prototype was developed focusing on texture and breathing, and was used in experiments to find out which combination of texture and movement felt most natural. This proved to be a combination of synthetic fur and 14 breaths per minute. For humans, touch is said to be the most natural way of communicating emotions; this is a first step towards achieving successful human-robot interaction in a more natural, human-like way.

    Socially intelligent robots that understand and respond to human touch

    Touch is an important nonverbal form of interpersonal interaction which is used to communicate emotions and other social messages. As interactions with social robots are likely to become more common in the near future, these robots should also be able to engage in tactile interaction with humans. Therefore, the aim of the research presented in this dissertation is to work towards socially intelligent robots that can understand and respond to human touch. To become a socially intelligent actor, a robot must be able to sense, classify and interpret human touch and respond to it in an appropriate manner. To this end we present work that addresses different parts of this interaction cycle. The contributions of this dissertation are the following. We have made a touch gesture dataset available to the research community and have presented benchmark results. Furthermore, we have sparked interest in the new field of social touch recognition by organizing a machine learning challenge and have pinpointed directions for further research. Also, we have exposed potential difficulties for the recognition of social touch in more naturalistic settings. Moreover, the findings presented in this dissertation can help to inform the design of a behavioral model for robot pet companions that can understand and respond to human touch. Additionally, we have focused on the requirements for tactile interaction with robot pets for health care applications.
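The touch-gesture classification described in this abstract can be illustrated with a deliberately simplified sketch: a nearest-centroid classifier over summary features of a touch event (mean pressure, contact duration, contact area). The centroids, feature values, and gesture labels below are purely illustrative assumptions, not values drawn from the dataset or machine learning challenge the dissertation describes.

```python
# Toy nearest-centroid classifier for social touch gestures.
# Features per touch event: (mean pressure, duration in s, contact area).
# All numbers and labels are hypothetical, for illustration only.
import math

CENTROIDS = {
    "pat":     (0.4, 0.3, 0.2),
    "stroke":  (0.3, 1.5, 0.4),
    "squeeze": (0.9, 0.8, 0.6),
}

def classify(features):
    """Return the gesture label whose centroid is nearest (Euclidean)."""
    def dist(centroid):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, centroid)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label]))

print(classify((0.85, 0.7, 0.55)))  # → squeeze
```

A real system would classify full pressure-sensor-grid time series rather than three summary statistics, but the decision structure (extract features, compare against learned prototypes) is the same.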

    Morphology of socially assistive robots for health and social care: A reflection on 24 months of research with anthropomorphic, zoomorphic and mechanomorphic devices

    This paper reflects on four studies completed over the last 24 months with social robots including Pepper, Paro, Joy for All cats and dogs, Miro, Pleo, Padbot and cheaper toys, comprising i) focus groups and interviews on suitable robot pet design, ii) surveys on ethical perceptions of robot pets, and iii) recorded interactions between stakeholders and a range of social robots. In total, up to 371 participants' views were included across the analysed studies. Data were reviewed and mined for relevance to the use and impact of morphology types for social robots in health and social care. Results suggested that biomorphic design was preferable to mechanomorphic design, and that speech and life-simulation features (such as breathing) were well received. Anthropomorphism showed some limitations, evoking fear and task expectations that were absent for zoomorphic designs. The combination of familiar, zoomorphic appearance with animacy, life-simulation and speech capabilities thus appeared to be an area of research for future robots developed for health and social care. (2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Vancouver, Canada, 8-12 Aug. 2021.)

    Anthropomorphic Objects

    This thesis exhibition is the culmination of an exploration of the uncanny through sculptures that evoke the sensation of a living presence. Each sculpture is also intended to convey some character or personality, and to this end, my work is influenced by puppetry. Though the works are human-sized, they function as puppets in that they are posable and can be used for performance, but they are also robotic in that they have some autonomous motion and some reactive motion. My sculptures are based on the human form because the human form is at once most uncanny and most relatable. Relatability is an important aspect of my work, as I use my humanoid sculptures to create playful interactive experiences for viewers, experiences that hinge on the uncanny and the illusion of presence.

    Dyslexia and Mindfulness: Can Mental Training Ameliorate the Symptoms of Dyslexia?

    Dyslexia (DYS) can be defined as a reading disorder that is not caused by sensory or cognitive deficits, or by a lack of motivation or adequate reading instruction. Remediation of a deficit in phonological processing has been the focus of most DYS interventions to date, but these studies have had disappointingly little impact on generalized reading abilities. Reading Recovery and mindfulness (MF) training are two interventions that emphasize the development of metacognition. Reading Recovery teaches children how to use multiple metacognitive strategies (e.g., using context clues, making predictions) while in the process of decoding and comprehending text. MF, or mental training, is a well-established technique for developing attentional capacities and can also be considered a metacognitive skill. In this mixed-methods study, I investigated whether training in metacognitive strategies (including MF) would significantly improve reading and writing skills compared to a control condition. Twenty students in grades 2-5 with an identified learning disability were recruited from the public school district. After matching on age, severity, gender, and primary language, participants were randomly assigned to an experimental group or an active control group. Participants in the experimental group received a five-week intervention that incorporated phonics training, Reading Recovery, and MF. Subjects in the control group received only phonics training for five weeks. Pre- and post-measures were collected on reading, writing, and a lexical-decision task. Quantitative results demonstrated that MF significantly increased response times during decoding (indicating a possible increase in reflectiveness due to metacognitive processes) and significantly lowered heart rate over the course of the intervention. Qualitative themes pointed to improvement in self-expression, motivation, focus, self-confidence, positive affect, and use of metacognitive strategies.

    Displays of Jealousy in Dogs

    Wolves (Canis lupus) were domesticated into the common dog (Canis familiaris) at least 15,000 years ago. The domestication process changed wolves both physically and neurologically. Dogs now have a unique connection with humans, and display many of the same personality traits and cognitive deficits as humans do. Research by Harris and Prouvost (2014) has suggested that dogs can display jealous reactions. In this thesis, dogs were exposed to either a plastic Jack-O-Lantern stimulus or a plush dog stimulus, and their behavioral and physiological reactions to these stimuli were recorded. The results show that most of the differences in the dogs' behavior were increased interest and over-arousal in the jealousy condition. This suggests a potential jealousy-like reaction, but the current research does not replicate the findings of Harris and Prouvost (2014) closely enough to state definitively that the dogs were jealous.

    Shared Perception in Human-Robot Interaction

    Interaction can be seen as a composition of perspectives: the integration of the perceptions, intentions, and actions on the environment that two or more agents share. For an interaction to be effective, each agent must be prone to “sharedness”: being situated in a common environment, able to read what others express about their perspective, and ready to adjust one’s own perspective accordingly. In this sense, effective interaction is supported by perceiving the environment jointly with others, a capability that in this research is called Shared Perception. Nonetheless, perception is a complex process in which the observer receives sensory inputs from the external world and interprets them based on their own previous experiences, predictions, and intentions. In addition, social interaction itself contributes to shaping what is perceived: others’ attention, perspective, actions, and internal states may also be incorporated into perception. Thus, Shared Perception reflects the observer's ability to integrate these three sources of information: the environment, the self, and other agents. If Shared Perception is essential among humans, it is equally crucial for interaction with robots, which need social and cognitive abilities to interact with humans naturally and successfully. This research deals with Shared Perception within the context of social Human-Robot Interaction (HRI) and involves an interdisciplinary approach. The two general axes of the thesis are the investigation of human perception while interacting with robots and the modeling of the robot’s perception while interacting with humans. These two directions are outlined through three specific Research Objectives, whose achievements represent the contribution of this work. i) The formulation of a theoretical framework of Shared Perception in HRI valid for interpreting and developing different socio-perceptual mechanisms and abilities. 
ii) The investigation of Shared Perception in humans, focusing on the perceptual mechanism of Context Dependency and thereby exploring how social interaction affects the use of previous experience in human spatial perception. iii) The implementation of a deep-learning model for Addressee Estimation to foster robots’ socio-perceptual skills through the awareness of others’ behavior, as suggested in the Shared Perception framework. To achieve the first Research Objective, several human socio-perceptual mechanisms are presented and interpreted in a unified account. This exposition parallels mechanisms elicited by interaction with humans and with humanoid robots, and aims to build a framework valid for investigating human perception in the context of HRI. Based on the thought of D. Davidson and conceived as the integration of information coming from the environment, the self, and other agents, the idea of "triangulation" expresses the critical dynamics of Shared Perception. It is also proposed as the functional structure to support the implementation of socio-perceptual skills in robots. This general framework serves as a reference for fulfilling the other two Research Objectives, which explore specific aspects of Shared Perception. Regarding the second Research Objective, the human perceptual mechanism of Context Dependency is investigated, for the first time, within social interaction. Human perception is based on unconscious inference, in which sensory inputs are integrated with prior information. This phenomenon helps in facing the uncertainty of the external world with predictions built upon previous experience. To investigate the effect of social interaction on this mechanism, the iCub robot was used as an experimental tool to create an interactive scenario with a controlled setting. 
A user study based on psychophysical methods, Bayesian modeling, and a neural-network analysis of the human results demonstrated that social interaction influences Context Dependency: when interacting with a social agent, humans rely less on their internal models and more on external stimuli. These results are framed in Shared Perception and contribute to revealing the integration dynamics of its three sources. Others’ presence and social behavior (other agents) affect the balance between sensory inputs (environment) and personal history (self) in favor of the information shared with others, that is, the environment. The third Research Objective consists of tackling the Addressee Estimation problem, i.e., understanding to whom a speaker is talking, in order to improve the iCub’s social behavior in multi-party interactions. Addressee Estimation can be considered a Shared Perception ability because it is achieved by using sensory information from the environment, internal representations of the agents’ positions, and, most importantly, the understanding of others’ behavior. An architecture for Addressee Estimation is thus designed considering the integration process of Shared Perception (environment, self, other agents) and partially implemented with respect to the third element: the awareness of others’ behavior. To this end, a hybrid deep-learning (CNN+LSTM) model is developed to estimate the placement of the addressee relative to the speaker and the robot from the non-verbal behavior of the speaker. Addressee Estimation abilities based on Shared Perception dynamics are aimed at improving multi-party HRI. Making robots aware of other agents’ behavior towards the environment is the first crucial step for incorporating such information into the robot’s perception and modeling Shared Perception.
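As a rough illustration of the CNN+LSTM pipeline this abstract mentions, the sketch below encodes each frame of speaker behavior (a linear layer with ReLU stands in for the CNN), aggregates the sequence with a hand-written LSTM cell, and scores candidate addressee placements with a softmax. All weights are random and untrained, and the feature sizes and placement labels are illustrative assumptions, not the thesis's actual model.

```python
# Toy CNN+LSTM-style addressee estimator: per-frame encoder -> LSTM ->
# softmax over hypothetical addressee placements. Untrained, illustrative only.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
F, E, H = 8, 16, 32            # per-frame features, embedding size, hidden size
LABELS = ["left of robot", "right of robot", "robot itself"]

# Random parameters (these would be learned in a real model)
W_enc = rng.normal(0, 0.1, (E, F))          # frame encoder ("CNN" stand-in)
W = rng.normal(0, 0.1, (4 * H, E))          # LSTM input weights (i, f, o, g)
U = rng.normal(0, 0.1, (4 * H, H))          # LSTM recurrent weights
b = np.zeros(4 * H)
W_out = rng.normal(0, 0.1, (len(LABELS), H))

def lstm_step(x, h, c):
    """One LSTM cell update for embedded frame x."""
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c + i * np.tanh(g)
    h = o * np.tanh(c)
    return h, c

def estimate_addressee(frames):
    """frames: (T, F) array of per-frame speaker features (e.g. head pose)."""
    h, c = np.zeros(H), np.zeros(H)
    for x in frames:
        h, c = lstm_step(np.maximum(W_enc @ x, 0.0), h, c)  # ReLU encoder
    logits = W_out @ h
    p = np.exp(logits - logits.max())                       # stable softmax
    return p / p.sum()

probs = estimate_addressee(rng.normal(size=(10, F)))
print(dict(zip(LABELS, probs.round(3))))
```

The real model would additionally fuse the robot's own position estimate (the "self" source of Shared Perception); here only the "other agents" pathway, the speaker's behavior over time, is sketched.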