3 research outputs found

    Making Robotic Dogs Detect Objects That Real Dogs Recognize Naturally: A Pilot Study

    Get PDF
    The recent advancements in artificial intelligence (AI) and deep learning have enabled smart products, such as smart toys and robotic dogs, to interact with humans more intelligently and to express emotions. As a result, such products have become heavily sensorized and integrate multi-modal interaction techniques to detect and infer emotions from spoken utterances, motions, pointing gestures and observed objects, and to plan their actions. However, even for purely predictive purposes, a practical challenge for these smart products is that deep learning algorithms typically require high computing power, especially when a multimodal method is applied. Moreover, the memory needs of deep learning models usually surpass the limits of many low-end mobile computing devices as model complexity grows. In this study, we explore the application of lightweight deep neural networks, the SqueezeDet model and the Single Shot MultiBox Detector (SSD) model with MobileNet as the backbone, to detect objects beloved by dogs. These lightweight models are expected to be integrated into a multi-modal emotional support robotics system designed for a smart robot dog. We also outline our planned future work in this direction.
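
    As an illustration only (the abstract above does not include code), the sketch below shows how an SSD detector with a MobileNet backbone can be run on a low-power device using torchvision. The COCO-pretrained SSDLite320 weights, the image file name, and the 0.5 score threshold are stand-in assumptions, not the authors' custom-trained detector of dog-beloved objects.

```python
# Minimal sketch: lightweight object detection with an SSD head on a MobileNet backbone.
# Assumes torchvision >= 0.13; weights, image path and threshold are illustrative.
import torch
from PIL import Image
from torchvision.models.detection import ssdlite320_mobilenet_v3_large
from torchvision.transforms.functional import to_tensor

model = ssdlite320_mobilenet_v3_large(weights="DEFAULT")  # COCO-pretrained stand-in
model.eval()

image = Image.open("toy_ball.jpg").convert("RGB")  # hypothetical camera frame
with torch.no_grad():
    prediction = model([to_tensor(image)])[0]      # one dict of boxes/labels/scores per image

# Keep only confident detections (boxes are in pixel coordinates).
keep = prediction["scores"] > 0.5
for box, label, score in zip(prediction["boxes"][keep],
                             prediction["labels"][keep],
                             prediction["scores"][keep]):
    print(f"class {int(label)} at {box.tolist()} (score {score:.2f})")
```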

    iRobot : conceptualising SERVBOT for humanoid social robots

    Get PDF
    Services are intangible in nature and, as a result, it is often difficult to measure the quality of a service. A service is usually delivered by a human to a human customer, and the service literature shows that SERVQUAL can be used to measure its quality. However, the use of social robots during the pandemic is speeding up the process of employing social robots in frontline service settings. An extensive review of the literature shows there is a lack of an empirical model to assess the perceived service quality provided by a social robot. Furthermore, the social robot literature highlights key differences between human service and social robots. For example, scholars have highlighted the importance of entertainment and engagement in the adoption of social robots in the service industry. However, it is unclear whether the SERVQUAL dimensions are appropriate to measure social robots’ service quality. This master’s project conceptualises the SERVBOT model to assess a social robot’s service quality. It identifies reliability, responsiveness, assurance, empathy, and entertainment as the five dimensions of SERVBOT. Further, the research investigates how these five factors influence emotional and social engagement and intention to use the social robot in a concierge service setting. To conduct the research, a 2 x 1 (CONTROL vs SERVBOT) concierge between-subjects experiment was undertaken, and a total of 232 responses were collected across both stages. The results indicate that entertainment has a positive influence on emotional engagement when the service is delivered by a human concierge. Further, assurance had a positive influence on social engagement when a human concierge provided the service. When a social robot concierge delivered the service, empathy and entertainment both influenced emotional engagement, and assurance and entertainment impacted social engagement favourably. For both CONTROL (human concierge) and SERVBOT (social robot concierge), emotional and social engagement had a significant influence on intention to use. This study is the first to propose the SERVBOT model to measure social robots’ service quality. The model provides a theoretical underpinning for the key service quality dimensions of a social robot and gives scholars and managers a method to track the service quality of a social robot. The study also extends the literature by exploring the key factors that influence the use of social robots (i.e., emotional and social engagement).

    Investigating emotional interaction with a robotic dog

    No full text
    The next generation of consumer-level entertainment robots should offer more natural, engaging interaction. This paper reports on the development and evaluation of a consumer-level robotic dog with acoustic emotion recognition capabilities. The dog can recognise the emotional state of its owner from affective cues in the owner’s speech and respond with appropriate actions. The evaluation study shows that users recognise the new robotic dog as emotionally intelligent and report that this makes the dog appear more ‘alive’.
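
    A rough sketch of the kind of pipeline such acoustic emotion recognition could use (the paper does not publish its implementation): mean MFCC features summarise the owner's speech, a small classifier predicts the emotional state, and the result is mapped to a dog behaviour. The clip file names, emotion labels, and action mapping below are hypothetical.

```python
# Illustrative sketch, not the paper's implementation: recognise the owner's
# emotional state from speech and map it to a robotic-dog action.
import numpy as np
import librosa
from sklearn.svm import SVC

# Hypothetical labelled training clips of the owner's speech.
TRAINING_CLIPS = [("owner_happy_01.wav", "happy"),
                  ("owner_sad_01.wav", "sad"),
                  ("owner_happy_02.wav", "happy"),
                  ("owner_sad_02.wav", "sad")]

def affective_features(path: str) -> np.ndarray:
    """Summarise a clip with mean MFCCs, a simple proxy for vocal affect."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

X = np.stack([affective_features(path) for path, _ in TRAINING_CLIPS])
y = [label for _, label in TRAINING_CLIPS]
classifier = SVC().fit(X, y)

def react_to(path: str) -> str:
    """Predict the owner's emotion from a new clip and pick a dog action."""
    emotion = classifier.predict(affective_features(path).reshape(1, -1))[0]
    return {"happy": "wag_tail", "sad": "nuzzle_owner"}.get(emotion, "sit")
```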