
    Behavioural attentiveness patterns analysis – detecting distraction behaviours

    The ability to remain focused on a task can be crucial in some circumstances. In general, this ability is intrinsic to human social interaction and is naturally used in any social context. Nevertheless, some individuals have difficulty remaining concentrated on an activity, resulting in a short attention span. Children with Autism Spectrum Disorder (ASD) are a notable example of such individuals. ASD is a group of complex developmental disorders of the brain. Individuals affected by this disorder are characterized by repetitive patterns of behaviour, restricted activities or interests, and impairments in social communication. The use of robots has already proved to encourage the development of the social interaction skills lacking in children with ASD. However, most of these systems are controlled remotely and cannot adapt automatically to the situation, and even those that are more autonomous still cannot perceive whether or not the user is paying attention to the robot's instructions and actions. Following this trend, this dissertation is part of a research project that has been under development for some years. In this project, the robot ZECA (Zeno Engaging Children with Autism) from Hanson Robotics is used to promote interaction with children with ASD, helping them to recognize emotions and to acquire new knowledge, in order to promote social interaction and communication with others. The main purpose of this dissertation is to determine whether the user is distracted during an activity. In the future, the objective is to interface this system with ZECA so that it can adapt its behaviour according to the individual's affective state during an emotion-imitation activity. In order to recognize human distraction behaviours and capture the user's attention, several patterns of distraction, as well as systems to detect them automatically, have been developed.
    One of the most widely used methods for detecting distraction patterns is based on measuring head pose and eye gaze. The present dissertation proposes a system based on a Red Green Blue (RGB) camera that is capable of detecting distraction patterns (head pose, eye gaze, blink frequency, and the user's position relative to the camera) during an activity, and then classifying the user's state using a machine learning algorithm. Finally, the proposed system is evaluated in a controlled laboratory environment in order to verify whether it is capable of detecting the patterns of distraction. The results of these preliminary tests allowed us to identify some system constraints and to validate the system's adequacy for later use in an intervention setting.
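The classification step described above can be sketched in a few lines. The feature values, class centres, and the nearest-centroid rule below are illustrative assumptions, not the dissertation's actual features, data, or model:

```python
# Deciding "attentive" vs. "distracted" from per-frame features
# (head yaw/pitch, gaze offset, blink rate) with a nearest-centroid rule.
# All numbers are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, distracted):
    """Synthetic feature rows: [yaw_deg, pitch_deg, gaze_offset, blinks/min]."""
    centre = [25, 10, 0.4, 25] if distracted else [0, 0, 0.05, 15]
    return rng.normal(centre, [8, 6, 0.1, 4], size=(n, 4))

train = {0: simulate(200, False), 1: simulate(200, True)}
centroids = {k: v.mean(axis=0) for k, v in train.items()}
scale = np.vstack(list(train.values())).std(axis=0)  # per-feature normalisation

def classify(x):
    """Nearest-centroid label: 0 = attentive, 1 = distracted."""
    d = {k: np.linalg.norm((x - c) / scale) for k, c in centroids.items()}
    return min(d, key=d.get)

label = classify(np.array([30.0, 12.0, 0.5, 26.0]))  # a clearly off-task frame
```

In practice the dissertation uses a trained machine learning classifier rather than this toy rule, but the shape of the problem (a small per-frame feature vector mapped to an attention label) is the same.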

    Exploring the role of trust and expectations in CRI using in-the-wild studies

    Studying interactions of children with humanoid robots in familiar spaces and natural contexts has become a key issue for social robotics. To fill this need, we conducted several Child-Robot Interaction (CRI) events with the Pepper robot in Polish and Japanese kindergartens. In this paper, we explore the role of trust and expectations towards the robot in determining the success of CRI. We present several observations from the video recordings of our CRI events and the transcripts of free-format question-answering sessions with the robot using the Wizard-of-Oz (WOZ) methodology. From these observations, we identify children's behaviors that indicate trust (or lack thereof) towards the robot, e.g., challenging the robot's behavior or interacting with it physically. We also gather insights into children's expectations, e.g., children verifying the robot's causal understanding and agency, or expectations concerning the robot's relationships, preferences, and physical and behavioral capabilities. Based on our experiences, we suggest some guidelines for designing more effective CRI scenarios. Finally, we argue for the effectiveness of in-the-wild methodologies for planning and executing qualitative CRI studies.

    Between Fear and Trust: Factors Influencing Older Adults' Evaluation of Socially Assistive Robots

    Socially Assistive Robots (SARs) are expected to support autonomy, aging in place, and wellbeing in later life. For successful assimilation, it is necessary to understand the factors affecting older adults' Quality Evaluations (QEs) of SARs, including pragmatic and hedonic evaluations and overall attractiveness. Previous studies showed that trust in robots significantly enhances QE, while technophobia considerably decreases it. The current study aimed to examine the relative impact of these two factors on older persons' QE of SARs. The study was based on an online survey of 384 individuals aged 65 and above. Respondents were presented with a video of a robotic system for physical and cognitive training and filled out a questionnaire relating to that system. The results indicated a positive association between trust and QE and a negative association between technophobia and QE. A simultaneous exploration demonstrated that the relative impact of technophobia is significantly more substantial than that of trust. In addition, the pragmatic qualities of the robot were found to be more crucial to its QE than the social aspects of use. The findings suggest that implementing robotics technology in later life strongly depends on reducing older adults' technophobia regarding the convenience of using SARs, and they highlight the importance of simultaneous exploration of facilitators and inhibitors.

    Boosting children's creativity through creative interactions with social robots

    Creativity is an ability with psychological and developmental benefits. Creative levels are dynamic and oscillate throughout life, with a first major decline occurring at the age of 7. However, creativity is an ability that can be nurtured if trained, with evidence suggesting an increase in this ability with the use of validated creativity training. Yet, creativity training for young children (aged 6-9) appears scarce. Additionally, existing training interventions resemble test-like formats and lack the playful dynamics that could engage children in creative practices over time. This PhD project aimed at contributing to creativity stimulation in children by proposing social robots as intervention tools, thus adding playful and interactive dynamics to the training. Towards this goal, we conducted three studies in schools, summer camps, and children's museums, which contributed to the design, fabrication, and experimental testing of a robot whose purpose was to re-balance creative levels. Study 1 (n = 140) tested the effect of existing activities with robots on creativity and provided initial evidence of the positive potential of robots for creativity training. Study 2 (n = 134) included children as co-designers of the robot, ensuring the robot's design meets children's needs and requirements. Study 3 (n = 130) investigated the effectiveness of this robot as a tool for creativity training, showing the potential of robots as creativity intervention tools. In sum, this PhD showed that robots can have a positive effect on boosting children's creativity. This places social robots as promising tools for psychological interventions.

    Human-Machine Communication: Complete Volume. Volume 1

    This is the complete volume of HMC Volume 1

    Acoustic-based Smart Tactile Sensing in Social Robots

    The sense of touch is a crucial component of human social interaction and is unique among the five senses. As the only proximal sense, touch requires close or direct physical contact to register information. This fact makes touch an interaction modality full of possibilities regarding social communication. Through touch, we are able to ascertain the other person's intention and communicate emotions. From this idea emerges the concept of social touch as the act of touching another person in a social context. It can serve various purposes, such as greeting, showing affection, persuasion, and regulating emotional and physical well-being. Recently, the number of people interacting with artificial systems and agents has increased, mainly due to the rise of technological devices such as smartphones and smart speakers. Still, these devices are limited in their interaction capabilities. To deal with this issue, recent developments in social robotics have improved the interaction possibilities to make agents more seamless and useful. In this sense, social robots are designed to facilitate natural interactions between humans and artificial agents. In this context, the sense of touch is revealed as a natural interaction vehicle that can improve Human-Robot Interaction (HRI) due to its communicative relevance in social settings. Moreover, for a social robot, the relationship between social touch and its embodiment is direct, since it has a physical body with which to apply or receive touches. From a technical standpoint, tactile sensing systems have recently been the subject of further research, mostly devoted to comprehending this sense in order to create intelligent systems that can improve people's lives. Currently, social robots are popular devices that include technologies for touch sensing.
    This is motivated by the fact that robots may encounter expected or unexpected physical contact with humans, which can either enhance or interfere with the execution of their behaviours. There is, therefore, a need to detect human touch in robot applications. Some methods even include touch-gesture recognition, although they often require significant hardware deployments involving multiple sensors. Additionally, the dependability of those sensing technologies is constrained because the majority of them still struggle with issues such as false positives or poor recognition rates. Acoustic sensing, in this sense, can provide a set of features that can alleviate the aforementioned shortcomings. Even though it is a technology that has been utilised in various research fields, it has yet to be integrated into human-robot touch interaction. Therefore, in this work, we propose the Acoustic Touch Recognition (ATR) system, a smart tactile sensing system based on acoustic sensing and designed to improve human-robot social interaction. Our system is developed to classify touch gestures and locate their source. It has also been integrated into real social robotic platforms and successfully tested in real-world applications. Our proposal is approached from two standpoints, one technical and the other related to social touch. Firstly, the technical motivation of this work centred on achieving a cost-efficient, modular, and portable tactile system. For that, we explore the fields of touch sensing technologies, smart tactile sensing systems, and their application in HRI. On the other hand, part of the research is centred around the affective impact of social touch during human-robot interaction, resulting in two studies exploring this idea.
    International Mention in the doctoral degree. Doctoral Programme in Electrical, Electronic, and Control Engineering, Universidad Carlos III de Madrid. Committee chair: Pedro Manuel Urbano de Almeida Lima; secretary: María Dolores Blanco Rojas; member: Antonio Fernández Caballer
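The core acoustic idea can be illustrated with a toy example: simple waveform features often suffice to separate contact types. The signals, thresholds, and the tap/stroke distinction below are assumptions for illustration; the ATR system's actual feature set and classifier are not specified here:

```python
# Distinguishing a short "tap" from a longer "stroke" contact using the
# active duration of the acoustic signal (frame-wise RMS energy).
import numpy as np

FS = 16000  # assumed sampling rate, Hz

def classify_contact(signal, fs=FS, energy_floor=0.01):
    """Label a contact sound as 'tap' or 'stroke' from its active duration."""
    frame = fs // 100                       # 10 ms frames
    n = len(signal) // frame
    rms = np.array([np.sqrt(np.mean(signal[i*frame:(i+1)*frame]**2))
                    for i in range(n)])
    active_ms = 10 * np.count_nonzero(rms > energy_floor)
    return "tap" if active_ms < 100 else "stroke"

# Synthetic examples: a 30 ms burst vs. a 400 ms sustained rub.
t_tap = np.zeros(FS // 2); t_tap[:int(0.03 * FS)] = 0.5
t_stroke = np.zeros(FS // 2); t_stroke[:int(0.4 * FS)] = 0.2
```

A real system would add spectral features and source localisation, but the pipeline shape (frame the signal, extract features, classify) is the same.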

    A Sensing Platform to Monitor Sleep Efficiency

    Sleep plays a fundamental role in human life. Sleep research is mainly focused on understanding sleep patterns, stages, and duration. Accurate sleep monitoring can detect early signs of sleep deprivation and insomnia, consequently enabling mechanisms for preventing and overcoming these problems. Recently, sleep monitoring has been achieved using wearable technologies, which can also analyse body movements, but older people can encounter difficulties in using and maintaining these devices. In this paper, we propose an unobtrusive sensing platform able to analyse body movements, infer sleep duration and the awakenings that occurred during the night, and evaluate the sleep efficiency index. To prove the feasibility of the suggested method, we conducted a pilot trial involving several healthy users. The sensors were installed within the bed and, on each day, each user was administered the Groningen Sleep Quality Scale questionnaire to evaluate the user's perceived sleep quality. Finally, we show a potential correlation between this perceived evaluation and an objective index, the sleep efficiency.
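The sleep efficiency index is conventionally total sleep time divided by time in bed, expressed as a percentage. This minimal sketch derives it from awakening intervals detected during the night; the numbers are illustrative, not the paper's data:

```python
# Sleep efficiency (%) = total sleep time / time in bed * 100,
# where total sleep time = time in bed minus time spent awake.

def sleep_efficiency(time_in_bed_min, awakenings_min):
    """Return sleep efficiency (%) from time in bed and awake intervals (minutes)."""
    total_sleep = time_in_bed_min - sum(awakenings_min)
    return 100.0 * total_sleep / time_in_bed_min

# 8 h in bed with two awakenings of 15 and 9 minutes:
eff = sleep_efficiency(480, [15, 9])  # -> 95.0
```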

    Attention and Social Cognition in Virtual Reality: The effect of engagement mode and character eye-gaze

    Technical developments in virtual humans are manifest in modern character design. Specifically, eye gaze offers a significant aspect of such design. There is also a need to consider the contribution of participants' control of engagement. In the current study, we manipulated participants' engagement with an interactive virtual reality narrative called Coffee without Words. Participants sat over coffee opposite a character in a virtual café, where they waited for their bus to be repaired. We manipulated the character's eye contact with the participant. For half the participants in each condition, the character made no eye contact for the duration of the story. For the other half, the character responded to participant eye gaze by making and holding eye contact in return. To explore how participant engagement interacted with this manipulation, half the participants in each condition were instructed to appraise their experience as an artefact (i.e., drawing attention to technical features), while the other half were introduced to the fictional character, the narrative, and the setting as though they were real. This study allowed us to explore the contributions of character features (interactivity through eye gaze) and cognition (attention/engagement) to participants' perception of realism, feelings of presence, sense of time duration, and the extent to which they engaged with the character and represented its mental states (Theory of Mind). Importantly, it does so using a highly controlled yet ecologically valid virtual experience.