11,988 research outputs found

    Imitation as a communication tool for online facial expression learning and recognition

    We are interested in understanding how babies learn to recognize facial expressions without a teaching signal that associates a facial expression with a given abstract label (i.e., the name of the facial expression: 'sadness', 'happiness', ...). Our starting point was a mathematical model showing that if the baby uses a sensorimotor architecture for the recognition of facial expressions, then the parents must imitate the baby's facial expressions to allow online learning. In this paper, we present a first series of robotics experiments showing that a simple neural network model can control the robot head and learn online to recognize facial expressions (the human partner imitates the robot's prototypical facial expressions). We emphasize the importance of emotions as a mechanism that ensures the dynamical coupling between individuals and allows more complex tasks to be learned.
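
    To make the learning mechanism concrete, here is a minimal sketch, not the authors' actual model: a delta-rule associator in which the expression the robot is currently displaying doubles as the teaching label, because the human partner is imitating it. The feature extractor, dimensions, and learning rate are all assumptions for illustration.

    # Sketch only: online facial-expression learning when the human partner
    # imitates the robot, so the robot's own displayed expression serves
    # as the label for what its camera sees.
    import numpy as np

    N_FEATURES = 64      # assumed size of the visual feature vector
    N_EXPRESSIONS = 4    # e.g. happiness, sadness, anger, surprise
    LEARNING_RATE = 0.1  # assumed
    rng = np.random.default_rng(0)
    W = np.zeros((N_EXPRESSIONS, N_FEATURES))  # associative weights

    def extract_face_features(frame):
        # Hypothetical stand-in for the robot's visual front end.
        return rng.random(N_FEATURES)

    for step in range(1000):
        label = step % N_EXPRESSIONS           # expression the robot displays
        x = extract_face_features(frame=None)  # partner's imitated expression
        y = W @ x                              # current prediction
        target = np.eye(N_EXPRESSIONS)[label]
        W += LEARNING_RATE * np.outer(target - y, x)  # delta-rule update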

    A Review of Verbal and Non-Verbal Human-Robot Interactive Communication

    In this paper, an overview of human-robot interactive communication is presented, covering verbal as well as non-verbal aspects of human-robot interaction. Following a historical introduction and a motivation towards fluid human-robot communication, ten desiderata are proposed, which provide an organizational axis for both recent and future research on human-robot communication. The ten desiderata are then examined in detail, culminating in a unifying discussion and a forward-looking conclusion.

    The Actions and Feelings Questionnaire in Autism and Typically Developed Adults

    Open access via the Springer Compact Agreement. We are grateful to Simon Baron-Cohen and Paula Smith of the Cambridge Autism Centre for the use of the ARC database in distributing the questionnaire, to all participants for completing it, to Eilidh Farquar for special efforts in distributing the link, and to Gemma Matthews for advice on using AMOS 23. JHGW is supported by the Northwood Trust.

    An application to improve emotional skills in children with Autism Spectrum Disorder

    Master's dissertation in Industrial Electronics and Computers Engineering. This dissertation presents a project developed with the aim of promoting emotional skills in children with Autism Spectrum Disorder (ASD). The project involves a serious game and a playware object, a physical component that acts as the game controller and allows the user to play the serious game interactively. The playware object has six pressure buttons, each showing an emoji with a specific facial expression, and communicates wirelessly via Bluetooth with the serious game app installed on an Android device. The facial expressions used are happiness, sadness, fear, anger, surprise, and neutral/normal, and they were applied to the three game activities (imitation, recognition, and storytelling). The chain of tests started with an online questionnaire to validate the avatars created to represent these facial expressions in the game (114 answers, with a mean success rate of 96.2%), followed by a usability test of the application (serious game and playware object) with six typically developing children (94.4% answer accuracy). Finally, the three game activities were tested with six children with ASD over three to four sessions. Given the small test group and the small number of sessions, the goal was to test the acceptance of the game rather than the users' improvement in the activities. It is worth noting that both the serious game and the playware object had a high level of approval from the children, who expressed their interest during the activities. This project is intended to contribute to the development of pedagogical resources to be used by professionals and families in the support of children with ASD.
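
    As an illustration of the controller-to-game link described above, the following is a minimal sketch, not the dissertation's actual implementation, of how the app side might map Bluetooth-serial button events to the six expressions. The one-byte-per-press protocol, port name, and baud rate are assumptions.

    # Sketch only: reading the six-button playware over a Bluetooth serial
    # link and mapping each press to its facial expression.
    import serial  # pyserial; the playware is assumed to expose a serial port

    BUTTON_TO_EMOTION = {
        b'0': 'happiness', b'1': 'sadness',  b'2': 'fear',
        b'3': 'anger',     b'4': 'surprise', b'5': 'neutral',
    }

    # '/dev/rfcomm0' and 9600 baud are assumptions for illustration.
    with serial.Serial('/dev/rfcomm0', 9600, timeout=1.0) as link:
        while True:
            byte = link.read(1)  # one byte per button press (assumed protocol)
            if byte in BUTTON_TO_EMOTION:
                print('pressed:', BUTTON_TO_EMOTION[byte])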

    Being-in-the-world-with: Presence Meets Social And Cognitive Neuroscience

    In this chapter we will discuss the concepts of “presence” (Inner Presence) and “social presence” (Co-presence) within a cognitive and ecological perspective. Specifically, we claim that the concepts of “presence” and “social presence” are the possible links between self, action, communication and culture. In the first section we will provide a capsule view of Heidegger’s work by examining the two main features of the Heideggerian concept of “being”: spatiality and “being with”. We argue that different visions from the social and cognitive sciences – Situated Cognition, Embodied Cognition, the Enactive Approach, Situated Simulation, Covert Imitation – and discoveries from neuroscience – Mirror and Canonical Neurons – have many points of contact with this view. In particular, these data suggest that our conceptual system dynamically produces contextualized representations (simulations) that support grounded action in different situations. This is allowed by a common coding – the motor code – shared by perception, action and concepts. This common coding also allows the subject to natively recognize actions done by other selves within the phenomenological contents. In this picture we argue that the role of presence and social presence is to allow the process of self-identification through the separation between “self” and “other,” and between “internal” and “external”. Finally, implications of this position for communication and media studies are discussed by way of conclusion.

    Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing, including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and, if so, whether this would differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (the classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this modulation is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.
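
    For reference, mu suppression is conventionally quantified as event-related desynchronization relative to a pre-stimulus baseline, ERD% = 100 * (P_baseline - P_event) / P_baseline. The sketch below computes this from Welch band power; the sampling rate and the alpha (8-13 Hz) and low-beta (13-18 Hz) limits are assumptions, not the study's exact parameters.

    # Sketch of the standard ERD computation implied by the abstract,
    # applied here to toy data rather than the study's recordings.
    import numpy as np
    from scipy.signal import welch

    FS = 256  # Hz, assumed sampling rate

    def band_power(x, lo, hi):
        # Mean Welch PSD within [lo, hi] Hz.
        f, pxx = welch(x, fs=FS, nperseg=FS)
        return pxx[(f >= lo) & (f <= hi)].mean()

    def erd_percent(baseline, event, lo, hi):
        # Positive values indicate desynchronization (a power drop).
        p_ref = band_power(baseline, lo, hi)
        return 100.0 * (p_ref - band_power(event, lo, hi)) / p_ref

    rng = np.random.default_rng(0)
    baseline = rng.standard_normal(2 * FS)  # 2 s pre-stimulus, toy data
    event = rng.standard_normal(3 * FS)     # 3 s film period, toy data
    print('alpha ERD %:', erd_percent(baseline, event, 8, 13))
    print('low-beta ERD %:', erd_percent(baseline, event, 13, 18))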

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.