207 research outputs found

    Towards responsive Sensitive Artificial Listeners

    This paper describes work in the recently started SEMAINE project, which aims to build a set of Sensitive Artificial Listeners: conversational agents designed to sustain an interaction with a human user despite limited verbal skills, through robust real-time recognition and generation of non-verbal behaviour, both when the agent is speaking and when it is listening. We report on data collection and on the design of a system architecture geared towards real-time responsiveness.
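    As an illustration, and not part of the paper above: a minimal sketch of the kind of asynchronous perception/behaviour loop that real-time responsiveness implies, where listener behaviour is produced while the user is still speaking. All component and field names here are hypothetical; this is not the SEMAINE architecture or API.

```python
# Illustrative sketch only: a generic asynchronous listener loop. Component and field
# names are hypothetical and do not reflect the SEMAINE system.
import queue
import threading
import time

feature_events = queue.Queue()  # e.g. prosody / head-movement features from perception

def perception_stub():
    """Stands in for real-time audio/video feature extraction."""
    while True:
        time.sleep(0.1)  # pretend features arrive every 100 ms
        feature_events.put({"user_speaking": True, "pitch_rise": False})

def listener_loop():
    """Consume features and emit non-verbal listener behaviour without blocking."""
    while True:
        event = feature_events.get()
        if event["user_speaking"]:
            # While the user speaks, the agent keeps listening and may backchannel.
            print("agent: nod / 'mm-hmm'")
        feature_events.task_done()

threading.Thread(target=perception_stub, daemon=True).start()
threading.Thread(target=listener_loop, daemon=True).start()
time.sleep(1)  # run the sketch briefly
```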

    The communication between low-risk, low birth weight premature infants and their mothers in the first year of life: a description of four cases

    There has been a global increase in survival rates of premature infants due to advances in medical technology. Premature infants are known to be at risk for developmental problems, including communication delays and disorders. Speech-Language Pathologists have an important role to play in the assessment and management of premature infants, especially given the high prevalence of premature births in South Africa. The bonding and attachment experiences of premature infants and their mothers are often challenged, further placing these infants at risk for communication difficulties. This study aimed to explore the communication between low-risk, low birth weight premature infants and their mothers at three points in the first year of life. A longitudinal study was conducted in which four mother-infant dyads were investigated.

    Recognising realistic emotions and affect in speech: State of the art and lessons learnt from the first challenge

    More than a decade has passed since automatic recognition of emotion from speech became a field of research in its own right, alongside its 'big brothers' speech and speaker recognition. This article attempts to provide a short overview of where we are today, how we got there, and what this can tell us about where to go next and how we could arrive there. In the first part, we address the basic phenomenon, reflecting on the last fifteen years and commenting on databases, modelling and annotation, the unit of analysis, and prototypicality. We then shift to automatic processing, including discussions on features, classification, robustness, evaluation, and implementation and system integration. From there we move to the first comparative challenge on emotion recognition from speech, the INTERSPEECH 2009 Emotion Challenge, organised by (part of) the authors, covering the Challenge's database, Sub-Challenges, participants and their approaches, the winners, and the fusion of results, through to the lessons actually learnt, before we finally address the everlasting problems and promising future attempts. (C) 2011 Elsevier B.V. All rights reserved. Schuller B., Batliner A., Steidl S., Seppi D., "Recognising realistic emotions and affect in speech: state of the art and lessons learnt from the first challenge", Speech Communication, vol. 53, no. 9-10, pp. 1062-1087, November 2011.
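    As an illustration, and not part of the article above: a minimal sketch, assuming librosa and scikit-learn are available, of the kind of acoustic-feature-plus-static-classifier pipeline such surveys describe (utterance-level MFCC functionals fed to a linear SVM). It is not the INTERSPEECH 2009 Challenge baseline, and the function names and parameters are illustrative.

```python
# Illustrative sketch of a typical acoustic-feature + static-classifier pipeline;
# NOT the INTERSPEECH 2009 Emotion Challenge baseline.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def utterance_features(wav_path):
    """Frame-level MFCCs reduced to utterance-level functionals (mean and std)."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def train_emotion_classifier(wav_paths, labels):
    """Fit a linear SVM on standardised utterance-level features."""
    X = np.vstack([utterance_features(p) for p in wav_paths])
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    clf.fit(X, labels)
    return clf
```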

    Face-to-face interactions and the use of gaze in autism spectrum disorders

    Study 1, Abstract - Background: Previous findings from computer-based stimuli have indicated a reduced number of fixations towards the eyes in Autism Spectrum Disorder (ASD). This has been thought to contribute to wider social and emotional difficulties. However, it is unclear whether the reported deficits in gaze can be generalised to real-world interactions. Method: A systematic review was conducted on studies that explored the use of gaze during face-to-face interactions with individuals who have ASD. The search covered the EBSCO, Scopus and Web of Science databases. In total, fourteen studies were included: ten contained participants who were children and adolescents, and four contained adult participants. Results: The majority of studies found little or no overall difference between ASD and comparison groups in the amount of gaze directed towards an interaction partner’s face. Only one of the included studies found a significantly reduced preference for fixations towards the eyes as compared to other areas of the face. Nevertheless, neurotypical (NT) participants were found to consistently increase fixation duration towards an interaction partner whilst listening as compared to speaking; such consistency was not found for participants with ASD. Conclusion: The results were discussed in relation to current hypotheses regarding the use of gaze in ASD (e.g. gaze aversion, a lack of automatic motivational processes, low social motivation) and whether the lack of group differences was driven by individual differences. Recommendations for future studies are proposed.
    Study 2, Abstract - Social and emotional difficulties in Autism Spectrum Disorder (ASD) have been linked to differences in the use of social attention as compared to neurotypical (NT) individuals. Much of the evidence for this assertion has stemmed from studies that used two-dimensional stimuli and eye-tracking (e.g. static images of faces, videos of social scenes). To date, a small number of studies have attempted to investigate the use of gaze by ASD and NT individuals during face-to-face interactions. Using eye-tracking with ten ASD participants and ten NT participants, this study investigated how eye contact was used during a conversation that covered three topics (holidays, preferred mode of transport, and hobbies). In line with recent findings, we found that both groups adjusted their total proportion of fixation duration on the eyes depending on whether they were speaking or listening during the interaction. However, the ASD group were found to have an overall lower total fixation duration and made fewer fixations towards the eyes, but were more consistent in their time to make a first fixation on the eyes as compared to the NT group. This study provides a snapshot of how social attention and eye contact are utilised by adults with ASD, offering a number of new avenues for future investigation.
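    As an illustration, and not part of the study above: a minimal sketch, assuming pandas, of how a fixation log could be reduced to a "total proportion of fixation duration on the eyes" measure split by speaking versus listening. The column names and values are invented for the example and do not describe the study's data.

```python
# Illustrative sketch: proportion of fixation duration on the eyes, per phase.
# The fixation table (aoi, duration_ms, phase) is hypothetical example data.
import pandas as pd

fixations = pd.DataFrame({
    "aoi":         ["eyes", "mouth", "eyes", "background", "eyes"],
    "duration_ms": [320, 150, 410, 90, 275],
    "phase":       ["listening", "listening", "speaking", "speaking", "listening"],
})

def eye_fixation_proportion(df):
    """Total dwell time on the eyes divided by total dwell time, per phase."""
    totals = df.groupby("phase")["duration_ms"].sum()
    on_eyes = df[df["aoi"] == "eyes"].groupby("phase")["duration_ms"].sum()
    return (on_eyes / totals).fillna(0.0)

print(eye_fixation_proportion(fixations))
```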