
    Chapter From the Lab to the Real World: Affect Recognition Using Multiple Cues and Modalities

    The interdisciplinary concept of the dissipative soliton is presented in connection with ultrafast fibre lasers. The different mode-locking techniques as well as experimental realizations of dissipative-soliton fibre lasers are surveyed briefly, with an emphasis on their energy scalability. Basic topics of dissipative-soliton theory are elucidated in connection with the concepts of energy scalability and stability. It is shown that the parametric space of the dissipative soliton has a reduced dimension and a comparatively simple structure, which simplifies the analysis and optimization of ultrafast fibre lasers. The main destabilization scenarios are described, and the limits of energy scalability are connected with the impact of optical turbulence and stimulated Raman scattering. The fast and slow dynamics of vector dissipative solitons are also described

    I'll cry instead: the neural correlates of empathy

    Sarah Krivan studied the communicative functions of adult emotional tears. By analysing participants' neurological activity, she found that tears elicit distinct neural responses that facilitate emotion understanding. Her research sheds new light on a uniquely human phenomenon and provides insight into pro-social empathic behaviour in humans

    Melody Informatics: Computational Approaches to Understanding the Relationships Between Human Affective Reasoning and Music

    Music is a powerful and complex medium that allows people to express their emotions while enhancing focus and creativity. It is a universal medium that can elicit strong emotion in people regardless of their gender, age or cultural background. Music is all around us, whether in the sound of raindrops, birds chirping, or a popular song played as we walk along a supermarket aisle. Music can also significantly help us regain focus while performing a number of different tasks. The relationship between music stimuli and humans has been of particular interest due to music's multifaceted effects on the human brain and body. While music can have an anticonvulsant effect on people's bodily signals and act as a therapeutic stimulus, it can also have proconvulsant effects, such as triggering epileptic seizures. It is also unclear what types of music can help to improve focus during other activities. Although studies have recognised the effects of music on human physiology, research has yet to systematically investigate the effects of different genres of music on human emotion, and how these correlate with subjective and physiological responses. The research set out in this thesis takes a human-centric computational approach to understanding how human affective (emotional) reasoning is influenced by sensory input, particularly music. Several user studies are designed to collect physiological data while participants interact with different stimuli. The physiological signals considered are: electrodermal activity (EDA), blood volume pulse (BVP), skin temperature (ST), pupil dilation (PD), electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Several computational approaches, including traditional machine learning combined with feature selection methods, are proposed that can effectively identify patterns in small to medium scale physiological feature sets.
A novel data visualisation approach called "Gingerbread Animation" is proposed, which converts physiological signals into images compatible with transfer learning methods. A novel stacked-ensemble deep learning model is also proposed to analyse large-scale physiological datasets. At the beginning of this research, two user studies were designed to collect physiological signals from people interacting with visual stimuli. The computational models showed high efficacy in detecting people's emotional reactions. The results motivated a third user study, in which the visual stimuli were combined with music stimuli. The results from that study showed a decline in recognition accuracy compared with the previous studies. These three studies also yielded a key insight: people's physiological responses provide a stronger indicator of their emotional state than their verbal statements. Based on the outcomes of the first three user studies, three more user studies were carried out to examine people's physiological responses to music stimuli alone. Three different music genres were investigated: classical, instrumental and pop music. Results from the studies showed that human emotion correlates strongly with different types of music, and that these correlations can be computationally identified from physiological responses. Findings from this research could motivate advanced wearable technologies, such as smartwatches or smart headphones, that provide personalised music recommendations based on an individual's physiological state. The computational approaches can be used to distinguish music by its positive or negative effect on human mental health. The work can enhance existing music therapy techniques and lead to improvements in various medical and affective computing research
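The signal-to-image idea summarised above can be illustrated with a minimal sketch. The thesis's own "Gingerbread Animation" encoding is not reproduced here; as a stand-in, this example uses a generic Gramian Angular Summation Field, a common way to turn a 1-D physiological recording into a 2-D array that image-based transfer learning models can consume. The function name, image size, and test waveform are all illustrative assumptions.

```python
import numpy as np

def signal_to_image(signal, size=64):
    """Encode a 1-D signal as a 2-D Gramian Angular Summation Field."""
    signal = np.asarray(signal, dtype=float)
    # Resample to a fixed length so every recording yields the same image size
    x = np.interp(np.linspace(0, len(signal) - 1, size),
                  np.arange(len(signal)), signal)
    # Rescale to [-1, 1] so values are valid inputs to arccos
    x = 2 * (x - x.min()) / (x.max() - x.min() + 1e-12) - 1
    phi = np.arccos(np.clip(x, -1, 1))          # polar-angle encoding
    return np.cos(phi[:, None] + phi[None, :])  # pairwise angular sums

# Example: a noisy pulse-like waveform becomes a 64x64 "image"
t = np.linspace(0, 4 * np.pi, 500)
img = signal_to_image(np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=500))
print(img.shape)  # (64, 64)
```

The resulting array can be stacked into three channels and fed to a pretrained vision network, which is the general pattern the abstract's transfer-learning pipeline relies on.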

    Affective Computing

    This book provides an overview of state-of-the-art research in Affective Computing. It presents new ideas, original results and practical experiences in this increasingly important research field. The book consists of 23 chapters, categorised into four sections. Since one of the most important means of human communication is facial expression, the first section of this book (Chapters 1 to 7) presents research on the synthesis and recognition of facial expressions. Given that we use not only the face but also body movements to express ourselves, the second section (Chapters 8 to 11) presents research on the perception and generation of emotional expressions using full-body motions. The third section of the book (Chapters 12 to 16) presents computational models of emotion, as well as findings from neuroscience research. The last section of the book (Chapters 17 to 22) presents applications related to affective computing

    Blocking mimicry makes true and false smiles look the same

    Recent research suggests that facial mimicry underlies accurate interpretation of subtle facial expressions. In three experiments, we manipulated mimicry and tested its role in judgments of the genuineness of true and false smiles. Experiment 1 used facial EMG to show that a new mouthguard technique for blocking mimicry modifies both the amount and the time course of facial reactions. In Experiments 2 and 3, participants rated true and false smiles either while wearing mouthguards or when allowed to freely mimic the smiles with or without additional distraction, namely holding a squeeze ball or wearing a finger-cuff heart rate monitor. Results showed that blocking mimicry compromised the decoding of true and false smiles such that they were judged as equally genuine. Together the experiments highlight the role of facial mimicry in judging subtle meanings of facial expressions

    Emotional expressions reconsidered: challenges to inferring emotion from human facial movements

    It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. 
We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require

    Finding the Hidden: Detecting Atypical Affective States from Physiological Signals

    In cognitive science, intuition is described as a strategy for processing information that relies on people's instinctive and emotional criteria. Compared with deliberate choices made after conscious reasoning, quick and intuitive decision-making strategies can be more effective. Intuitive thinking provokes changes in human physiological responses which can be measured by sensors. Utilising physiological reactions, previous work shows that atypical patterns such as emotion expressions and image manipulations can be identified. This thesis expands that exploration to examine whether more atypical human behaviour can be recognised from physiological signals. The subtly atypical behaviours examined are depression, doubt and deception. Depression is a serious chronic mental disease and is considered an atypical health condition. Doubt is defined as a non-deliberate attempt to mislead others and is a passive form of deception, representing an atypicality relative to honest behaviour. Deception is a more purposeful attempt to deceive, and thus a distinct type of atypicality from honest communication. By examining the physiological reactions of presenters who exhibit a particular atypical behaviour or condition, and of observers who view the behaviours of those presenters, this research aims to recognise atypicality in human behaviour. A collection of six user studies was conducted. In two user studies, presenters were asked to perform doubting and deceiving behaviours, while the remaining user studies involved observers watching the behaviours of presenters who suffer from depression, have doubt, or have conducted deception. The physiological reactions of both presenters and observers were collected, including Blood Volume Pulse, Electrodermal Activity, Skin Temperature and Pupillary Responses. Observers were also asked to explicitly evaluate whether the viewed presenters were depressed, doubting, or deceiving.
Investigation of the physiological data in this thesis finds detectable cues corresponding to depression, doubt and deception. Viewing depression provokes measurable visceral physiological reactions in observers. Such physiological responses can be used to derive features for machine learning models that accurately distinguish between healthy individuals and people with depression. By contrast, depression does not provoke strong conscious recognition in observers, whose conscious evaluation accuracy is only slightly above chance level. Similar results are found in detecting doubt and deception. People who doubt or deceive exhibit consistent physiological reactions, and these bodily responses can be utilised by machine learning or deep learning models to recognise doubt or deception. Doubt and deceit in presenters can also be recognised from the physiological signals of observers, with excellent recognition rates that are higher than the conscious judgments of the same observers. The results indicate that atypicality in presenters can be captured from the physiological signals of both presenters and observers. Presenters' physiological reactions yield higher recognition of atypicality, but observers' physiological responses serve as a comparable alternative. Observers' awareness of atypicality arises physiologically, so it can be used by machine learning models even when it does not reach conscious awareness. These findings lead to a further discussion of the implications of observers' physiological responses. Decision support applications that provide a quantifiable measure of people's unconscious, intuitive 'gut feeling' could be developed from the work reported here to assist with medical diagnosis, information credibility evaluation, and criminal detection. Further research could explore more atypical behaviours in the wild
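The general shape of the pipeline this abstract describes, summary features extracted from windows of physiological signal and passed to a classifier, can be sketched minimally. The thesis's actual features, signals, and models are not public, so everything below is an illustrative assumption: synthetic stand-in data, three simple statistical features, and a nearest-centroid classifier standing in for the machine learning models used.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(window):
    """Illustrative summary statistics for a physiological window:
    mean level, variability, and linear trend."""
    t = np.arange(len(window))
    slope = np.polyfit(t, window, 1)[0]
    return np.array([window.mean(), window.std(), slope])

# Synthetic stand-in data: "atypical" windows drift upward and vary more
typical  = [rng.normal(0.0, 0.5, 200) for _ in range(40)]
atypical = [rng.normal(0.0, 1.0, 200) + np.linspace(0, 2, 200)
            for _ in range(40)]
X = np.array([features(w) for w in typical + atypical])
y = np.array([0] * 40 + [1] * 40)

# Minimal nearest-centroid classifier: assign each sample to the
# closest per-class mean feature vector
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```

On real recordings the same skeleton would use windows of EDA, BVP, skin temperature or pupillary response, a richer feature set, and held-out evaluation rather than training accuracy.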